Implementation of the LongRoPE: Extending LLM Context Window Beyond 2 Million Tokens Paper
PyTorch Implementation of Jamba: "Jamba: A Hybrid Transformer-Mamba Language Model"
"Attention Is All You Need" implementation
Sequence-to-sequence framework with a focus on Neural Machine Translation based on PyTorch
PyTorch implementation of GPT from scratch
GPT-Mini is a small-scale implementation of a GPT language model using PyTorch, based on the "Attention Is All You Need" paper.
PyTorch Transformer implementation
Transformers without Tears: Improving the Normalization of Self-Attention
[CVPR 2024] Official implementation of the paper "Inversion-Free Image Editing with Natural Language"
Sequence Parallel Attention for Long-Context LLM Training and Inference
Project Name: AdaViT | PyTorch Lightning, Python
Centrale-NLP-Public-Ressources: materials for the 2023/2024 NLP class
This collection of notebooks is based on the book Dive into Deep Learning. All of the notes are written in PyTorch using the d2l/torch library.
PyTorch implementation of the models RT-1-X and RT-2-X from the paper: "Open X-Embodiment: Robotic Learning Datasets and RT-X Models"
Integrating Mamba/SSMs with Transformer for Enhanced Long Context and High-Quality Sequence Modeling
Implementation of Liquid Nets in PyTorch
Implementation of the model "AudioFlamingo" from the paper: "Audio Flamingo: A Novel Audio Language Model with Few-Shot Learning and Dialogue Abilities"
Implementation of the transformer from the paper: "Real-World Humanoid Locomotion with Reinforcement Learning"
PyTorch implementation of the sparse attention from the paper: "Generating Long Sequences with Sparse Transformers"
My implementation of Kosmos2.5 from the paper: "KOSMOS-2.5: A Multimodal Literate Model"
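
Most of the repositories above implement some variant of the scaled dot-product attention introduced in "Attention Is All You Need". As a point of reference, here is a minimal PyTorch sketch of that core operation; the function name, tensor shapes, and test values are illustrative and not drawn from any repository listed here:

import math
import torch

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v: (batch, heads, seq_len, head_dim); mask broadcasts to the score shape.
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.size(-1))   # (batch, heads, seq, seq)
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))  # block disallowed positions
    weights = torch.softmax(scores, dim=-1)                    # attention distribution per query
    return weights @ v                                         # weighted sum of values

# Tiny smoke test with a causal mask, as used in decoder-only models like GPT.
q = k = v = torch.randn(1, 2, 4, 8)                 # batch=1, heads=2, seq=4, head_dim=8
causal = torch.tril(torch.ones(4, 4, dtype=torch.bool))
out = scaled_dot_product_attention(q, k, v, mask=causal)
print(out.shape)  # torch.Size([1, 2, 4, 8])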