Generative Pre-trained Transformer in PyTorch
PyTorch/XLA integration with JetStream (https://github.com/google/JetStream) for LLM inference
Reference implementation of "Softmax Attention with Constant Cost per Token" (Heinsen, 2024)
A compilation of the best multi-agent papers
🚀🚀🚀 A collection of awesome public YOLO-series object detection projects.
A foundational introduction to XAI, emphasizing how XAI methodologies can expose latent biases in datasets and reveal valuable insights.
A PyTorch library for all things Reinforcement Learning (RL) for Combinatorial Optimization (CO)
Visualizing the attention of vision-language models
Unified-modal Salient Object Detection via Adaptive Prompt Learning
Alignment-Free RGBT Salient Object Detection: Semantics-guided Asymmetric Correlation Network and A Unified Benchmark
A collection of memory efficient attention operators implemented in the Triton language.
[ICML 2024] Outlier-Efficient Hopfield Layers for Large Transformer-Based Models
A Baby Llama model
Attention-based adaptive filter design for keyword classification
PyTorch implementations of various token mixers (attention mechanisms, MLPs, etc.) for understanding computer vision papers and other tasks.
Official implementation of our paper "Diving Deep into Regions: Exploiting Regional Information Transformer for Single Image Deraining."
Official Implementation of SEA: Sparse Linear Attention with Estimated Attention Mask (ICLR 2024)
Keras beit,caformer,CMT,CoAtNet,convnext,davit,dino,efficientdet,edgenext,efficientformer,efficientnet,eva,fasternet,fastervit,fastvit,flexivit,gcvit,ghostnet,gpvit,hornet,hiera,iformer,inceptionnext,lcnet,levit,maxvit,mobilevit,moganet,nat,nfnets,pvt,swin,tinynet,tinyvit,uniformer,volo,vanillanet,yolor,yolov7,yolov8,yolox,gpt2,llama2, alias kecam
Project repository for the paper "Disentangling and Integrating Relational and Sensory Information in Transformer Architectures" by Awni Altabaa and John Lafferty