🎓 Automatically Update CV Papers Daily Using GitHub Actions (updated every 24 hours)
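A setup like this typically pairs a GitHub Actions workflow on a cron schedule with a small script that queries the public arXiv API and regenerates the paper list. Below is a minimal, hypothetical sketch of such a script, assuming the repo tracks the cs.CV category; the query URL targets the real arXiv Atom API, but the function name and output format are illustrative, not taken from the repository:

```python
# Hypothetical sketch: fetch the newest cs.CV submissions from the public
# arXiv Atom API and print them as a Markdown list. A GitHub Actions job on
# a cron schedule could run this daily and commit the regenerated list.
import urllib.request
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"  # Atom feed XML namespace
URL = ("http://export.arxiv.org/api/query?"
       "search_query=cat:cs.CV&sortBy=submittedDate"
       "&sortOrder=descending&max_results=10")

def fetch_latest_cv_papers():
    """Yield Markdown bullet lines for the newest cs.CV papers."""
    with urllib.request.urlopen(URL) as resp:
        feed = ET.fromstring(resp.read())
    for entry in feed.findall(f"{ATOM}entry"):
        # arXiv titles can contain newlines; normalize the whitespace
        title = " ".join(entry.find(f"{ATOM}title").text.split())
        link = entry.find(f"{ATOM}id").text.strip()
        yield f"- [{title}]({link})"

if __name__ == "__main__":
    print("\n".join(fetch_latest_cv_papers()))
```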
Personal project: MPP-Qwen14B (Multimodal Pipeline Parallel Qwen14B). Don't let poverty limit your imagination! Train your own 14B LLaVA-like MLLM on a 24 GB RTX 3090/4090.
A collection of notebooks based on the book Dive into Deep Learning. All of the notes are written in PyTorch using the d2l/torch library.
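For readers new to the book's conventions, the notebooks build models directly in PyTorch. The following self-contained sketch shows a training loop in that idiom; the synthetic data and hyperparameters are illustrative assumptions, and the d2l helper utilities are deliberately not used here:

```python
import torch
from torch import nn

# Synthetic linear-regression data (illustrative, not from the book's code)
X = torch.randn(100, 2)
w_true, b_true = torch.tensor([2.0, -3.4]), 4.2
y = X @ w_true + b_true + 0.01 * torch.randn(100)

# Model, loss, and optimizer in the book's plain-PyTorch style
net = nn.Linear(2, 1)
loss_fn = nn.MSELoss()
optimizer = torch.optim.SGD(net.parameters(), lr=0.03)

for epoch in range(5):
    optimizer.zero_grad()
    loss = loss_fn(net(X).squeeze(-1), y)
    loss.backward()
    optimizer.step()
    print(f"epoch {epoch + 1}, loss {loss.item():.4f}")
```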
Saprot: Protein Language Model with Structural Alphabet
Llama Chinese community. The Llama 3 online demo and fine-tuned models are now available, with the latest Llama 3 learning resources compiled in real time. All code has been updated for Llama 3. Building the best Chinese Llama LLM, fully open source and commercially usable.
Official Repository for the Uni-Mol Series Methods
Code for the paper: "Using Pre-training and Interaction Modeling for ancestry-specific disease prediction using multiomics data from the UK Biobank"
[NeurIPS2022] Egocentric Video-Language Pretraining
[ICCV2023] UniVTG: Towards Unified Video-Language Temporal Grounding
Official implementation of the Matrix Variational Masked Autoencoder (M-MAE) for the ICML paper "Information Flow in Self-Supervised Learning" (https://arxiv.org/abs/2309.17281)
Official implementation of ICML 2024 paper "Matrix Information Theory for Self-supervised Learning" (https://arxiv.org/abs/2305.17326)
Taught by AI pioneer Andrew Ng, this course covers cutting-edge topics such as how generative AI works (including what it can and can't do), common use cases such as reading, writing, and chatting, the life cycle of GenAI projects, advanced technology options such as RAG, fine-tuning, and pre-training, and the implications of GenAI for business and society.
Code repository for the conference paper "Organoid Segmentation Using Self-Supervised Learning: How Complex Should the Pretext Task Be?" published and presented at the International Conference on Biomedical and Bioinformatics Engineering (ICBBE) 2023.
PonderV2: Pave the Way for 3D Foundation Model with A Universal Pre-training Paradigm
Work in progress: a pretrained ARGVAET system for generating and classifying molecules and predicting their properties. The dataset and checkpoints could not be uploaded due to size constraints.
Official repository of OFA (ICML 2022). Paper: OFA: Unifying Architectures, Tasks, and Modalities Through a Simple Sequence-to-Sequence Learning Framework
Customized Pretraining for NLG Tasks
Benchmarking framework for protein representation learning. Includes a large number of pre-training and downstream task datasets, models, and training/task utilities. (ICLR 2024)
Language Modeling Research Hub, a comprehensive compendium for enthusiasts and scholars delving into the fascinating realm of language models (LMs), with a particular focus on large language models (LLMs)