Gather research papers, corresponding code (if available), reading notes, and other related materials about hot🔥🔥🔥 fields in Computer Vision based on Deep Learning.
Decoupled Kullback-Leibler Divergence Loss (DKL); a sketch of the classic KD loss it builds on appears after this list.
SOTA low-bit LLM quantization (INT8/FP8/INT4/FP4/NF4) & sparsity; leading model compression techniques on TensorFlow, PyTorch, and ONNX Runtime
A treasure chest for visual classification and recognition powered by PaddlePaddle
Object-completion tools for the X-Ray Distillation framework
A curated list for Efficient Large Language Models
OpenMMLab Model Compression Toolbox and Benchmark.
The code and dataset of the paper *Multi-View Fusion and Distillation for Subgrade Distresses Detection based on 3D-GPR*
A coding-free framework built on PyTorch for reproducible deep learning studies. 🏆 25 knowledge distillation methods presented at CVPR, ICLR, ECCV, NeurIPS, ICCV, etc. are implemented so far. 🎁 Trained models, training logs, and configurations are available to ensure reproducibility and benchmarking.
Multi-teacher cross-modal knowledge distillation for unimodal brain tumor segmentation
The implementation code of our paper "Learning Generalizable Models for Vehicle Routing Problems via Knowledge Distillation", accepted at NeurIPS 2022.
Collection of AWESOME vision-language models for vision tasks
[CVPR 2024] Official PyTorch Code for "PromptKD: Unsupervised Prompt Distillation for Vision-Language Models"
AI book for everyone
An extensible (general) continual learning framework based on PyTorch; the official codebase of Dark Experience for General Continual Learning
Awesome Knowledge Distillation
A curated list of awesome papers on NLP, Computer Vision, Model Compression, XAI, Reinforcement Learning, Security, etc.
[arXiv'24] The official implementation code of LLM-ESR.
Deep Multimodal Guidance for Medical Image Classification: https://arxiv.org/pdf/2203.05683.pdf
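For context on the DKL entry above, here is a minimal PyTorch sketch of the classic KL-divergence distillation loss (Hinton-style soft targets) that DKL decouples; the function name and default temperature are illustrative and not taken from the DKL repository:

```python
import torch
import torch.nn.functional as F

def kd_kl_loss(student_logits: torch.Tensor,
               teacher_logits: torch.Tensor,
               temperature: float = 4.0) -> torch.Tensor:
    """Classic soft-target KD loss: KL(teacher || student) on
    temperature-softened class distributions (Hinton et al., 2015).
    DKL reworks this term; see its repository for the exact form."""
    # Soften both distributions with the same temperature.
    log_p_s = F.log_softmax(student_logits / temperature, dim=-1)
    p_t = F.softmax(teacher_logits / temperature, dim=-1)
    # "batchmean" matches the mathematical KL definition; scaling by T^2
    # keeps gradient magnitudes comparable across temperatures.
    return F.kl_div(log_p_s, p_t, reduction="batchmean") * temperature ** 2
```

In practice this term is typically mixed with the ordinary cross-entropy on ground-truth labels, e.g. `loss = ce + alpha * kd_kl_loss(s_logits, t_logits)` with a tunable weight `alpha`.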