Awesome Knowledge Distillation
Updated Dec 20, 2023
MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks. In NeurIPS 2020 workshop.
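For background, here is a minimal sketch of the soft-label distillation objective that methods in this family build on (temperature-scaled KL divergence between teacher and student logits, in the spirit of Hinton et al.'s classic formulation). This is a generic illustration, not MEAL V2's actual training code; the function name and hyperparameters are placeholders.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.9):
    """Standard soft-label KD loss: KL(teacher || student) at temperature T,
    blended with the usual cross-entropy on hard labels."""
    soft = F.kl_div(
        F.log_softmax(student_logits / T, dim=1),
        F.softmax(teacher_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)  # rescale so soft-label gradients match the hard-label magnitude
    hard = F.cross_entropy(student_logits, labels)
    return alpha * soft + (1.0 - alpha) * hard
```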
[ICML 2023] Revisiting Data-Free Knowledge Distillation with Poisoned Teachers
Knowledge distillation for training multi-exit models
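As a rough illustration of the idea (not this repository's code), a toy multi-exit setup where each early exit is trained on hard labels plus the softened predictions of the final exit; the architecture and hyperparameters below are assumed for the sketch.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiExitNet(nn.Module):
    """Toy backbone with one early exit plus a final classifier."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.block1 = nn.Sequential(nn.Linear(784, 256), nn.ReLU())
        self.block2 = nn.Sequential(nn.Linear(256, 128), nn.ReLU())
        self.exit1 = nn.Linear(256, num_classes)  # early exit
        self.exit2 = nn.Linear(128, num_classes)  # final exit

    def forward(self, x):
        h1 = self.block1(x)
        h2 = self.block2(h1)
        return [self.exit1(h1)], self.exit2(h2)

def multi_exit_kd_loss(early_logits, final_logits, labels, T=3.0):
    """Early exits learn from hard labels plus the final exit's softened
    predictions (detached so gradients do not flow back into the teacher)."""
    loss = F.cross_entropy(final_logits, labels)
    soft_target = F.softmax(final_logits.detach() / T, dim=1)
    for logits in early_logits:
        loss += F.cross_entropy(logits, labels)
        loss += F.kl_div(F.log_softmax(logits / T, dim=1),
                         soft_target, reduction="batchmean") * T * T
    return loss

model = MultiExitNet()
x, labels = torch.randn(16, 784), torch.randint(0, 10, (16,))
early, final = model(x)
loss = multi_exit_kd_loss(early, final, labels)
```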
An R package providing functions for interpreting and distilling machine learning models
Easily generate synthetic data for classification tasks using LLMs
The aim is to remove these light constituents by distillation (flash or stripping). A preliminary study of the operating conditions of the process can be carried out as a pseudo-binary: the C7 cut is approximated by n-heptane and the light ends by ethane. The goal is to construct the [T-x-y], [x-y], and [h-x-y] diagrams of the ethane/n-heptane binary u…
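As a rough illustration of how such a pseudo-binary study might begin, the sketch below computes bubble-point [T-x-y] and [x-y] data for ethane/n-heptane assuming an ideal mixture (Raoult's law) and illustrative Antoine constants. Real light-ends/C7 systems are strongly non-ideal and often high-pressure, so this is only a starting point, not the method of the project above.

```python
import numpy as np
from scipy.optimize import brentq

# Illustrative Antoine constants (log10 P[mmHg] = A - B / (C + T[degC]));
# assumed values for a sketch -- verify against a data bank before any real use.
ANTOINE = {"ethane": (6.802, 656.4, 256.0), "n-heptane": (6.897, 1264.9, 216.5)}

def psat(component, T):
    A, B, C = ANTOINE[component]
    return 10 ** (A - B / (C + T))  # saturation pressure in mmHg

def bubble_point(x_ethane, P=760.0):
    """Bubble temperature of an ethane/n-heptane liquid at pressure P (mmHg),
    assuming Raoult's law: sum_i x_i * Psat_i(T) = P."""
    f = lambda T: x_ethane * psat("ethane", T) + (1 - x_ethane) * psat("n-heptane", T) - P
    T = brentq(f, -120.0, 150.0)           # bracket spans both pure boiling points
    y = x_ethane * psat("ethane", T) / P   # vapor mole fraction of ethane
    return T, y

for x in np.linspace(0.0, 1.0, 11):
    T, y = bubble_point(x)
    print(f"x={x:.1f}  T={T:7.1f} degC  y={y:.3f}")
```

The [h-x-y] diagram would additionally require liquid and vapor enthalpy models, which are omitted here.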
[ICML 2024] Exploration and Anti-exploration with Distributional Random Network Distillation
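For readers unfamiliar with random network distillation (RND), here is a minimal sketch of the core idea from Burda et al.: a predictor network is trained to match a fixed, randomly initialized target network, and the prediction error serves as an intrinsic novelty bonus. The distributional variant in the paper above extends this; the sketch shows only vanilla RND with assumed dimensions.

```python
import torch
import torch.nn as nn

class RND(nn.Module):
    """Vanilla random network distillation: the prediction error of a trained
    predictor against a frozen random target measures state novelty."""
    def __init__(self, obs_dim, embed_dim=64):
        super().__init__()
        self.target = nn.Sequential(nn.Linear(obs_dim, 128), nn.ReLU(),
                                    nn.Linear(128, embed_dim))
        self.predictor = nn.Sequential(nn.Linear(obs_dim, 128), nn.ReLU(),
                                       nn.Linear(128, embed_dim))
        for p in self.target.parameters():  # the target stays fixed forever
            p.requires_grad_(False)

    def forward(self, obs):
        # Per-state squared prediction error: both the intrinsic reward
        # and the training loss for the predictor.
        return (self.predictor(obs) - self.target(obs)).pow(2).mean(dim=1)

rnd = RND(obs_dim=8)
opt = torch.optim.Adam(rnd.predictor.parameters(), lr=1e-4)
obs = torch.randn(32, 8)   # batch of observations from a replay buffer
bonus = rnd(obs)           # novelty bonus added to the extrinsic reward
opt.zero_grad(); bonus.mean().backward(); opt.step()
```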
Code reproduction of the paper Distillation Decision Tree
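The general recipe behind distilling a black-box model into a decision tree can be sketched as follows. This is a generic teacher-student setup in scikit-learn, not the paper's specific Distillation Decision Tree algorithm: the interpretable tree is fit to the teacher's predictions rather than the original labels.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Black-box teacher.
teacher = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)

# Distill: train a shallow, interpretable tree on the teacher's *predictions*
# so the tree mimics the teacher's decision function.
student = DecisionTreeClassifier(max_depth=4, random_state=0)
student.fit(X_train, teacher.predict(X_train))

print("teacher acc:", teacher.score(X_test, y_test))
print("student acc:", student.score(X_test, y_test))
print("fidelity:", (student.predict(X_test) == teacher.predict(X_test)).mean())
```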
Improving typhoon center location models with augmented typhoon images and distillation methods
CISPA Summer Internship