Modality-Agnostic Learning for Medical Image Segmentation Using Multi-modality Self-distillation
[ACL 2024] The official codebase for the paper "Self-Distillation Bridges Distribution Gap in Language Model Fine-tuning".
Deep Hash Distillation for Image Retrieval - ECCV 2022
Self-supervised learning via self-distillation with no labels (DINO), using Vision Transformers on the PCAM dataset.
This repository collects papers for "A Survey on Knowledge Distillation of Large Language Models". We break down KD into Knowledge Elicitation and Distillation Algorithms, and explore the Skill & Vertical Distillation of LLMs.
PyTorch implementation of "Emerging Properties in Self-Supervised Vision Transformers" (a.k.a. DINO)
Bayesian Optimization Meets Self-Distillation, ICCV 2023
A generalized self-supervised training paradigm for unimodal and multimodal alignment and fusion.
Official implementation of Self-Distillation for Gaussian Processes
A minimalist unofficial implementation of "Self-Distillation from the Last Mini-Batch for Consistency Regularization"
Self-Distillation and Knowledge Distillation Experiments with PyTorch.
A PyTorch implementation of the paper "Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation" (https://arxiv.org/abs/1905.08094)
Self-Distillation with weighted ground-truth targets; ResNet and Kernel Ridge Regression
(Unofficial) Data-Distortion Guided Self-Distillation for Deep Neural Networks (AAAI 2019)
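Many of the repositories above share the same core training signal: the network is supervised by its own softened predictions (from an earlier snapshot, a deeper branch, or the last mini-batch) via a temperature-scaled KL divergence. A minimal, dependency-free sketch of that loss is below; the temperature value and the T² scaling follow the common Hinton-style convention, and the function names are illustrative, not taken from any of the listed codebases.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax; a higher temperature yields softer targets.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def self_distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL(teacher || student) between temperature-softened distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    # In self-distillation the "teacher" logits come from the same model
    # (e.g. an earlier epoch, an EMA copy, or the previous mini-batch).
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl

# Identical logits give zero loss; diverging logits give a positive loss.
print(round(self_distillation_loss([2.0, 0.5, -1.0], [2.0, 0.5, -1.0]), 6))  # → 0.0
```

In practice this term is added to the usual cross-entropy on ground-truth labels, with a mixing weight that several of the papers above (e.g. the weighted ground-truth-target variant) treat as a tunable hyperparameter.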