💍 Efficient tensor decomposition-based filter pruning
Updated May 21, 2024 - Jupyter Notebook
[CVPR 2023] Towards Any Structural Pruning; LLMs / SAM / Diffusion / Transformers / YOLOv8 / CNNs
A simple and effective LLM pruning approach.
Collection of recent methods on (deep) neural network compression and acceleration.
[NeurIPS 2023] Structural Pruning for Diffusion Models
[TPAMI 2024] This is the official repository for our paper: "Pruning Self-attentions into Convolutional Layers in Single Path".
Reimplementation of Sparse Variational Dropout in Keras-Core/Keras 3.0
This repository contains a PyTorch implementation of the paper "The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks" by Jonathan Frankle and Michael Carbin that can be easily adapted to any model/dataset.
[ICLR'23] Trainability Preserving Neural Pruning (PyTorch)
This repository applies sparse training, group channel pruning, and knowledge distillation to YOLOv4.
PAGCP for the compression of YOLOv5
[Preprint] Why is the State of Neural Network Pruning so Confusing? On the Fairness, Comparison Setup, and Trainability in Network Pruning
[TPAMI 2022, NeurIPS 2020] Code release for "Deep Multimodal Fusion by Channel Exchanging"
Counting currency from video using RepNet as a base model.
The official code for our ACCV2022 poster paper: Network Pruning via Feature Shift Minimization.
Code for the project "SNIP: Single-Shot Network Pruning"
PyTorch implementation of our paper (TNNLS) -- Pruning Networks with Cross-Layer Ranking & k-Reciprocal Nearest Filters
(CVPR 2021, Oral) Dynamic Slimmable Network
[NIPS 2016] Learning Structured Sparsity in Deep Neural Networks
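Several of the repositories above rank convolutional filters by a magnitude criterion and drop the weakest ones. A minimal, illustrative sketch of L1-norm filter pruning in NumPy (the function name `prune_filters` is my own for this example, not taken from any repository listed here):

```python
import numpy as np

def prune_filters(weight, ratio):
    """Keep the top (1 - ratio) fraction of conv filters ranked by L1 norm.

    weight: array of shape (out_channels, in_channels, kH, kW)
    ratio:  fraction of filters to remove (0 <= ratio < 1)
    Returns the pruned weight tensor and the indices of the kept filters.
    """
    n_filters = weight.shape[0]
    n_keep = max(1, int(round(n_filters * (1.0 - ratio))))
    # L1 norm of each filter serves as its importance score
    scores = np.abs(weight).reshape(n_filters, -1).sum(axis=1)
    # Take the n_keep highest-scoring filters, preserving original order
    keep = np.sort(np.argsort(scores)[::-1][:n_keep])
    return weight[keep], keep

# Example: prune half of 8 random 3x3 filters
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 3, 3, 3))
pruned, kept = prune_filters(w, ratio=0.5)
print(pruned.shape)  # (4, 3, 3, 3)
```

In a real network, pruning a filter also removes the corresponding input channel from the next layer, which is what the structured-pruning tools above automate.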