AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
Updated Feb 26, 2025 · Python
Official PyTorch implementation of "A Comprehensive Overhaul of Feature Distillation" (ICCV 2019)
Model Compression Toolkit (MCT) is an open source project for neural network model optimization under efficient, constrained hardware. This project provides researchers, developers, and engineers advanced quantization and compression tools for deploying state-of-the-art neural networks.
Group Fisher Pruning for Practical Network Compression (ICML 2021)
Using ideas from product quantization for state-of-the-art neural network compression.
Knowledge Transfer via Distillation of Activation Boundaries Formed by Hidden Neurons (AAAI 2019)
Knowledge Distillation with Adversarial Samples Supporting Decision Boundary (AAAI 2019)
Group Sparsity: The Hinge Between Filter Pruning and Decomposition for Network Compression (CVPR 2020)
This is the official implementation of "DHP: Differentiable Meta Pruning via HyperNetworks".
PyTorch implementation of "Learning Filter Basis for Convolutional Neural Network Compression" (ICCV 2019)
Deep Neural Network Compression based on Student-Teacher Network
Notes & implementations for Professor Hung-yi Lee's ML 2020 (Machine Learning) course
Overparameterization and overfitting are common concerns when designing and training deep neural networks. Network pruning is an effective strategy used to reduce or limit the network complexity, but often suffers from time and computational intensive procedures to identify the most important connections and best performing hyperparameters. We s…
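The description above refers to pruning connections by importance. As a minimal, hypothetical sketch (not taken from any repository listed here), magnitude-based pruning simply zeroes out the smallest-magnitude weights of a layer:

```python
import numpy as np

def magnitude_prune(weights: np.ndarray, sparsity: float = 0.5) -> np.ndarray:
    """Zero out the smallest-magnitude entries of a weight array.

    weights:  layer weight array of any shape.
    sparsity: fraction of weights to remove, in [0, 1).
    """
    flat = np.abs(weights).ravel()
    k = int(sparsity * flat.size)  # number of weights to drop
    if k == 0:
        return weights.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

# Illustrative use: prune half the weights of a random 4x4 layer.
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4))
pruned = magnitude_prune(w, sparsity=0.5)
```

Real pruning pipelines (including the repositories above) additionally fine-tune the network after masking and use richer importance scores than raw magnitude.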
MUSCO: Multi-Stage COmpression of neural networks
[NeurIPS 2021] Official PyTorch Code of Scaling Up Exact Neural Network Compression by ReLU Stability
Code implementation of our AISTATS'21 paper "Mirror Descent View for Neural Network Quantization"
INTERSPEECH 2020, "Sparse Mixture of Local Experts for Efficient Speech Enhancement"
Homework for Machine Learning (2019, Spring) at NTU
ACM SAC 2025 Accepted Paper