DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective.
Easy-to-use and powerful LLM and SLM library with an awesome model zoo.
Deduplicating archiver with compression and authenticated encryption.
Insane(ly slow but wicked good) PNG image optimization
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models.
Extract files from any kind of container format
A curated list of resources for Efficient Large Language Models
Transformers-compatible library for applying various compression algorithms to LLMs for optimized deployment with vLLM
PaddleSlim is an open-source library for deep model compression and architecture search.
A toolkit to optimize Keras and TensorFlow ML models for deployment, including quantization and pruning.
A PyTorch library and evaluation platform for end-to-end compression research
Code for CRATE (Coding RAte reduction TransformEr).
[CVPR 2020] GAN Compression: Efficient Architectures for Interactive Conditional GANs
Efficiently access large archives (e.g., TAR, RAR, ZIP, GZ, BZ2, XZ, ZSTD) as a filesystem
Neural Network Compression Framework for enhanced OpenVINO™ inference
PyTorch Implementation of "Lossless Image Compression through Super-Resolution"
[NeurIPS 2023] LLM-Pruner: On the Structural Pruning of Large Language Models. Support Llama-3/3.1, Llama-2, LLaMA, BLOOM, Vicuna, Baichuan, TinyLlama, etc.
Data compression in TensorFlow
High Octane Triage Analysis
[ECCV 2022] Swin2SR: SwinV2 Transformer for Compressed Image Super-Resolution and Restoration. Presented at the Advances in Image Manipulation (AIM) workshop, ECCV 2022. Try it out! Over 3.3M runs: https://replicate.com/mv-lab/swin2sr