Exploring model compression algorithms and experimenting with new ideas for neural network pruning.
1 : Use existing algorithms such as IMP (Iterative Magnitude Pruning) to generate a variety of masks (each achieving reasonable test accuracy) for channel pruning in a given network. Then train a GAN to learn the distribution of these good-enough masks.
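As a minimal sketch of the mask-generation half of this idea (numpy only, no PyTorch; the function name and shapes are illustrative assumptions, not from IMP's reference implementation), a single magnitude-pruning step produces a binary channel mask by keeping the output channels with the largest L1 norms:

```python
import numpy as np

def channel_mask_by_magnitude(weight, prune_ratio):
    """Return a binary channel mask: keep the output channels with the
    largest L1 norms (one magnitude-pruning step, as in IMP).

    weight: array of shape (out_channels, in_channels, kH, kW)
    prune_ratio: fraction of output channels to remove
    """
    out_channels = weight.shape[0]
    # L1 norm of each output channel's filter
    scores = np.abs(weight).reshape(out_channels, -1).sum(axis=1)
    n_prune = int(round(prune_ratio * out_channels))
    mask = np.ones(out_channels, dtype=np.int8)
    if n_prune > 0:
        # zero out the channels with the smallest norms
        mask[np.argsort(scores)[:n_prune]] = 0
    return mask

# Toy conv layer with 8 output channels; prune half of them.
rng = np.random.default_rng(0)
w = rng.normal(size=(8, 3, 3, 3))
mask = channel_mask_by_magnitude(w, prune_ratio=0.5)
print(mask)  # 4 ones, 4 zeros
```

In the full IMP loop this step would alternate with retraining; collecting the masks from many such runs would give the GAN its training set.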
2 : Enhance AutoML-based compression by using 'Learned Filter Pruning Criteria' instead of a purely magnitude-based criterion.
The idea is essentially to combine two papers: Learning Filter Pruning Criteria for Deep Convolutional Neural Networks Acceleration (CVPR 2020) and AMC: AutoML for Model Compression and Acceleration on Mobile Devices (ECCV 2018).
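To make the combination concrete, here is a minimal numpy sketch (not taken from either paper's code; function names and the criterion set are assumptions) of what "choosing a pruning criterion per layer" could look like: several candidate per-filter importance scores, any of which a learned controller could select for a given layer:

```python
import numpy as np

# Candidate per-filter importance criteria a learned controller could
# choose from (illustrative set: L1 norm, L2 norm, distance-based).
def l1_score(filters):
    return np.abs(filters).sum(axis=1)

def l2_score(filters):
    return np.sqrt((filters ** 2).sum(axis=1))

def distance_score(filters):
    # Sum of distances from each filter to all others: filters close to
    # the rest are treated as redundant (geometric-median style).
    d = np.linalg.norm(filters[:, None, :] - filters[None, :, :], axis=2)
    return d.sum(axis=1)

CRITERIA = {"l1": l1_score, "l2": l2_score, "dist": distance_score}

def prune_with_criterion(weight, criterion, prune_ratio):
    """Binary filter mask for one layer under the chosen criterion."""
    f = weight.reshape(weight.shape[0], -1)
    scores = CRITERIA[criterion](f)
    n_prune = int(round(prune_ratio * len(scores)))
    mask = np.ones(len(scores), dtype=np.int8)
    mask[np.argsort(scores)[:n_prune]] = 0
    return mask

rng = np.random.default_rng(1)
w = rng.normal(size=(16, 8, 3, 3))  # toy layer: 16 filters
masks = {name: prune_with_criterion(w, name, 0.25) for name in CRITERIA}
```

The AMC-style agent would then learn both the per-layer prune ratio and (the new part) which entry of `CRITERIA` to apply, rather than always using magnitude.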
- https://pytorch.org/tutorials/intermediate/pruning_tutorial.html
- https://pytorch.org/tutorials/beginner/blitz/cifar10_tutorial.html
- https://arxiv.org/pdf/2006.05929.pdf
- https://openaccess.thecvf.com/content/CVPR2022/papers/Cazenavette_Dataset_Distillation_by_Matching_Training_Trajectories_CVPR_2022_paper.pdf