Towards Energy Efficient Spiking Neural Networks: An Unstructured Pruning Framework

Installing Dependencies

pip install torch torchvision
pip install tensorboard thop spikingjelly==0.0.0.0.12

Usage

To reproduce the CIFAR-10 experiments in the paper, simply run with the default settings:

python main.py

You can specify the output path and the weight $\lambda$ of the penalty term with:

python main.py --penalty-lmbda <lambda> --output-dir <path>
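
For example, to train with $\lambda = 5\times10^{-4}$ and write results to ./output (the value 5e-4 is purely illustrative, not a setting prescribed by the paper):

python main.py --penalty-lmbda 5e-4 --output-dir ./output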

To reproduce the experiments on other datasets, follow the settings in the appendix.
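
For readers unfamiliar with how such a penalty enters training, below is a minimal PyTorch sketch of a loss with a $\lambda$-weighted penalty term. This is an illustration only, assuming a simple L1-style sparsity penalty over the weights; the actual energy-aware penalty of the paper is implemented in main.py, and the names net, criterion, and penalty_lmbda are placeholders.

import torch

def loss_with_penalty(output, target, net, criterion, penalty_lmbda):
    # Task loss (e.g., cross-entropy on the network output).
    task_loss = criterion(output, target)
    # Illustrative L1 sparsity penalty over all trainable parameters;
    # the repository's actual penalty differs (see main.py and the paper).
    penalty = sum(p.abs().sum() for p in net.parameters())
    # Larger penalty_lmbda pushes weights toward zero, i.e. more pruning.
    return task_loss + penalty_lmbda * penalty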

Citation

@inproceedings{shi2024towards,
  title={Towards Energy Efficient Spiking Neural Networks: An Unstructured Pruning Framework},
  author={Shi, Xinyu and Ding, Jianhao and Hao, Zecheng and Yu, Zhaofei},
  booktitle={The Twelfth International Conference on Learning Representations},
  year={2024}
}
