SNN-PAR: Energy Efficient Pedestrian Attribute Recognition via Spiking Neural Networks

Energy Efficient Pedestrian Attribute Recognition via Spiking Neural Networks, Haiyang Wang, Qian Zhu, Mowen She, Yabo Li, Haoyu Song, Minghe Xu, and Xiao Wang*

Abstract

Artificial neural network (ANN) based Pedestrian Attribute Recognition (PAR) has been widely studied in recent years; despite considerable progress, however, its energy consumption remains high. To address this issue, we propose a Spiking Neural Network (SNN) based framework for energy-efficient attribute recognition. Specifically, we first adopt a spiking tokenizer module to transform the given pedestrian image into spiking feature representations. The output is then fed into a spiking Transformer backbone for energy-efficient feature extraction, and the enhanced spiking features are passed to a set of feedforward networks for pedestrian attribute prediction. In addition to the widely used binary cross-entropy loss, we also exploit knowledge distillation from an artificial neural network to the spiking Transformer network for more accurate attribute recognition. Extensive experiments on three widely used PAR benchmark datasets fully validate the effectiveness of our proposed SNN-PAR framework.
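The following is a minimal sketch of the forward pass described above (spiking tokenizer → spiking Transformer backbone → feedforward attribute heads, repeated over T timesteps). It is not the authors' implementation: the layer sizes, the hard-threshold spiking neuron, and the plain Transformer encoder block are illustrative assumptions only (the paper builds on Spikingformer-style spiking Transformer blocks).

```python
# Hedged sketch of the SNN-PAR forward pass; names and sizes are assumptions.
import torch
import torch.nn as nn

class SpikeNeuron(nn.Module):
    """Hard-threshold spiking activation (inference-only stand-in for a LIF neuron)."""
    def __init__(self, threshold: float = 1.0):
        super().__init__()
        self.threshold = threshold

    def forward(self, x):
        return (x >= self.threshold).float()

class SpikingTokenizer(nn.Module):
    """Turns an image into spiking token features (conv patch stem + thresholding)."""
    def __init__(self, embed_dim=256):
        super().__init__()
        self.conv = nn.Conv2d(3, embed_dim, kernel_size=16, stride=16)  # patchify
        self.bn = nn.BatchNorm2d(embed_dim)
        self.spike = SpikeNeuron()

    def forward(self, x):                        # x: (B, 3, H, W)
        s = self.spike(self.bn(self.conv(x)))    # binary spike map
        return s.flatten(2).transpose(1, 2)      # (B, N_tokens, C)

class SNNPARSketch(nn.Module):
    def __init__(self, embed_dim=256, num_attributes=26, num_layers=2):
        super().__init__()
        self.tokenizer = SpikingTokenizer(embed_dim)
        # Stand-in for the spiking Transformer backbone.
        self.backbone = nn.ModuleList(
            nn.TransformerEncoderLayer(embed_dim, nhead=4, batch_first=True)
            for _ in range(num_layers)
        )
        self.spike = SpikeNeuron()
        self.heads = nn.Linear(embed_dim, num_attributes)  # feedforward attribute heads

    def forward(self, x, timesteps: int = 4):
        logits = []
        for _ in range(timesteps):               # repeat over T simulation steps
            feats = self.tokenizer(x)
            for block in self.backbone:
                feats = self.spike(block(feats))
            logits.append(self.heads(feats.mean(dim=1)))   # pool tokens -> logits
        return torch.stack(logits).mean(0)       # average logits over timesteps

if __name__ == "__main__":
    model = SNNPARSketch()
    out = model(torch.randn(2, 3, 256, 128))     # pedestrian-shaped input
    print(out.shape)                             # torch.Size([2, 26])
```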

🔧Requirements

Installation

pip install -r requirements.txt

Data Preparation

cd dataset/preprocess
python pa100k.py
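After preprocessing, it can help to sanity-check the generated annotations before training. The snippet below is a generic sketch only: the output filename `dataset.pkl` and the field names are assumptions, so check what `pa100k.py` actually writes.

```python
# Hedged sanity check of the preprocessed annotations; filename and fields are assumptions.
import pickle

with open("dataset.pkl", "rb") as f:       # hypothetical output of pa100k.py
    anno = pickle.load(f)

# Typical PAR annotation contents: image names and multi-hot attribute labels.
print(type(anno))
print(getattr(anno, "keys", lambda: dir(anno))())
```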

Teacher Checkpoint

You can obtain the teacher model weights by training VTB separately.

🚀Training

python train.py PA100k --only_feats_kl  --only_logits_kl 
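The `--only_feats_kl` and `--only_logits_kl` flags correspond to the two distillation terms described in the abstract: feature-level and logits-level knowledge transfer from the ANN (VTB) teacher, on top of the binary cross-entropy loss. The sketch below illustrates one plausible form of this objective; the loss weights, temperature, and exact distillation formulation are assumptions rather than the authors' settings.

```python
# Hedged sketch of a BCE + feature/logits distillation objective; details are assumptions.
import torch
import torch.nn.functional as F

def snn_par_loss(student_logits, teacher_logits,
                 student_feats, teacher_feats,
                 targets, temperature=2.0,
                 use_feats_kl=True, use_logits_kl=True):
    # Multi-label attribute recognition: binary cross-entropy per attribute.
    loss = F.binary_cross_entropy_with_logits(student_logits, targets)

    if use_feats_kl:
        # Feature-level distillation: match student features to the frozen teacher's.
        loss = loss + F.mse_loss(student_feats, teacher_feats.detach())

    if use_logits_kl:
        # Logits-level distillation: KL between temperature-softened sigmoid outputs,
        # treating each attribute as an independent Bernoulli variable.
        t = torch.sigmoid(teacher_logits.detach() / temperature)
        s = torch.sigmoid(student_logits / temperature)
        eps = 1e-6
        kl = t * torch.log((t + eps) / (s + eps)) + \
             (1 - t) * torch.log((1 - t + eps) / (1 - s + eps))
        loss = loss + (temperature ** 2) * kl.mean()

    return loss

# Example usage with dummy tensors (batch of 4, 26 attributes, 256-dim features):
if __name__ == "__main__":
    B, A, D = 4, 26, 256
    loss = snn_par_loss(torch.randn(B, A), torch.randn(B, A),
                        torch.randn(B, D), torch.randn(B, D),
                        torch.randint(0, 2, (B, A)).float())
    print(loss.item())
```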

👍Acknowledgements

This code is based on VTB and Spikingformer. Thanks for their efforts.

Citation

If you find this work helpful for your research, please cite the following paper:

@article{wang2024SNNPAR,
  title={SNN-PAR: Energy Efficient Pedestrian Attribute Recognition via Spiking Neural Networks},
  author={Wang, Haiyang and Zhu, Qian and She, Mowen and Li, Yabo and Song, Haoyu and Xu, Minghe and Wang, Xiao},
  journal={arXiv preprint arXiv:2410.07857},
  year={2024}
}