- All client-side functions are stored in 'client.py', including the training routines for the different algorithms.
- Functions executed by the server are stored in 'fl_functions'.
- Utility functions are stored in 'functions_new.py'.
- 'exp_args' contains all arguments for the experiments.
- 'fl_server' implements the FedDIP algorithm; without '--prune' and '--sparse' it reduces to FedAvg (see the sketch after the example commands below).
Note: use os.chdir() to change the working directory to your project folder, or use absolute paths, before running the experiments.
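For example, a minimal sketch (the path below is a placeholder for your local clone):

```python
import os

# Change the working directory to the project folder
# (hypothetical path; replace with the location of your local clone).
os.chdir("/path/to/FedDIP")
```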
FedDIP:

```
python fl_server.py --epochs 1000 --num_users 50 --local_ep 5 --amount_sparsity 0.95 --model alexnet --dataset cifar10 --frac 0.1 --parallel 0 --num_workers 2 --reconfig_interval 5 --init_sparsity 0.5 --sparse --prune
```

FedDST:

```
python feddst.py --feddst --epochs 1000 --num_users 50 --local_ep 5 --amount_sparsity 0.90 --model resnet18 --dataset cifar100 --frac 0.1 --parallel 0 --num_workers 2 --reconfig_interval 20
```
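As noted above, running 'fl_server.py' without '--sparse' and '--prune' falls back to FedAvg. A sketch of such a run, assuming the remaining flags stay the same as in the FedDIP example (the sparsity-related flags are presumably ignored in this mode):

```
python fl_server.py --epochs 1000 --num_users 50 --local_ep 5 --amount_sparsity 0.95 --model alexnet --dataset cifar10 --frac 0.1 --parallel 0 --num_workers 2 --reconfig_interval 5 --init_sparsity 0.5
```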
Some of the code is adapted from the following repositories:
- https://github.com/rong-dai/DisPFL
- https://github.com/jiangyuang/PruneFL
- https://github.com/namhoonlee/snip-public
If you find this work useful, please consider citing:
```
@inproceedings{long2023feddip,
  title={Feddip: Federated learning with extreme dynamic pruning and incremental regularization},
  author={Long, Qianyu and Anagnostopoulos, Christos and Parambath, Shameem Puthiya and Bi, Daning},
  booktitle={2023 IEEE International Conference on Data Mining (ICDM)},
  pages={1187--1192},
  year={2023},
  organization={IEEE}
}
```