BNN is an implementation of a binary-native, gradient-free learning algorithm for Binary Neural Networks. It was introduced in the paper "Training Multi-Layer Binary Neural Networks With Random Local Binary Error Signals", published in the journal IOP Machine Learning: Science and Technology.
Disclaimer: we are actively working on this topic, so expect changes in this repo.
The dependencies can be installed with uv. First, install uv (check here for instructions). Then install everything you need with:
uv sync
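For reference, one common end-to-end setup looks like the sketch below; the installer one-liner is taken from uv's official documentation, and the repository URL and directory name are placeholders.
# Install uv (see the uv documentation for alternative installation methods)
curl -LsSf https://astral.sh/uv/install.sh | sh
# Clone the repository (placeholder URL and directory), then install the dependencies
git clone <repository-url> && cd <repository-dir>
uv sync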
Clone the repo, install the dependencies, and run the src/train.py script (a concrete example invocation follows the flag list below):
python3 src/train.py \
--algo-layer ["our"/"baldassi"] \
--dataset ["prototypes"/"fmnist"/"cifar10tl"/"imagenettetl"/"cifar100tl"] \
--binarize-dataset/--no-binarize-dataset \
--test-dim [int] \
--layers [str] \
--freeze-first/--no-freeze-first \
--freeze-last/--no-freeze-last \
--group-size [int] \
--bs [int] \
--epochs [int] \
--prob-reinforcement [float] \
--rob [float] \
--seed [int] \
--n-runs [int] \
--device [int] \
--log/--no-log
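As a concrete illustration, a single run on binarized Fashion-MNIST could look like the command below. The flag values are placeholders rather than the settings used in the paper, and the format assumed for --layers (a comma-separated list of hidden-layer widths) is a guess; check src/train.py for the exact expectations.
python3 src/train.py \
--algo-layer our \
--dataset fmnist \
--binarize-dataset \
--test-dim 10000 \
--layers "1024,1024" \
--no-freeze-first \
--no-freeze-last \
--group-size 32 \
--bs 128 \
--epochs 100 \
--prob-reinforcement 0.01 \
--rob 0.5 \
--seed 0 \
--n-runs 1 \
--device 0 \
--no-log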
The gradient-based PyTorch baselines (SGD or Adam) can be run with:
python3 {sgd/adam}_torch.py \
--dataset ["prototypes"/"fmnist"/"bfmnist"/"cifar10tl"/"bcifar10tl"/"imagenettetl"/"bimagenettetl"/"cifar100tl"/"bcifar100tl"] \
--layers [str] \
--lr [float] \
--seed [int] \
--n-runs [int] \
--device [int]
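Similarly, an illustrative SGD baseline run, again with placeholder values, might be:
python3 sgd_torch.py \
--dataset bfmnist \
--layers "1024,1024" \
--lr 0.001 \
--seed 0 \
--n-runs 1 \
--device 0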
To visualize the results as in the paper, see the VisualizeResults.ipynb notebook.
If you have questions, suggestions or problems, feel free to open an Issue. You can contact us at: