This repository contains the code for the journal paper:
Resource-Adaptive and OOD-Robust Inference of Deep Neural Networks on IoT Devices
Cailen Robertson, Thanh Tam Nguyen, Quoc Viet Hung Nguyen, Jun Jo
Fig 1. Output predictions at a branch exit with previous (entropy-based) techniques compared to our energy-based threshold and loss techniques.
This repository contains the source code for building branching models efficiently in TensorFlow and Python, along with several Jupyter notebooks that implement the building, training, and evaluation of branching models.
Early exiting is a deep learning model augmentation technique in which additional classifier exits are added to a pre-existing model. Each added branch produces a potential prediction for a given input; if a branch's prediction is accepted as the output, the rest of the model's layers do not need to be processed, saving time and energy.
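The exit decision described above can be sketched independently of this repo's API. The following is a minimal, hypothetical NumPy illustration (not the repo's actual implementation): each branch is a stand-in callable producing class logits, and here maximum softmax probability is used as the acceptance criterion for simplicity, where BrevisNet itself uses an energy-based threshold.

```python
import numpy as np

def run_with_early_exit(x, branches, threshold):
    """Evaluate branch exits in order; accept the first prediction whose
    confidence (max softmax probability) clears the threshold, so deeper
    layers never need to be evaluated for easy inputs."""
    for depth, branch in enumerate(branches):
        logits = branch(x)
        z = logits - logits.max()          # stabilised softmax
        probs = np.exp(z) / np.exp(z).sum()
        if probs.max() >= threshold:
            return int(probs.argmax()), depth
    # No exit was confident enough; fall back to the last (final) exit.
    return int(probs.argmax()), len(branches) - 1

# Toy stand-ins: a shallow, unsure exit and a deeper, confident one.
shallow = lambda x: np.array([0.6, 0.5, 0.4])
deep = lambda x: np.array([5.0, 0.1, 0.1])
pred, depth = run_with_early_exit(None, [shallow, deep], threshold=0.9)
```

With `threshold=0.9` the shallow exit's near-uniform softmax is rejected and the deep exit answers; lowering the threshold lets the shallow exit answer instead, which is exactly the compute/accuracy trade-off the threshold controls.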
BrevisNet reduces the average processing cost of predictions across a range of classification DNN models.
This repository contains code to build and run early exit models in TensorFlow 2, along with our novel contributions: a loss function, a model uncertainty measurement, and exit thresholding.
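To illustrate the difference between the previous entropy-based confidence measure and an energy-based one (as in Fig 1), here is a small NumPy sketch. It is a generic illustration of the two scores computed from raw logits, not the repo's exact thresholding code:

```python
import numpy as np

def softmax(logits):
    z = logits - logits.max()
    e = np.exp(z)
    return e / e.sum()

def entropy_score(logits):
    # Predictive entropy of the softmax distribution (higher = less confident).
    p = softmax(logits)
    return float(-(p * np.log(p + 1e-12)).sum())

def energy_score(logits, temperature=1.0):
    # Free energy: -T * logsumexp(logits / T). Lower (more negative) values
    # indicate more confident, more in-distribution-looking inputs.
    t = temperature
    return float(-t * np.log(np.exp(logits / t).sum()))

confident = np.array([8.0, 0.5, 0.2])   # one class dominates
uncertain = np.array([1.1, 1.0, 0.9])   # near-uniform logits
```

Both scores rank the confident example as more trustworthy, but the energy score is computed directly from unnormalised logits, which is what makes it usable for out-of-distribution detection at an exit threshold.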
/brevis contains all the necessary code to build and run the early exiting models.
/notebooks contains examples of building and evaluating the early exiting models on a variety of different DNN model types.
TensorFlow 2.x
Python 3.7+
Jupyter
Clone the repository:
git clone https://github.com/SanityLacking/BrevisNet.git
Access the notebooks via Jupyter:
cd BrevisNet
cd notebooks
jupyter lab
Open examplebranching.ipynb for a walk-through of how the module is used.
notebooks/experiments contains notebooks for branching and evaluating each of the models tested in the journal experiments.
Pre-trained models can be built using scripts in /Brevis/Raw_Models/. Each model was trained on CIFAR-10 for a minimum of 50 epochs until convergence.
This project uses Neptune.ai to log training data. This is completely optional and only active if the module is installed. To enable it:
pip install neptune-client
pip install neptune-tensorflow-keras
Then add your project name and credentials to neptuneCredentials.py.
Special thanks to the authors of BranchyNet, who originally proposed the idea of branching models and whose work inspired this repo.
Dirichlet uncertainty loss functions inspired by the works of Andrey Malinin.
Energy-based loss functions inspired by the works of Sven Elflein and Will Grathwohl.