Please install and set up AIMET before proceeding further. This evaluation was run using AIMET 1.22.2 for TensorFlow 1.15, i.e. please set release_tag="1.22.2" and AIMET_VARIANT="tf_gpu_tf115" in the above instructions.
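For reference, the two values named above can be exported as shell variables before following the AIMET installation instructions. This is only a minimal sketch of that step, not a replacement for the instructions themselves:

```bash
# Values used for this evaluation; plug these into the AIMET installation instructions.
export release_tag="1.22.2"
export AIMET_VARIANT="tf_gpu_tf115"
```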
This model requires the following Python package versions:

```bash
pip install tensorflow-gpu==1.15.0
pip install keras==2.2.4
pip install "progressbar2>=4.0.0"
pip install --user git+https://github.com/cocodataset/cocoapi.git#subdirectory=PythonAPI
```
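After installing the pinned packages, a quick spot check like the one below (not part of the repository, just a suggested sanity check) confirms that the expected versions are active and that TensorFlow can see the GPU:

```bash
# Spot check: TF 1.15 with GPU support, Keras 2.2.4, and pycocotools importable.
python3 -c "import tensorflow as tf; print(tf.__version__, tf.test.is_gpu_available())"
python3 -c "import keras; print(keras.__version__)"
python3 -c "from pycocotools.coco import COCO; print('pycocotools OK')"
```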
It also requires libGL1:

```bash
sudo apt update
sudo apt-get install libgl1 -y
```
Note that this model is not expected to work on NVIDIA 30-series or newer GPUs (e.g. RTX 3050), as those use a newer architecture that is not fully compatible with TF 1.x.
- Clone the RetinaNet repository from GitHub: https://github.com/fizyr/keras-retinanet
```bash
git clone https://github.com/fizyr/keras-retinanet.git
cd keras-retinanet
```
- Within the cloned repository, check out the commit corresponding to the pre-TF 2.0 version. The included example scripts only work with TF 1.x.
```bash
git checkout 08af308d01a8f22dc286d62bc26c8496e1ff6539
```
- Install keras-retinanet and its dependencies by running:
```bash
pip install . --user
export PYTHONPATH=$PYTHONPATH:<path to parent>/aimet-model-zoo
```
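To confirm that keras-retinanet is importable after installation, a simple check such as the following can be used (a suggested spot check, not part of the repository):

```bash
# keras_retinanet should import without errors once the install above succeeds.
python3 -c "import keras_retinanet; print(keras_retinanet.__file__)"
```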
- The COCO dataset can be downloaded from here (see the download sketch after this list):
- The original pre-trained keras retinanet model is available here:
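As a rough sketch, assuming the evaluation uses the COCO 2017 validation split and the directory layout expected by keras-retinanet's COCO generator (images under <dataset>/images/<set_name> and annotation JSON under <dataset>/annotations), the data could be downloaded and arranged as follows:

```bash
# Hedged sketch: fetch COCO 2017 validation images and annotations and place them in
# coco/images/val2017 and coco/annotations/instances_val2017.json respectively.
mkdir -p coco/images
wget http://images.cocodataset.org/zips/val2017.zip
wget http://images.cocodataset.org/annotations/annotations_trainval2017.zip
unzip val2017.zip -d coco/images
unzip annotations_trainval2017.zip -d coco
```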
The evaluation script supports 4 actions:
- original_fp32: evaluate the original model on GPU
- original_int8: evaluate the original model on a simulated device
- optimized_fp32: evaluate the optimized model on GPU
- optimized_int8: evaluate the optimized model on a simulated device
```bash
python3 retinanet_quanteval.py \
  --dataset-path <path to location of coco dataset> \
  --action <one of: original_fp32, original_int8, optimized_fp32, optimized_int8>
```
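For example, assuming the dataset was laid out under a hypothetical /data/coco directory, the FP32 and INT8 evaluations of the optimized model would be invoked as:

```bash
# /data/coco is a placeholder path; substitute your own dataset location.
python3 retinanet_quanteval.py --dataset-path /data/coco --action optimized_fp32
python3 retinanet_quanteval.py --dataset-path /data/coco --action optimized_int8
```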
- Weight quantization: 8 bits, per tensor asymmetric quantization
- Bias parameters are quantized
- Activation quantization: 8 bits, asymmetric quantization
Results on the COCO dataset:
| Average Precision/Recall | IoU | area | maxDets | FP32 | INT8 |
|---|---|---|---|---|---|
| Average Precision | 0.50:0.95 | all | 100 | 0.350 | 0.349 |
| Average Precision | 0.50 | all | 100 | 0.537 | 0.536 |
| Average Precision | 0.75 | all | 100 | 0.374 | 0.372 |
| Average Precision | 0.50:0.95 | small | 100 | 0.191 | 0.187 |
| Average Precision | 0.50:0.95 | medium | 100 | 0.383 | 0.381 |
| Average Precision | 0.50:0.95 | large | 100 | 0.472 | 0.472 |
| Average Recall | 0.50:0.95 | all | 1 | 0.306 | 0.305 |
| Average Recall | 0.50:0.95 | all | 10 | 0.491 | 0.490 |
| Average Recall | 0.50:0.95 | all | 100 | 0.533 | 0.532 |
| Average Recall | 0.50:0.95 | small | 100 | 0.345 | 0.341 |
| Average Recall | 0.50:0.95 | medium | 100 | 0.577 | 0.577 |
| Average Recall | 0.50:0.95 | large | 100 | 0.681 | 0.679 |