Release 1.6.0
New functionality:
- Support Intel® Optimized TensorFlow v2.1.0 (most models are also compatible with Intel® Optimized TensorFlow v2.0.0)
- Port 13 supported models to the TF 2.0 API
- Add version-controlled management of pre-trained models and update some INT8 pre-trained models
- Add multi-instance training mode support for the following models:
- ResNet50 v1.5
- Add MiniGo 9x9 support
- Update the MobileNet FP32 frozen graph (pb) for better performance
- Add batch parallel support for Wide_Deep_Large_DS
- Enhance bare-metal benchmark support
- Align the default container image with the TF 2.1.0 image in the intel namespace
Bug fixes:
- Fixed SSD-MobileNet to run on TF 2.1.0
- Fixed the MLPerf-GNMT tensorflow-addons issue
- Fixed some security issues found by bandit scan
- Fixed Pillow CVE-2019-19911 and CVE-2020-5313
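For deployments that pin their own dependencies, the Pillow fix above amounts to requiring a patched release. The sketch below is a minimal, dependency-free version check; the 6.2.2 floor is an assumption based on the public advisories for these two CVEs, so confirm it against the official Pillow changelog before relying on it.

```python
def version_tuple(version: str) -> tuple:
    """Turn a dotted version string like '6.2.2' into a comparable tuple."""
    return tuple(int(part) for part in version.split("."))

def pillow_is_patched(installed: str, minimum: str = "6.2.2") -> bool:
    """Return True when the installed Pillow version is at or past the
    patched minimum (6.2.2 assumed from the CVE advisories)."""
    return version_tuple(installed) >= version_tuple(minimum)
```

For example, `pillow_is_patched("7.0.0")` returns `True`, while `pillow_is_patched("6.2.1")` returns `False`. Note the sketch only handles plain dotted versions, not pre-release suffixes.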
Supported Models
Use Case | Model | Mode | Precision |
---|---|---|---|
Image Recognition | DenseNet169 | Inference | FP32 |
Image Recognition | Inception V3 | Inference | Int8, FP32 |
Image Recognition | Inception V4 | Inference | Int8, FP32 |
Image Recognition | MobileNet V1* | Inference | Int8, FP32 |
Image Recognition | ResNet 101 | Inference | Int8, FP32 |
Image Recognition | ResNet 50 | Inference | Int8, FP32 |
Image Recognition | ResNet 50v1.5* | Inference | Int8, FP32 |
Image Recognition | ResNet 50v1.5* | Training | FP32 |
Reinforcement Learning | MiniGo | Training | FP32 |
Language Translation | GNMT* | Inference | FP32 |
Language Translation | Transformer_LT_Official | Inference | FP32 |
Object Detection | R-FCN | Inference | Int8, FP32 |
Object Detection | SSD-MobileNet* | Inference | Int8, FP32 |
Object Detection | SSD-ResNet34* | Inference | Int8, FP32 |
Recommendation | Wide & Deep Large Dataset | Inference | Int8, FP32 |
Recommendation | Wide & Deep | Inference | FP32 |
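Once a model and precision are chosen from the table above, a benchmark run is typically driven through the repository's `launch_benchmark.py` script. The sketch below only assembles and prints a representative command rather than executing it; the flag names and values shown are assumptions based on common Model Zoo usage, so consult each model's own instructions for the exact invocation and required pre-trained graph files.

```shell
# Hypothetical benchmark invocation sketch -- flags and values are
# assumptions; see the per-model instructions for the exact command.
MODEL=resnet50
PRECISION=fp32
CMD="python launch_benchmark.py \
  --model-name ${MODEL} \
  --precision ${PRECISION} \
  --mode inference \
  --framework tensorflow \
  --batch-size 128"
# Print the assembled command instead of running it.
echo "${CMD}"
```

Swapping `MODEL` and `PRECISION` to another row of the table (for example, `inceptionv3` at `int8`) follows the same pattern.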
Validated Configurations
Model Zoo v1.6.0 was validated on the following operating systems:
- Ubuntu 18.04, 16.04
- CentOS 7.6