Releases: intel/ai-reference-models

Intel Model Zoo v2.1.0

13 Nov 23:53

New Functionality

10 new TensorFlow workload containers and model packages are available on the Intel® oneContainer Portal:

  • InceptionV3 FP32 Inference

  • InceptionV3 Int8 Inference

  • MobileNetV1 Int8 Inference

  • ResNet50 Int8 Inference

  • ResNet50v1.5 BFloat16 Inference

  • Transformer-LT MLPerf FP32 Training

  • SSD-ResNet34 FP32 Training

  • BERT Large BFloat16 Training

  • SSD-MobileNet Int8 Inference

  • RFCN Int8 Inference

DL Frameworks (TensorFlow)

TensorFlow models in the 2.1.0 release are validated on the following TensorFlow versions (an example install command follows the list):

  • Intel Optimizations for TensorFlow v2.3.0 or v1.15.2 (select models)

  • Intel Optimizations for TensorFlow Serving v2.3.0
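
A minimal install sketch for the versions above, assuming the Intel Optimizations for TensorFlow are consumed as the intel-tensorflow package from PyPI (docker images and conda channels are alternative distribution routes):

$ pip install intel-tensorflow==2.3.0

The same pattern applies to v1.15.2 for the select models that require it, assuming that version is published under the same package name.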

DL Frameworks (PyTorch)

PyTorch models in the 2.1.0 release are validated on the following PyTorch version:

  • PyTorch v1.5.0-rc3

Supported Configurations

Intel Model Zoo 2.1.0 is validated on the following environment:

  • Ubuntu 18.04 LTS

  • Python 3.6

  • Docker Server v19+

  • Docker Client v18+

Intel Model Zoo v2.0.0

21 Oct 00:01

New Functionality

Intel® Model Zoo 2.0 introduces tools and quickstart directories that support modular creation and distribution of containers and model packages for the zoo's most popular workloads. This version also re-releases several models compatible with TensorFlow v1.15.2 and is fully backward compatible with the previous usage paradigm (the launch_benchmark.py script) for launching workloads in docker and on bare metal. Future releases will feature more content under the quickstart directory, which will also be available for download, with full documentation, on the Intel® oneContainer Portal.
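
As a rough sketch of that existing launch_benchmark.py paradigm, an invocation looks like the following; the model name, precision, docker image tag, and graph path shown here are placeholder values, not specifics from this release note:

$ cd benchmarks
$ python launch_benchmark.py \
    --model-name resnet50v1_5 \
    --precision fp32 \
    --mode inference \
    --framework tensorflow \
    --batch-size 128 \
    --docker-image intel/intel-optimized-tensorflow:2.3.0 \
    --in-graph /path/to/resnet50v1_5_fp32_pretrained_model.pb

Leaving out --docker-image runs the same workload on bare metal instead.
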
All pre-trained models are now available for download from AWS S3 storage, using the same bucket name as before, intel-optimized-tensorflow.
For example, the command:

$ wget https://storage.googleapis.com/intel-optimized-tensorflow/models/v1_8/densenet169_fp32_pretrained_model.pb

should be replaced with the following to download from the AWS S3 bucket:

$ wget https://intel-optimized-tensorflow.s3.cn-north-1.amazonaws.com.cn/models/v1_8/densenet169_fp32_pretrained_model.pb

New Topologies and Models (TensorFlow 1.15.2)

  • UNet
  • Faster R-CNN
  • Mask R-CNN
  • NCF
  • WaveNet

DL Frameworks (TensorFlow)

TensorFlow models in the 2.0 release are validated on the following TensorFlow versions:

  • Intel Optimizations for TensorFlow v2.3.0 or v1.15.2 (select models)
  • Intel Optimizations for TensorFlow Serving v2.3.0

DL Frameworks (PyTorch)

PyTorch models in the 2.0 release are validated on the following PyTorch version:

  • PyTorch v1.5.0-rc3

Supported Configurations

Intel Model Zoo 2.0 is validated on the following environment:

  • Ubuntu 18.04 LTS
  • Python 3.6
  • Docker Server v19+
  • Docker Client v18+

Intel Model Zoo v1.8.1

18 Sep 21:27

New Topologies and Models:

  • DLRM (BFloat16) Training for Recommendation

DL frameworks (TensorFlow):

TensorFlow models in the v1.8.1 release are validated on the following TensorFlow versions:

  • Intel Optimizations for TensorFlow v2.3.0
  • Intel Optimizations for TensorFlow Serving v2.2

DL frameworks (PyTorch):

PyTorch models in the v1.8.1 release are validated on the following PyTorch version:

  • PyTorch v1.5.0-rc3

Supported Configurations:

Intel Model Zoo v1.8.1 is validated on the following environment:

  • Ubuntu 18.04 LTS
  • Python 3.6
  • Docker Server v19+
  • Docker Client v18+

Intel Model Zoo v1.8.0

22 Aug 00:18

New Topologies and Models:

  • New pre-trained model for SSD-MobileNet with more optimizations
  • BERT (FP32) Inference for Language Translation

New in TensorFlow Serving:

  • InceptionV3
  • ResNet50v1.5
  • Transformer-LT (Official)
  • SSD-MobileNet

New Tutorials:

  • TensorFlow – BERT Large BFloat16 Training
  • TensorFlow Serving – Installation Guide
  • TensorFlow Serving – General Best Practices
  • TensorFlow Serving – InceptionV3 and ResNet50v1.5
  • TensorFlow Serving – SSD-MobileNet and R-FCN
  • TensorFlow Serving – Transformer-LT (Official)

Bug fixes:

  • Fixed MLPerf GNMT with TensorFlow 2.x
  • Fixed SSD-ResNet34 FP32 Inference libGL.so.1 import error
  • Fixed unit tests and linting
  • Several minor bug fixes and improvements.

DL frameworks (TensorFlow):

Intel Model Zoo v1.8.0 is validated on the following TensorFlow versions:

  • Intel Optimizations for TensorFlow v2.2
  • Intel Optimizations for TensorFlow Serving v2.2

Supported Configurations:

Intel Model Zoo v1.8.0 is validated on the following environment:

  • Ubuntu 18.04 LTS
  • Python 3.6
  • Docker Server v19+
  • Docker Client v18+

Release v1.6.1

22 May 04:03

New functionality:

  • Added experimental BFloat16 support for the following models:
    • ResNet50 v1.5 Training & Inference
    • BERT-Large (SQuAD) Training & Inference
    • SSD-ResNet34 Training
    • Transformer LT Training
  • Added multi-instance training support for the following models (see the sketch after this list):
    • ResNet50 v1.5 Training
    • BERT-Large (SQuAD) Training
    • SSD-ResNet34 Training
    • Transformer LT Training
  • Added the following new FP32 model scripts:
    • BERT-Large (SQuAD) Training & Inference
    • SSD-ResNet34 Training
    • Transformer LT Training
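
A rough multi-instance training sketch, assuming launch_benchmark.py exposes the MPI-based option as --mpi_num_processes (the flag name and values here are assumptions; each model's README in the repository documents the exact options):

$ python launch_benchmark.py \
    --model-name resnet50v1_5 \
    --precision bfloat16 \
    --mode training \
    --framework tensorflow \
    --mpi_num_processes=2 \
    --data-location /path/to/imagenet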

Bug fixes:

  • Fixed a ResNet50 v1.5 training issue when rerunning in the same workspace without deleting checkpoints
  • Several minor bug fixes and improvements

Validated Configurations

Model Zoo v1.6.1 is validated on the following operating system:

  • Ubuntu 18.04

Release 1.6.0

22 Apr 01:04

New functionality:

  • Support Intel® Optimized TensorFlow v2.1.0 (most models are also compatible with Intel® Optimized TensorFlow v2.0.0)
  • Port 13 supported models to the TF 2.0 API
  • Add the management of pre-trained models with version control and update some INT8 pre-trained models
  • Add multi-instance training mode support for the following models:
    • ResNet50 v1.5
  • Add MiniGo 9x9 support
  • Update the MobileNet FP32 frozen graph (.pb) for better performance
  • Add batch parallel support for Wide_Deep_Large_DS
  • Enhance bare metal benchmark support
  • Align the default container image with the TF 2.1.0 image in the intel namespace

Bug fixes:

  • Fixed SSD-MobileNet to run on TF 2.1.0
  • Fixed the MLPerf-GNMT tensorflow-addons issue
  • Fixed some security issues found by bandit scan
  • Fixed Pillow CVE-2019-19911 and CVE-2020-5313

Supported Models

Use Case             | Model                     | Mode      | Precision
---------------------|---------------------------|-----------|-----------
Image Recognition    | DenseNet169               | Inference | FP32
Image Recognition    | Inception V3              | Inference | Int8, FP32
Image Recognition    | Inception V4              | Inference | Int8, FP32
Image Recognition    | MobileNet V1*             | Inference | Int8, FP32
Image Recognition    | ResNet 101                | Inference | Int8, FP32
Image Recognition    | ResNet 50                 | Inference | Int8, FP32
Image Recognition    | ResNet 50v1.5*            | Inference | Int8, FP32
Image Recognition    | ResNet 50v1.5*            | Training  | FP32
Reinforcement        | MiniGo                    | Training  | FP32
Language Translation | GNMT*                     | Inference | FP32
Language Translation | Transformer_LT_Official   | Inference | FP32
Object Detection     | R-FCN                     | Inference | Int8, FP32
Object Detection     | SSD-MobileNet*            | Inference | Int8, FP32
Object Detection     | SSD-ResNet34*             | Inference | Int8, FP32
Recommendation       | Wide & Deep Large Dataset | Inference | Int8, FP32
Recommendation       | Wide & Deep               | Inference | FP32

Validated Configurations

Model Zoo v1.6.0 was validated on the following operating systems:

  • Ubuntu 18.04, 16.04
  • CentOS 7.6

Release 1.5.0

08 Jan 16:43

New functionality:

  • Support Intel® Optimized TensorFlow v1.15.0 (most models are also compatible with Intel® Optimized TensorFlow v1.14.0)

  • Add the management of pre-trained models with version control and update some INT8 pre-trained models

  • Enable inference accuracy and performance testing for Faster-RCNN & R-FCN with Python 3

  • Add training mode support for the following models:
    • GNMT
    • SSD-ResNet34
    • Wide&Deep_Large_Dataset

  • Enhance bare metal benchmark support

  • Update Wide&Deep_Large_Dataset with large feature column optimization

Bug fixes:

  • Fix InceptionV3, ResNet101, ResNet50, and ResNet50v1.5 to run on TF 1.14
  • Fix python dependency requirements for SSD-ResNet34, Faster-RCNN, and R-FCN
  • Fix MTCC bare metal test issues
  • Fix GNMT FP32 inference issues
  • Fix Mask-RCNN inference issues
  • Fix some code format check issues
  • Fix some security issues found by bandit scan
  • Correct TensorFlow session config for MobileNet-v1 benchmark
  • Fix Pillow version

Other changes:

  • Change the FP32 performance tests of R-FCN and Faster-RCNN from checkpoints to frozen graphs (.pb)
  • Remove TensorFlow Server partial support
  • Remove models that were failing because of TensorFlow API changes

Release 1.4.1

09 Jul 22:47

Bug Fixes

  • Fixed Pillow version used by unet model.
  • Updated unet model documentation to correct checkpoint path.
  • Fixed libsnd dependency issue for wavenet model.

Release 1.4.0

03 Jul 20:13

New scripts:

  • lm-1b FP32 inference
  • MobileNet V1 Int8 inference
  • DenseNet 169 FP32 inference
  • SSD-VGG16 FP32 and Int8 inference
  • SSD-ResNet34 Int8 inference
  • ResNet50 v1.5 FP32 and Int8 inference
  • Inception V3 FP32 inference using TensorFlow Serving

Other script changes and bug fixes:

  • Updated SSD-MobileNet accuracy script to take a full path to the coco_val.records, rather than a directory
  • Added a deprecation warning for using checkpoint files
  • Changed Inception ResNet V2 FP32 to use a frozen graph rather than checkpoints
  • Added support for custom volume mounts when running with docker (see the sketch after this list)
  • Moved model default env var configs to config.json files
  • Added support for dummy data with MobileNet V1 FP32
  • Added support for TCMalloc (enabled by default for int8 models)
  • Updated model zoo unit test to use json files for model parameters
  • Made the reference file optional for Transformer LT performance testing
  • Added iteration time to accuracy scripts
  • Updated Transformer LT Official to support num_inter and num_intra threads
  • Fixed path to the calibration script for ResNet101 Int8
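
A sketch of the custom volume mount usage noted above, assuming the option is exposed on launch_benchmark.py as a repeatable --volume flag taking src:dst pairs (the flag name and the values are assumptions; the benchmarks documentation in the repository is authoritative):

$ python launch_benchmark.py \
    --model-name ssd-mobilenet \
    --precision fp32 \
    --mode inference \
    --framework tensorflow \
    --docker-image gcr.io/deeplearning-platform-release/tf-cpu.1-14 \
    --volume /host/coco_dataset:/dataset \
    --volume /host/output:/output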

New tutorials:

  • Transformer LT inference using TensorFlow
  • Transformer LT inference using TensorFlow Serving
  • ResNet50 Int8 inference using TensorFlow Serving
  • SSD-MobileNet inference using TensorFlow Serving

Documentation updates:

  • Added Contribute.md doc with instructions on adding new models
  • Added note about setting environment variables when running on bare metal
  • Updated model README files to use TensorFlow 1.14.0 docker images (except for Wide and Deep int8)
  • Updated FasterRCNN Int8 README file to clarify that performance testing uses raw images
  • Fixed docker build command in the TensorFlow Serving Installation Guide
  • Updated NCF documentation to remove a line of code that causes an error
  • Updated mlperf/inference branch and paths in README file

Known issues:

  • RFCN FP32 accuracy is not working with the gcr.io/deeplearning-platform-release/tf-cpu.1-14 docker image
  • The TensorFlow Serving Installation Guide still shows example commands that build version 1.13. This will be updated to 1.14 when the official TensorFlow Serving release tag exists. To build version 1.14 now, you can use one of the following values for TF_SERVING_VERSION_GIT_BRANCH in your multi-stage docker build: "1.14.0-rc0" or "r1.14".
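
For example, a build along these lines (the Dockerfile path follows the upstream TensorFlow Serving repository; the image tag is a placeholder):

$ git clone https://github.com/tensorflow/serving
$ cd serving
$ docker build --pull \
    --build-arg TF_SERVING_VERSION_GIT_BRANCH="r1.14" \
    -t tensorflow-serving-devel:1.14 \
    -f tensorflow_serving/tools/docker/Dockerfile.devel .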

v1.3.1

30 May 22:35

Revised language regarding performance expectations.