optimization/quantization added into 500 series (#2197)
Co-authored-by: Samet Akcay <samet.akcay@intel.com>
paularamo and samet-akcay committed Aug 22, 2024
1 parent cfd3d8e commit 775f5a8
Showing 1 changed file with 39 additions and 0 deletions.
@@ -437,6 +437,45 @@
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"For optimization and quantization process, we are using a seamless integration with [NNCF Library](https://github.com/openvinotoolkit/nncf) in the backend of Anomalib. Select one of the following options for optimization or quantization. Replace the openvino_model_path line above in order to export the optimized/quantized model:\n",
"\n",
"```\n",
"# Exporting optimized/quantized models\n",
"\n",
"# Post Training Quantization\n",
"openvino_model_path = engine.export(\n",
" model, \n",
" ExportType.OPENVINO, \n",
" str(Path.cwd()) + \"_optimized\", \n",
" compression_type=CompressionType.INT8_PTQ, \n",
" datamodule=datamodule\n",
" )\n",
"\n",
"# Accuracy-Control Quantization\n",
"openvino_model_path=engine.export(\n",
" model, \n",
" ExportType.OPENVINO, \n",
" str(Path.cwd()) + \"_optimized\", \n",
" compression_type=CompressionType.INT8_ACQ, \n",
" datamodule=datamodule, \n",
" metric=\"F1Score\"\n",
" )\n",
"\n",
"# Weight Compression\n",
"openvino_model_path=engine.export(\n",
" model, \n",
" ExportType.OPENVINO, \n",
" str(Path.cwd()) + \"_WEIGHTS\", \n",
" compression_type=CompressionType.FP16, \n",
" datamodule=datamodule\n",
" )\n",
"```"
]
},
{
"attachments": {},
"cell_type": "markdown",