Commit
Merge branch 'releases/2024/3' into xufang/enable_CFG_for_tbb
peterchen-intel authored Oct 7, 2024
2 parents 9860eb6 + 49400d7 commit 740581d
Showing 412 changed files with 28,676 additions and 41,464 deletions.
16 changes: 9 additions & 7 deletions .github/CODEOWNERS
@@ -116,16 +116,18 @@
# Documentation
/docs/ @openvinotoolkit/openvino-docs-maintainers
/docs/CMakeLists.txt @openvinotoolkit/openvino-ie-maintainers
/**/*.rst @openvinotoolkit/openvino-docs-maintainers
/**/*.md @openvinotoolkit/openvino-docs-maintainers
/**/*.svg @openvinotoolkit/openvino-docs-maintainers
/docs/MO_DG/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-mo-maintainers
/docs/OV_Runtime_UG/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-maintainers
/docs/IE_PLUGIN_DG/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-maintainers
/docs/Extensibility_UG/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-maintainers
/docs/openvino-workflow/model-preparation/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ovc-maintainers
/docs/openvino-workflow/running-inference/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-maintainers
/docs/openvino-extensibility/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-maintainers
/docs/snippets/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-maintainers
/docs/OV_Runtime_UG/supported_plugins/ARM_CPU.md @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino_contrib-arm_plugin-maintainers
/docs/OV_Runtime_UG/supported_plugins/CPU.md @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-cpu-maintainers
/docs/OV_Runtime_UG/supported_plugins/GPU*.md @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-gpu-maintainers
/docs/articles_en/assets/snippets/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-maintainers
/docs/openvino-workflow/running-inference/inference-devices-and-modes/cpu-device.rst @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-cpu-maintainers
/docs/openvino-workflow/running-inference/inference-devices-and-modes/cpu-device/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-cpu-maintainers
/docs/openvino-workflow/running-inference/inference-devices-and-modes/gpu-device.rst @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-gpu-maintainers
/docs/openvino-workflow/running-inference/inference-devices-and-modes/gpu-device/ @openvinotoolkit/openvino-docs-maintainers @openvinotoolkit/openvino-ie-gpu-maintainers

# Configuration management
/**/setup.py @openvinotoolkit/openvino-configuration-mgmt
2 changes: 1 addition & 1 deletion .github/dockerfiles/docker_tag
@@ -1 +1 @@
pr-25303
pr-26077
9 changes: 8 additions & 1 deletion .github/workflows/code_snippets.yml
@@ -46,4 +46,11 @@ jobs:
run: cmake -DCMAKE_BUILD_TYPE=Release -DTHREADING=SEQ -B build

- name: Build snippets
run: cmake --build build --target openvino_docs_snippets --parallel
if: ${{ runner.os == 'Linux' || runner.os == 'macOS'}}
run: cmake --build build --target openvino_docs_snippets --parallel $(nproc)

- name: Build snippets Windows
if: ${{ runner.os == 'Windows'}}
shell: pwsh
run: cmake --build build --target openvino_docs_snippets --parallel $ENV:NUMBER_OF_PROCESSORS

3 changes: 3 additions & 0 deletions CMakeLists.txt
@@ -92,6 +92,9 @@ endif()
if(NOT OV_LIBC_VERSION VERSION_EQUAL 0.0)
message (STATUS "LIBC_VERSION .......................... " ${OV_LIBC_VERSION})
endif()
if(DEFINED OPENVINO_STDLIB)
message (STATUS "STDLIB ................................ " ${OPENVINO_STDLIB})
endif()

# remove file with exported targets to force its regeneration
file(REMOVE "${CMAKE_BINARY_DIR}/OpenVINOTargets.cmake")
13 changes: 13 additions & 0 deletions README.md
@@ -115,6 +115,7 @@ You can ask questions and get support on:
* OpenVINO channels on the [Intel DevHub Discord server](https://discord.gg/7pVRxUwdWG).
* The [`openvino`](https://stackoverflow.com/questions/tagged/openvino) tag on Stack Overflow\*.


## Additional Resources

* [Product Page](https://software.intel.com/content/www/us/en/develop/tools/openvino-toolkit.html)
@@ -123,6 +124,18 @@ You can ask questions and get support on:
* [OpenVINO™ toolkit on Medium](https://medium.com/@openvino)


## Telemetry

OpenVINO™ collects software performance and usage data for the purpose of improving OpenVINO™ tools.
This data is collected directly by OpenVINO™ or through the use of Google Analytics 4.
You can opt out at any time by running the command:

``` bash
opt_in_out --opt_out
```

More information is available at [OpenVINO™ Telemetry](https://docs.openvino.ai/2024/about-openvino/additional-resources/telemetry.html).

## License

OpenVINO™ Toolkit is licensed under [Apache License Version 2.0](LICENSE).
2 changes: 2 additions & 0 deletions cmake/developer_package/options.cmake
@@ -8,6 +8,8 @@ if(POLICY CMP0127)
cmake_policy(SET CMP0127 NEW)
endif()

unset(OV_OPTIONS CACHE)

macro(ov_option variable description value)
option(${variable} "${description}" ${value})
list(APPEND OV_OPTIONS ${variable})
6 changes: 4 additions & 2 deletions cmake/developer_package/packaging/archive.cmake
@@ -102,5 +102,7 @@ ov_define_component_include_rules()

# New in version 3.18
set(CPACK_ARCHIVE_THREADS 8)
# multiple packages are generated
set(CPACK_ARCHIVE_COMPONENT_INSTALL ON)
# multiple packages are generated by default
if(NOT DEFINED CPACK_ARCHIVE_COMPONENT_INSTALL)
set(CPACK_ARCHIVE_COMPONENT_INSTALL ON)
endif()
23 changes: 20 additions & 3 deletions cmake/developer_package/target_flags.cmake
@@ -119,6 +119,23 @@ get_property(OV_GENERATOR_MULTI_CONFIG GLOBAL PROPERTY GENERATOR_IS_MULTI_CONFIG

function(ov_detect_libc_type)
include(CheckCXXSourceCompiles)
check_cxx_source_compiles("
# include <string>
# ifndef _GLIBCXX_USE_CXX11_ABI
# error \"GlibCXX ABI is not defined\"
# endif
int main() {
return 0;
}"
OPENVINO_STDLIB_GNU)

if(OPENVINO_STDLIB_GNU)
set(OPENVINO_STDLIB "GNU" PARENT_SCOPE)
else()
set(OPENVINO_STDLIB "CPP" PARENT_SCOPE)
endif()

check_cxx_source_compiles("
# ifndef _GNU_SOURCE
# define _GNU_SOURCE
@@ -140,9 +157,9 @@
int main() {
return 0;
}"
OPENVINO_MUSL_LIBC)
OPENVINO_GLIBC_MUSL)

if(OPENVINO_MUSL_LIBC)
if(OPENVINO_GLIBC_MUSL)
set(OPENVINO_MUSL_LIBC ON PARENT_SCOPE)
else()
set(OPENVINO_GNU_LIBC ON PARENT_SCOPE)
@@ -213,7 +230,7 @@ ov_libc_version()
# Detects default value for _GLIBCXX_USE_CXX11_ABI for current compiler
#
macro(ov_get_glibcxx_use_cxx11_abi)
if(LINUX)
if(LINUX AND OPENVINO_STDLIB STREQUAL "GNU")
ov_get_compiler_definition("_GLIBCXX_USE_CXX11_ABI" OV_GLIBCXX_USE_CXX11_ABI)
endif()
endmacro()
4 changes: 4 additions & 0 deletions cmake/extra_modules.cmake
@@ -119,6 +119,10 @@ function(_ov_register_extra_modules)
add_library(${NS}::${exported_target_clean_name} ALIAS ${exported_target})
endif()\n")
endforeach()

configure_file("${OpenVINO_SOURCE_DIR}/cmake/templates/OpenVINOConfig-version.cmake.in"
"${OpenVINODeveloperPackage_DIR}/OpenVINODeveloperPackageConfig-version.cmake"
@ONLY)
endfunction()

_ov_generate_fake_developer_package("openvino")
5 changes: 2 additions & 3 deletions cmake/features.cmake
@@ -34,9 +34,8 @@ endif()

ov_dependent_option (ENABLE_INTEL_GPU "GPU OpenCL-based plugin for OpenVINO Runtime" ${ENABLE_INTEL_GPU_DEFAULT} "X86_64 OR AARCH64;NOT APPLE;NOT WINDOWS_STORE;NOT WINDOWS_PHONE" OFF)

if (ANDROID OR MINGW OR (CMAKE_COMPILER_IS_GNUCXX AND CMAKE_CXX_COMPILER_VERSION VERSION_LESS 7.0) OR (NOT BUILD_SHARED_LIBS AND ENABLE_INTEL_CPU))
# oneDNN doesn't support old compilers and android builds for now, so we'll build GPU plugin without oneDNN
# also, in case of static build CPU's and GPU's oneDNNs will conflict, so we are disabling GPU's one in this case
if (ANDROID OR MINGW OR (CMAKE_COMPILER_IS_GNUCXX AND CMAKE_CXX_COMPILER_VERSION VERSION_LESS 7.0))
# oneDNN doesn't support old compilers and Android builds for now, so we'll build GPU plugin without oneDNN
set(ENABLE_ONEDNN_FOR_GPU_DEFAULT OFF)
else()
set(ENABLE_ONEDNN_FOR_GPU_DEFAULT ON)
17 changes: 8 additions & 9 deletions docs/articles_en/about-openvino/compatibility-and-support.rst
@@ -7,18 +7,17 @@ Compatibility and Support
:hidden:

compatibility-and-support/supported-devices
compatibility-and-support/supported-operations
compatibility-and-support/supported-models
compatibility-and-support/supported-operations-inference-devices
compatibility-and-support/supported-operations-framework-frontend


:doc:`Supported Devices <compatibility-and-support/supported-devices>` - compatibility information for supported hardware accelerators.

:doc:`Supported Models <compatibility-and-support/supported-models>` - a list of selected models confirmed to work with given hardware.

:doc:`Supported Operations <compatibility-and-support/supported-operations-inference-devices>` - a listing of framework layers supported by OpenVINO.

:doc:`Supported Operations <compatibility-and-support/supported-operations-framework-frontend>` - a listing of layers supported by OpenVINO inference devices.
| :doc:`Supported Devices <compatibility-and-support/supported-devices>`:
| compatibility information for supported hardware accelerators.
| :doc:`Supported Operations <compatibility-and-support/supported-operations>`:
| a listing of operations supported by OpenVINO, based on device and frontend conformance.
| :doc:`AI Models verified for OpenVINO™ <compatibility-and-support/supported-models>`:
| a list of selected models confirmed to work with Intel® Core Ultra™ Processors with the
OpenVINO™ toolkit.
@@ -1,12 +1,12 @@
Supported Inference Devices
============================
Supported Devices
===============================================================================================

.. meta::
:description: Check the list of devices used by OpenVINO to run inference
of deep learning models.


The OpenVINO™ runtime enables you to use a selection of devices to run your
The OpenVINO™ runtime enables you to use the following devices to run your
deep learning models:
:doc:`CPU <../../openvino-workflow/running-inference/inference-devices-and-modes/cpu-device>`,
:doc:`GPU <../../openvino-workflow/running-inference/inference-devices-and-modes/gpu-device>`,
@@ -18,16 +18,20 @@ deep learning models:
Beside running inference with a specific device,
OpenVINO offers the option of running automated inference with the following inference modes:

* :doc:`Automatic Device Selection <../../openvino-workflow/running-inference/inference-devices-and-modes/auto-device-selection>` - automatically selects the best device
available for the given task. It offers many additional options and optimizations, including inference on
multiple devices at the same time.
* :doc:`Heterogeneous Inference <../../openvino-workflow/running-inference/inference-devices-and-modes/hetero-execution>` - enables splitting inference among several devices
automatically, for example, if one device doesn't support certain operations.
* :doc:`(LEGACY) Multi-device Inference <./../../documentation/legacy-features/multi-device>` - executes inference on multiple devices.
Currently, this mode is considered a legacy solution. Using Automatic Device Selection is advised.
* :doc:`Automatic Batching <../../openvino-workflow/running-inference/inference-devices-and-modes/automatic-batching>` - automatically groups inference requests to improve
device utilization.
| :doc:`Automatic Device Selection <../../openvino-workflow/running-inference/inference-devices-and-modes/auto-device-selection>`:
| automatically selects the best device available for the given task. It offers many
additional options and optimizations, including inference on multiple devices at the
same time.
| :doc:`Heterogeneous Inference <../../openvino-workflow/running-inference/inference-devices-and-modes/hetero-execution>`:
| enables splitting inference among several devices automatically, for example, if one device
doesn't support certain operations.
| :doc:`Automatic Batching <../../openvino-workflow/running-inference/inference-devices-and-modes/automatic-batching>`:
| automatically groups inference requests to improve device utilization.
| :doc:`(LEGACY) Multi-device Inference <./../../documentation/legacy-features/multi-device>`:
| executes inference on multiple devices. Currently, this mode is considered a legacy
solution. Using Automatic Device Selection instead is advised.

Feature Support and API Coverage
@@ -36,16 +40,17 @@ Feature Support and API Coverage
======================================================================================================================================== ======= ========== ===========
Supported Feature CPU GPU NPU
======================================================================================================================================== ======= ========== ===========
:doc:`Automatic Device Selection <../../openvino-workflow/running-inference/inference-devices-and-modes/auto-device-selection>` Yes Yes Partial
:doc:`Heterogeneous execution <../../openvino-workflow/running-inference/inference-devices-and-modes/hetero-execution>` Yes Yes No
:doc:`(LEGACY) Multi-device execution <./../../documentation/legacy-features/multi-device>` Yes Yes Partial
:doc:`Automatic batching <../../openvino-workflow/running-inference/inference-devices-and-modes/automatic-batching>` No Yes No
:doc:`Multi-stream execution <../../openvino-workflow/running-inference/optimize-inference/optimizing-throughput>` Yes Yes No
:doc:`Models caching <../../openvino-workflow/running-inference/optimize-inference/optimizing-latency/model-caching-overview>` Yes Partial Yes
:doc:`Model caching <../../openvino-workflow/running-inference/optimize-inference/optimizing-latency/model-caching-overview>` Yes Partial Yes
:doc:`Dynamic shapes <../../openvino-workflow/running-inference/dynamic-shapes>` Yes Partial No
:doc:`Import/Export <../../documentation/openvino-ecosystem>` Yes Yes Yes
:doc:`Preprocessing acceleration <../../openvino-workflow/running-inference/optimize-inference/optimize-preprocessing>` Yes Yes No
:doc:`Stateful models <../../openvino-workflow/running-inference/stateful-models>` Yes Yes Yes
:doc:`Extensibility <../../documentation/openvino-extensibility>` Yes Yes No
:doc:`(LEGACY) Multi-device execution <./../../documentation/legacy-features/multi-device>` Yes Yes Partial
======================================================================================================================================== ======= ========== ===========


@@ -80,10 +85,10 @@ topic (step 3 "Configure input and output").

.. note::

With OpenVINO 2024.0 release, support for GNA has been discontinued. To keep using it
With the OpenVINO 2024.0 release, support for GNA has been discontinued. To keep using it
in your solutions, revert to the 2023.3 (LTS) version.

With OpenVINO™ 2023.0 release, support has been cancelled for:
With the OpenVINO™ 2023.0 release, support has been cancelled for:
- Intel® Neural Compute Stick 2 powered by the Intel® Movidius™ Myriad™ X
- Intel® Vision Accelerator Design with Intel® Movidius™

@@ -18,14 +18,12 @@ models from OpenVINO-supported frameworks may also work properly but have not be
:file: ../../_static/download/supported_models.csv


Check marks indicate models that passed inference with no errors. Empty cells indicate models
that were not tested. No failing runs producing an error have been recorded.

In the precision column, the "optimum-intel default" label corresponds to FP32 for small models
and INT8 for models greater than 1B parameters.


| Note:
| Marked cells indicate models that passed inference with no errors. Empty cells indicate
models that were not tested. No failing runs producing an error have been recorded.
|
| In the precision column, the "optimum-intel default" label corresponds to FP32 for small models
and INT8 for models greater than 1B parameters.
|
| The results as of June 17 2024, for OpenVINO version 2024.2.
| The models come from different public model repositories, such as Pytorch Model Zoo and
HuggingFace; they were executed on the designated hardware with OpenVINO either natively or

0 comments on commit 740581d
