Bell/fix onnx multi output failure #10

Open · wants to merge 20 commits into base: river/fix_segmentfault_plugin_api_2.0

Commits (20)
0baadc5 change default value (songbell, Nov 21, 2022)
8eb2e47 Merge branch 'master' of https://github.com/openvinotoolkit/openvino (songbell, Nov 24, 2022)
5aa4e7b Merge branch 'master' of https://github.com/openvinotoolkit/openvino (songbell, Nov 29, 2022)
8438661 Merge branch 'master' of https://github.com/openvinotoolkit/openvino (songbell, Dec 5, 2022)
6b6484b Merge branch 'master' of https://github.com/openvinotoolkit/openvino (songbell, Dec 27, 2022)
b956852 fix case failure (songbell, Dec 27, 2022)
fc85ad3 Merge branch 'master' of https://github.com/openvinotoolkit/openvino (songbell, Feb 28, 2023)
00141b6 Merge branch 'master' of https://github.com/openvinotoolkit/openvino (songbell, Mar 9, 2023)
d2c2f29 Merge branch 'master' of https://github.com/openvinotoolkit/openvino (songbell, Jun 21, 2023)
f74f417 fix post commit failure (songbell, Jun 21, 2023)
968d0b9 Merge branch 'master' of https://github.com/openvinotoolkit/openvino (songbell, Jun 29, 2023)
62a5d72 fix sdl issues (songbell, Jun 29, 2023)
a0228a5 Merge branch 'master' into bell/fix_sdl_issues (songbell, Jun 29, 2023)
6be030b Fixed SpaceToBatch and BatchToSpace for 3d case (#18033) (steve-y, Jul 3, 2023)
cb8d34d [DOCS] adjustments for ST and cookie policy (#18315) (kblaszczak-intel, Jul 3, 2023)
36fedf8 test onnx failures, reuse port tensors (songbell, Jul 3, 2023)
d0d16b6 Merge branch 'river/fix_segmentfault_plugin_api_2.0' of https://githu… (songbell, Jul 3, 2023)
28bc478 Merge branch 'master' of https://github.com/openvinotoolkit/openvino … (songbell, Jul 3, 2023)
efaaa58 do not introduce new name (songbell, Jul 3, 2023)
e846c4b clang (songbell, Jul 3, 2023)
3 changes: 2 additions & 1 deletion docs/_static/css/custom.css
@@ -21,8 +21,9 @@ pre {
white-space: pre-wrap;
word-wrap: break-word;
}
/* cookie wap requirement */
a#wap_dns {display: none;}
/* Sphinx-design tabs override */

.sd-tab-set>input:checked+label {
border-color: var(--sd-color-tabs-underline-inactive);
color: var(--sd-color-info-text)!important;
2 changes: 1 addition & 1 deletion docs/index.rst
@@ -4,7 +4,7 @@ OpenVINO™ Documentation
.. toctree::

home
-Install <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>
+Install <openvino_docs_install_guides_overview>
Blog <https://blog.openvino.ai/>
Forum <https://community.intel.com/t5/Intel-Distribution-of-OpenVINO/bd-p/distribution-openvino-toolkit>
Support <https://www.intel.com/content/www/us/en/support/products/96066/software/development-software/openvino-toolkit.html>
2 changes: 1 addition & 1 deletion docs/install_guides/installing-model-dev-tools.md
@@ -205,7 +205,7 @@ Additional Resources

- `Intel® Distribution of OpenVINO™ toolkit home page <https://software.intel.com/en-us/openvino-toolkit>`__
- For IoT Libraries & Code Samples, see `Intel® IoT Developer Kit <https://github.com/intel-iot-devkit>`__ .
- `OpenVINO Installation Selector Tool <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>`__


@endsphinxdirective

3 changes: 0 additions & 3 deletions docs/install_guides/installing-openvino-apt.md
@@ -273,10 +273,7 @@ You can also try the following:
* Take a glance at the OpenVINO product home page: https://software.intel.com/en-us/openvino-toolkit.


Additional Resources
#######################################

- `OpenVINO Installation Selector Tool <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>`__



3 changes: 0 additions & 3 deletions docs/install_guides/installing-openvino-brew.md
@@ -101,9 +101,6 @@ Now that you've installed OpenVINO Runtime, you can try the following things:
* See sample applications in :doc:`OpenVINO toolkit Samples Overview <openvino_docs_OV_UG_Samples_Overview>`.
* Check out the OpenVINO product home page: https://software.intel.com/en-us/openvino-toolkit.

Additional Resources
####################

* `OpenVINO Installation Selector Tool <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>`__

@endsphinxdirective
1 change: 0 additions & 1 deletion docs/install_guides/installing-openvino-conda.md
@@ -114,7 +114,6 @@ Additional Resources

* `OpenVINO Runtime Conda Forge <https://anaconda.org/conda-forge/openvino>`__
* :doc:`OpenVINO™ Toolkit Samples Overview <openvino_docs_OV_UG_Samples_Overview>`
* `OpenVINO Installation Selector Tool <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>`__


@endsphinxdirective
@@ -309,7 +309,6 @@ Additional Resources
* Sample applications: :doc:`OpenVINO™ Toolkit Samples Overview <openvino_docs_OV_UG_Samples_Overview>`
* Pre-trained deep learning models: :doc:`Overview of OpenVINO™ Toolkit Pre-Trained Models <model_zoo>`
* IoT libraries and code samples in the GitHub repository: `Intel® IoT Developer Kit <https://github.com/intel-iot-devkit>`__
* `OpenVINO Installation Selector Tool <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>`__



@@ -163,7 +163,6 @@ To uninstall the toolkit, follow the steps on the :doc:`Uninstalling page <openv
Additional Resources
####################

* `OpenVINO Installation Selector Tool <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>`__
* :ref:`Troubleshooting Guide for OpenVINO Installation & Configuration <troubleshooting guide for install>`
* Converting models for use with OpenVINO™: :ref:`Model Optimizer User Guide <deep learning model optimizer>`
* Writing your own OpenVINO™ applications: :ref:`OpenVINO™ Runtime User Guide <deep learning openvino runtime>`
@@ -218,7 +218,6 @@ To uninstall OpenVINO, follow the steps on the :doc:`Uninstalling page <openvino
Additional Resources
####################

* `OpenVINO Installation Selector Tool <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>`__
* :ref:`Troubleshooting Guide for OpenVINO Installation & Configuration <troubleshooting guide for install>`
* Converting models for use with OpenVINO™: :ref:`Model Optimizer Developer Guide <deep learning model optimizer>`
* Writing your own OpenVINO™ applications: :ref:`OpenVINO™ Runtime User Guide <deep learning openvino runtime>`
3 changes: 1 addition & 2 deletions docs/install_guides/installing-openvino-linux-header.md
@@ -25,8 +25,7 @@ If you want to install OpenVINO™ Runtime on your Linux machine, these are your
* :doc:`Install OpenVINO Runtime using Homebrew <openvino_docs_install_guides_installing_openvino_brew>`
* :doc:`Install OpenVINO using Docker <openvino_docs_install_guides_installing_openvino_docker_linux>`

For a full selection of distribution channels, see the
`OpenVINO Installation Selector Tool <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>`__



@endsphinxdirective
15 changes: 0 additions & 15 deletions docs/install_guides/installing-openvino-linux.md

This file was deleted.

2 changes: 0 additions & 2 deletions docs/install_guides/installing-openvino-macos-header.md
@@ -20,7 +20,5 @@ If you want to install OpenVINO™ Runtime on macOS, there are a few ways to acc
* :doc:`Install OpenVINO Runtime via Homebrew <openvino_docs_install_guides_installing_openvino_brew>`


For a full selection of distribution channels,
see the `OpenVINO Installation Selector Tool <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>`__

@endsphinxdirective
1 change: 0 additions & 1 deletion docs/install_guides/installing-openvino-macos.md
@@ -12,6 +12,5 @@ Currently only the following ways are provided to install OpenVINO™ on macOS:

The other installation methods are temporarily unavailable.

For a full selection of distribution channels, see the [OpenVINO Installation Selector Tool](https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html)

@endsphinxdirective
22 changes: 7 additions & 15 deletions docs/install_guides/installing-openvino-overview.md
@@ -1,4 +1,4 @@
-# Installing Intel® Distribution of OpenVINO™ Toolkit {#openvino_docs_install_guides_overview}
+# Install Intel® Distribution of OpenVINO™ Toolkit {#openvino_docs_install_guides_overview}

@sphinxdirective

@@ -11,16 +11,6 @@
Build from Source <https://github.com/openvinotoolkit/openvino/blob/master/docs/dev/build.md>
Creating a Yocto Image <openvino_docs_install_guides_installing_openvino_yocto>

Intel® Distribution of OpenVINO™ toolkit is a comprehensive toolkit for developing applications and solutions based on deep learning tasks, such as computer vision, automatic speech recognition, natural language processing, recommendation systems, and more. It provides high-performance and rich deployment options, from edge to cloud. Some of its advantages are:

* Enables CNN-based and transformer-based deep learning inference on the edge or cloud.
* Supports various execution modes across Intel® technologies: Intel® CPU, Intel® Integrated Graphics, Intel® Discrete Graphics, and more.
* Speeds time-to-market via an easy-to-use library of computer vision functions and pre-optimized kernels.
* Compatible with models from a wide variety of frameworks, including TensorFlow, PyTorch, PaddlePaddle, ONNX, and more.


Install OpenVINO
################

.. raw:: html

@@ -29,7 +29,8 @@ Install OpenVINO
<iframe id="selector" src="_static/selector-tool/selector-8db148d.html" style="width: 100%; border: none" title="Download Intel® Distribution of OpenVINO™ Toolkit"></iframe>


-OpenVINO installation package is distributed in two parts: OpenVINO Runtime and OpenVINO Development Tools.
+Distribution channels of OpenVINO may differ slightly, with regard to supported hardware or available APIs (read installation guides for particular distributions for more details).
+Moreover, OpenVINO Runtime and OpenVINO Development Tools offer different sets of tools, as follows:

* **OpenVINO Runtime** contains the core set of libraries for running machine learning model inference on processor devices.
* **OpenVINO Development Tools** is a set of utilities for working with OpenVINO and OpenVINO models. It includes the following tools:
@@ -39,7 +30,8 @@ OpenVINO installation package is distributed in two parts: OpenVINO Runtime and
- Accuracy Checker and Annotation Converter
- Model Downloader and other Open Model Zoo tools

-Option 1. Install OpenVINO Runtime and OpenVINO Development Tools (recommended)
+Install OpenVINO Development Tools (recommended)
+++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++

The best way to get started with OpenVINO is to install OpenVINO Development Tools, which will also install the OpenVINO Runtime Python package as a dependency. Follow the instructions on the :doc:`Install OpenVINO Development Tools <openvino_docs_install_guides_install_dev_tools>` page to install it.
@@ -52,7 +44,7 @@ For developers working in Python, OpenVINO Development Tools can easily be insta

For developers working in C++, the core OpenVINO Runtime libraries must be installed separately. Then, OpenVINO Development Tools can be installed using requirements files or PyPI. See the :ref:`For C++ Developers <openvino_docs_install_guides_install_dev_tools.html#cpp_developers>` section of the Install OpenVINO Development Tools page for instructions.

-Option 2. Install OpenVINO Runtime only
+Install OpenVINO Runtime only
+++++++++++++++++++++++++++++++++++++++

OpenVINO Runtime may also be installed on its own without OpenVINO Development Tools. This is recommended for users who already have an optimized model and want to deploy it in an application that uses OpenVINO for inference on their device. To install OpenVINO Runtime only, follow the instructions on the :doc:`Install OpenVINO Runtime <openvino_docs_install_guides_install_runtime>` page.
@@ -63,7 +55,7 @@ The following methods are available to install OpenVINO Runtime:
* Windows: You can install OpenVINO Runtime using archive files or Docker. See :doc:`Install OpenVINO on Windows <openvino_docs_install_guides_installing_openvino_windows_header>`.
* macOS: You can install OpenVINO Runtime using archive files or Docker. See :doc:`Install OpenVINO on macOS <openvino_docs_install_guides_installing_openvino_macos_header>`.

-Option 3. Build OpenVINO from source
+Build OpenVINO from source
++++++++++++++++++++++++++++++++++++

Source files are also available in the OpenVINO Toolkit GitHub repository. If you want to build OpenVINO from source for your platform, follow the `OpenVINO Build Instructions <https://github.com/openvinotoolkit/openvino/blob/master/docs/dev/build.md>`__ .
1 change: 0 additions & 1 deletion docs/install_guides/installing-openvino-pip.md
@@ -127,6 +127,5 @@ Additional Resources

- Intel® Distribution of OpenVINO™ toolkit home page: https://software.intel.com/en-us/openvino-toolkit
- For IoT Libraries & Code Samples, see `Intel® IoT Developer Kit <https://github.com/intel-iot-devkit>`__.
- `OpenVINO Installation Selector Tool <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>`__

@endsphinxdirective
@@ -198,6 +198,5 @@ Additional Resources
* Sample applications: :ref:`OpenVINO™ Toolkit Samples Overview <code samples>`
* Pre-trained deep learning models: :ref:`Overview of OpenVINO™ Toolkit Pre-Trained Models <model zoo>`
* IoT libraries and code samples in the GitHUB repository: `Intel® IoT Developer Kit <https://github.com/intel-iot-devkit>`__
* :ref:`OpenVINO Installation Selector Tool <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>`

@endsphinxdirective
2 changes: 0 additions & 2 deletions docs/install_guides/installing-openvino-windows-header.md
@@ -19,7 +19,5 @@ If you want to install OpenVINO™ Runtime on Windows, you have the following op
* :doc:`Install OpenVINO Runtime using Conda Forge <openvino_docs_install_guides_installing_openvino_conda>`
* :doc:`Install OpenVINO using Docker <openvino_docs_install_guides_installing_openvino_docker_linux>`

For a full selection of distribution channels,
see the `OpenVINO Installation Selector Tool <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>`__

@endsphinxdirective
2 changes: 1 addition & 1 deletion docs/install_guides/installing-openvino-windows.md
@@ -9,4 +9,4 @@ Currently only the following ways are provided to install OpenVINO™:

The other installation methods are temporarily unavailable.

For a full selection of distribution channels, see the [OpenVINO Installation Selector Tool](https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html)

2 changes: 1 addition & 1 deletion docs/install_guides/installing-openvino-yocto.md
@@ -123,7 +123,7 @@ Additional Resources
- `Meta-intel <https://git.yoctoproject.org/meta-intel/tree/README>`__
- `Meta-openembedded <http://cgit.openembedded.org/meta-openembedded/tree/README>`__
- `Meta-clang <https://github.com/kraj/meta-clang/tree/master/#readme>`__
- `OpenVINO Installation Selector Tool <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>`__


@endsphinxdirective

5 changes: 0 additions & 5 deletions docs/install_guides/installing-openvino-yum.md
@@ -240,11 +240,6 @@ You can also try the following things:



Additional Resources
#####################

- `OpenVINO Installation Selector Tool <https://www.intel.com/content/www/us/en/developer/tools/openvino-toolkit/download.html>`__


@endsphinxdirective

@@ -3,5 +3,6 @@
<a href="https://www.intel.com/content/www/us/en/legal/terms-of-use.html" alt="terms of use">Terms of Use</a>
<a href="https://www.intel.com/content/www/us/en/privacy/intel-cookie-notice.html" data-cookie-notice="true" alt="cookies policy">Cookies</a>
<a href="https://www.intel.com/content/www/us/en/privacy/intel-privacy-notice.html" alt="Privacy">Privacy</a>
<a data-wap_ref="dns" id="wap_dns" href="https://www.intel.com/content/www/us/en/privacy/intel-cookienotice.html">Do Not Share My Personal Information</a>
</p>
<p style="font-size: 0.8em">Intel, the Intel logo, and other Intel marks are trademarks of Intel Corporation or its subsidiaries. Other names and brands may be claimed as the property of others.</p>
4 changes: 2 additions & 2 deletions docs/resources/release_notes.md
@@ -4,15 +4,15 @@

.. raw:: html

-<meta http-equiv="Refresh" content="0; url='https://software.intel.com/content/www/us/en/develop/articles/openvino-relnotes.html'" />
+<meta http-equiv="Refresh" content="0; url='https://www.intel.com/content/www/us/en/developer/articles/release-notes/openvino/2023-0.html'" />


.. toctree::
:hidden:

prerelease_information

-The official OpenVINO Release Notes are published at `intel.com <https://software.intel.com/content/www/us/en/develop/articles/openvino-relnotes.html>`__
+The official OpenVINO Release Notes are published at `www.intel.com <https://www.intel.com/content/www/us/en/developer/articles/release-notes/openvino/2023-0.html>`__


@endsphinxdirective
2 changes: 1 addition & 1 deletion src/core/src/op/space_to_batch.cpp
@@ -102,7 +102,7 @@ bool ngraph::op::v1::SpaceToBatch::evaluate_space_to_batch(const HostTensorVecto

auto data_shape = data->get_shape();

-if (!(data->get_shape().size() == 4 || data->get_shape().size() == 5)) {
+if (!(data->get_shape().size() == 3 || data->get_shape().size() == 4 || data->get_shape().size() == 5)) {
return false;
}

4 changes: 2 additions & 2 deletions src/inference/src/dev/icompiled_model.cpp
@@ -59,7 +59,7 @@ ov::ICompiledModel::ICompiledModel(const std::shared_ptr<const ov::Model>& model
" Please use MO to generate new IR version, it should allow to avoid the issue");
leaf_names.insert(param_name);
}
-param->output(0).get_tensor().set_names({param_name});
+// param->output(0).get_tensor().set_names({param_name});
new_param->set_element_type(param->get_element_type());
new_param->set_layout(param->get_layout());
new_param->output(0).get_rt_info() = param->output(0).get_rt_info();
@@ -83,7 +83,7 @@ ov::ICompiledModel::ICompiledModel(const std::shared_ptr<const ov::Model>& model
" Please use MO to generate new IR version, it should allow to avoid the issue");
leaf_names.insert(res_name);
}
-new_result->output(0).get_tensor().add_names({res_name});
+// new_result->output(0).get_tensor().add_names({res_name});
auto r = std::dynamic_pointer_cast<ov::op::v0::Result>(new_result);
OPENVINO_ASSERT(r, "Internal error. set outputs failure casting output copy to Result");
r->set_layout(result->get_layout());
4 changes: 2 additions & 2 deletions src/inference/src/dev/isync_infer_request.cpp
@@ -263,10 +263,10 @@ void ov::ISyncInferRequest::allocate_tensor(const ov::Output<const ov::Node>& po
void ov::ISyncInferRequest::check_tensors() const {
const auto& inputs = m_compiled_model->inputs();
for (size_t i = 0; i < inputs.size(); i++) {
-check_tensor(inputs[i], m_tensors.at(inputs[i].get_tensor_ptr()));
+check_tensor(inputs[i], get_ref_tensor(inputs[i]));
Owner commented:

    get_ref_tensor() can avoid the tensor mismatch issue, but shouldn't m_tensors.at(inputs[i].get_tensor_ptr()) itself be free of problems?

Owner commented:

    We can see that the compiled models are different:

    1. Original model: [image]
    2. If the input/output ports are not created, the final compiled model: [image]
    3. If the input/output ports are created, as done in this PR, the final compiled model stays unchanged after the transformation: [image]

    It seems that creating the input/output ports affects the transformation, which is surprising.

Author commented:

    Did you remove the lines of set_names?
}
const auto& outputs = m_compiled_model->outputs();
for (size_t i = 0; i < outputs.size(); i++) {
-check_tensor(outputs[i], m_tensors.at(outputs[i].get_tensor_ptr()));
+check_tensor(outputs[i], get_ref_tensor(outputs[i]));
}
}
2 changes: 1 addition & 1 deletion src/plugins/intel_gpu/src/plugin/ops/batch_to_space.cpp
@@ -31,7 +31,7 @@ static void CreateBatchToSpaceOp(Program& p, const std::shared_ptr<ngraph::op::v

std::vector<int32_t> sizes = inConst->cast_vector<int32_t>();
int32_t default_size = i == 1 ? 1 : 0;
-for (size_t s = sizes.size(); s < rank; s++) {
+for (size_t s = sizes.size(); s < format.dimension(); s++) {
sizes.push_back(default_size);
}
tensor_inputs.emplace_back(format, sizes, default_size);
6 changes: 3 additions & 3 deletions src/plugins/intel_gpu/src/plugin/ops/space_to_batch.cpp
@@ -31,7 +31,7 @@ static void CreateSpaceToBatchOp(Program& p, const std::shared_ptr<ngraph::op::v

std::vector<int32_t> sizes = inConst->cast_vector<int32_t>();
int32_t default_size = i == 1 ? 1 : 0;
-for (size_t s = sizes.size(); s < rank; s++) {
+for (size_t s = sizes.size(); s < format.dimension(); s++) {
sizes.push_back(default_size);
}
tensor_inputs.emplace_back(format, sizes, default_size);
@@ -41,14 +41,14 @@ static void CreateSpaceToBatchOp(Program& p, const std::shared_ptr<ngraph::op::v
// To be removed once we enable internal shape infer for all operations
auto out_size = output_pshape.is_static() ? tensor_from_dims(output_pshape.to_shape()) : cldnn::tensor();

-auto batchToSpacePrim = cldnn::space_to_batch(layerName,
+auto spaceToBatchPrim = cldnn::space_to_batch(layerName,
inputs[0], // input
tensor_inputs[0], // block_shape
tensor_inputs[1], // crops_begin
tensor_inputs[2], // crops_end
out_size);

-p.add_primitive(*op, batchToSpacePrim);
+p.add_primitive(*op, spaceToBatchPrim);
}

REGISTER_FACTORY_IMPL(v1, SpaceToBatch);