
ONNX model serving: broadcasting error in Where node. #1670

Closed
LoicDagnas opened this issue Aug 13, 2021 · 2 comments · Fixed by #1676

Comments

@LoicDagnas
Contributor

Describe the bug
After the resolution of this issue, I am now able to convert my two models to ONNX format. However, one of them loads correctly but is not servable.

It gives the following error:

onnxruntime.capi.onnxruntime_pybind11_state.RuntimeException: [ONNXRuntimeError] : 6 : RUNTIME_EXCEPTION : Non-zero status code returned while running Where node. Name:'Where__462' Status Message: D:\a\_work\1\s\onnxruntime\core/providers/cpu/math/element_wise_ops.h:497 onnxruntime::BroadcastIterator::Init axis == 1 || axis == largest was false. Attempting to broadcast an axis by a dimension other than 1. 2 by 3
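For context, ONNX Where follows numpy-style multidirectional broadcasting: along each axis, the two sizes must either be equal or one of them must be 1. A minimal numpy sketch (not taken from the model, just an illustration of the rule the runtime is enforcing) reproduces the same kind of "2 by 3" mismatch:

```python
import numpy as np

# Compatible case: a size-1 axis stretches to match the other operand.
cond = np.array([[True], [False]])   # shape (2, 1)
x = np.ones((2, 3))
y = np.zeros((2, 3))
out = np.where(cond, x, y)           # cond broadcasts (2, 1) -> (2, 3)
assert out.shape == (2, 3)

# Incompatible case: a size-2 axis cannot broadcast against a size-3
# axis -- analogous to the "2 by 3" error reported by the Where node.
try:
    np.where(np.ones(2, dtype=bool), np.ones(3), np.zeros(3))
except ValueError as e:
    print("broadcast error:", e)
```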

System information

  • OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Windows 10.0.19042
  • Tensorflow Version: 2.5.0
  • Python version: 3.7.6

To Reproduce
You'll find the saved_model and the ONNX conversion in the archive dropped here

To reproduce the error above, I just run:

import numpy as np
import onnxruntime

sess_options = onnxruntime.SessionOptions()
sess_options.graph_optimization_level = onnxruntime.GraphOptimizationLevel.ORT_DISABLE_ALL
session = onnxruntime.InferenceSession('path/to/model.onnx', sess_options, providers=["CPUExecutionProvider"])

# onnxruntime expects numpy arrays as inputs; int64 is assumed for the id inputs
input_feed = {
    "categorical_ids": np.array([[1] * 167], dtype=np.int64),
    "text_word_ids": np.array([[[101] + [102] + [0] * 254] * 2], dtype=np.int64),
    "textualmetadata_word_ids": np.array([[[101] + [102] + [0] * 254]], dtype=np.int64),
}

output = session.run(output_names=None, input_feed=input_feed)
@TomWildenhain-Microsoft
Contributor

This was my mistake: the RaggedTensorToTensor implementation didn't account for uniform (dense) dimensions. #1676 has a fix. I tested your model, and it converts and produces matching results on your data.
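(For readers unfamiliar with the term: a ragged tensor can mix ragged dimensions with uniform inner dimensions, e.g. shape `[batch, (ragged), 3]`, where each row holds a variable number of fixed-size-3 vectors. A minimal pure-Python sketch, not tf2onnx's actual code, of the ragged-to-dense padding that must account for such a trailing uniform dimension:)

```python
def ragged_to_dense(rows, uniform_size, default=0):
    """Pad a ragged list of fixed-length vectors to a dense rectangular shape.

    rows: list of lists of length-`uniform_size` vectors (the uniform dim).
    """
    max_len = max(len(r) for r in rows)
    pad = [default] * uniform_size          # one all-default vector
    return [r + [pad] * (max_len - len(r)) for r in rows]

# Ragged middle dimension (1 vs 2 vectors), uniform last dimension of size 3.
rows = [[[1, 2, 3]], [[4, 5, 6], [7, 8, 9]]]
dense = ragged_to_dense(rows, uniform_size=3)
# dense has shape (2, 2, 3): the short row is padded with a zero vector.
```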

@LoicDagnas
Contributor Author

@TomWildenhain-Microsoft thank you, it does work 😄
