I've encountered a problem. I successfully converted my model from ONNX to an ONNX GraphSurgeon graph. However, when I run inference on it, I get this error:
```
sess = C.InferenceSession(session_options, self._model_path, True, self._read_config_from_model)
onnxruntime.capi.onnxruntime_pybind11_state.Fail: [ONNXRuntimeError] : 1 : FAIL : Load model from tts_surgeon.onnx failed:/onnxruntime_src/onnxruntime/core/graph/model.cc:146 onnxruntime::Model::Model(onnx::ModelProto&&, const PathString&, const IOnnxRuntimeOpSchemaRegistryList*, const onnxruntime::logging::Logger&, const onnxruntime::ModelOptions&) Unsupported model IR version: 9, max supported IR version: 8
```
My version:
onnx: 1.14.1
onnxruntime: 1.14.1
onnx-graphsurgeon: 0.5.2
From the [onnxruntime compatibility table](https://onnxruntime.ai/docs/reference/compatibility.html), I know that my onnxruntime version supports up to IR version 8, but I don't know how to check the IR version of my model.
Thanks.
phamkhactu changed the title from "[Question] about how to get IR version" to "[Question] about how to get IR version (onnx surgeon)" on May 6, 2024.
Upgrading to onnxruntime==1.16.0, which supports IR version 9, fixed this issue. However, inference time is the same for the pytorch --> onnx model and the onnx --> surgeon onnx model. What do you think about that?
Thanks for the excellent repo.