
Fix type parse error about AdaptiveMaxPool #15016

Merged (2 commits) Jun 4, 2023

Conversation

jikechao (Contributor) commented Jun 3, 2023

This PR is similar to pr-14837.

This PR fixes a type-parse bug triggered when the output_size attribute of AdaptiveMaxPool2d or AdaptiveMaxPool3d contains None, which in PyTorch means that dimension keeps the same size as the input. Relay could not parse such an op and failed with the error message:
"Check failed: (!checked_type.defined()) is false: Expected Array[PrimExpr], but got Array[index 1: relay.Constant]"
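Conceptually, the fix is to resolve each None in output_size to the corresponding input dimension before the Relay op is built, so every entry is a concrete integer rather than a relay.Constant. A minimal sketch of that normalization in plain Python (the helper name is hypothetical, not the actual frontend code):

```python
def normalize_output_size(output_size, input_shape):
    """Replace None entries with the matching trailing input dims.

    output_size: per-spatial-dimension targets, possibly containing None.
    input_shape: full input shape, e.g. (N, C, D, H, W).
    """
    # The spatial dims are the last len(output_size) entries of the input shape.
    spatial = input_shape[-len(output_size):]
    # None means "keep the input's size along that dimension".
    return [s if o is None else o for o, s in zip(output_size, spatial)]

# (3, None, None) on an input of shape (1, 1, 3, 5, 6) -> [3, 5, 6]
print(normalize_output_size((3, None, None), (1, 1, 3, 5, 6)))
```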

(Screenshot: the op's description from the PyTorch documentation.)

Bug-triggering script

import torch
from tvm import relay

m = torch.nn.AdaptiveMaxPool3d((3, None, None))
input_data = [torch.randn([1, 1, 3, 5, 6], dtype=torch.float32)]
trace = torch.jit.trace(m, input_data)
input_shapes = [('input0', torch.Size([1, 1, 3, 5, 6]))]

mod, params = relay.frontend.from_pytorch(trace, input_shapes)
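For context, PyTorch treats a None entry in output_size as "keep the input's size along that dimension". That shape rule can be illustrated with a small pure-Python sketch, independent of TVM and PyTorch (the function name is illustrative only):

```python
def adaptive_pool_out_shape(output_size, input_spatial):
    # A None entry means the output keeps the input's size for that dimension;
    # an integer entry pins the output to exactly that size.
    return tuple(
        inp if out is None else out
        for out, inp in zip(output_size, input_spatial)
    )

# Spatial dims of the script above are (3, 5, 6), so (3, None, None)
# yields an output identical in shape to the input.
print(adaptive_pool_out_shape((3, None, None), (3, 5, 6)))  # (3, 5, 6)
```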

Traceback

Traceback (most recent call last):
  File "test.py", line 13, in <module>
    mod, params = relay.frontend.from_pytorch(trace, input_shapes)
  File "/workplace/software/tvm/tvm/python/tvm/relay/frontend/pytorch.py", line 4970, in from_pytorch
    outputs = converter.convert_operators(operator_nodes, outputs, ret_name)
  File "/workplace/software/tvm/tvm/python/tvm/relay/frontend/pytorch.py", line 4240, in convert_operators
    _get_input_types(op_node, outputs, default_dtype=self.default_dtype),
  File "/workplace/software/tvm/tvm/python/tvm/relay/frontend/pytorch.py", line 1127, in adaptive_max_pool
    return op(data, output_size=output_size), None
  File "/workplace/software/tvm/tvm/python/tvm/relay/op/nn/nn.py", line 3435, in adaptive_max_pool3d
    return _make.adaptive_max_pool3d(data, output_size, layout, out_layout)
  File "/workplace/software/tvm/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 238, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  4: TVMFuncCall
  3: _ZN3tvm7runtime13Pac
  2: tvm::runtime::TypedPackedFunc<tvm::RelayExpr (tvm::RelayExpr, tvm::runtime::Array<tvm::PrimExpr, void>, tvm::runtime::String, tvm::runtime::String)>::AssignTypedLambda<tvm::RelayExpr (*)(tvm::RelayExpr, tvm::runtime::Array<tvm::PrimExpr, void>, tvm::runtime::String, tvm::runtime::String)>(tvm::RelayExpr (*)(tvm::RelayExpr, tvm::runtime::Array<tvm::PrimExpr, void>, tvm::runtime::String, tvm::runtime::String), std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}::operator()(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*) const
  1: tvm::runtime::TVMMovableArgValueWithContext_::operator tvm::runtime::Array<tvm::PrimExpr, void><tvm::runtime::Array<tvm::PrimExpr, void> >() const
  0: _ZN3tvm7runtime6detail
  6: TVMFuncCall
  5: _ZN3tvm7runtime13Pac
  4: tvm::runtime::TypedPackedFunc<tvm::RelayExpr (tvm::RelayExpr, tvm::runtime::Array<tvm::PrimExpr, void>, tvm::runtime::String, tvm::runtime::String)>::AssignTypedLambda<tvm::RelayExpr (*)(tvm::RelayExpr, tvm::runtime::Array<tvm::PrimExpr, void>, tvm::runtime::String, tvm::runtime::String)>(tvm::RelayExpr (*)(tvm::RelayExpr, tvm::runtime::Array<tvm::PrimExpr, void>, tvm::runtime::String, tvm::runtime::String), std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}::operator()(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*) const
  3: tvm::runtime::TVMMovableArgValueWithContext_::operator tvm::runtime::Array<tvm::PrimExpr, void><tvm::runtime::Array<tvm::PrimExpr, void> >() const
  2: tvm::runtime::TVMMovableArgValue_::operator tvm::runtime::Array<tvm::PrimExpr, void><tvm::runtime::Array<tvm::PrimExpr, void>, void>() const
  1: tvm::runtime::Array<tvm::PrimExpr, void> tvm::runtime::TVMPODValue_::AsObjectRef<tvm::runtime::Array<tvm::PrimExpr, void> >() const
  0: _ZN3tvm7runtime6detail
  File "/workplace/software/tvm/tvm/include/tvm/runtime/packed_func.h", line 777
TVMError: In function relay.op.nn._make.adaptive_max_pool3d(0: RelayExpr, 1: Array<PrimExpr>, 2: runtime.String, 3: runtime.String) -> RelayExpr: error while converting argument 1: [07:33:53] /workplace/software/tvm/tvm/include/tvm/runtime/packed_func.h:1866: 
---------------------------------------------------------------
An error occurred during the execution of TVM.
For more information, please see: https://tvm.apache.org/docs/errors.html
---------------------------------------------------------------
  Check failed: (!checked_type.defined()) is false: Expected Array[PrimExpr], but got Array[index 1: relay.Constant]

cc @Hzfengsy @echuraev

Fix the bug when the output_size=(3, None).
Crash message: Check failed: (!checked_type.defined()) is false: Expected Array[PrimExpr], but got Array[index 1: relay.Constant]
tvm-bot (Collaborator) commented Jun 3, 2023

Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.

  • No users to auto-tag found; no teams are specified in the PR title. See #10317 for details.

Generated by tvm-bot

@masahi masahi merged commit 80079b6 into apache:main Jun 4, 2023
junrushao pushed a commit to junrushao/tvm that referenced this pull request Jun 22, 2023
* fix type parse error about max_pool

Fix the bug when the output_size=(3, None).
Crash message: Check failed: (!checked_type.defined()) is false: Expected Array[PrimExpr], but got Array[index 1: relay.Constant]

* add new test case to capture bug in adaptive_max_pool