
[Keras][Bugfix] Fix a bug with the alpha attribute in LeakyReLU which leads to a pass conflict #14707

Merged — 10 commits merged into apache:main on Apr 28, 2023

Conversation

@jikechao (Contributor) commented on Apr 23, 2023:

The alpha attribute in LeakyReLU lacks exception checking. If alpha=nan, the Keras frontend still converts the model to Relay IR successfully, but the optimization stage then crashes unexpectedly and throws TVMError: Observed 100 rewrite passes, possible conflicting passes?
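A minimal repro sketch of this setup (the layer shapes, input names, and layout below are illustrative assumptions, not the reporter's exact script):

    import tvm
    from tvm import relay
    from tensorflow import keras

    # A tiny model whose only op is the problematic LeakyReLU with alpha=nan.
    inp = keras.layers.Input(shape=(32, 32, 3))
    out = keras.layers.LeakyReLU(alpha=float("nan"))(inp)
    model = keras.models.Model(inp, out)

    # The Keras frontend defaults to NCHW layout, so the shape dict is given
    # in NCHW order here (assumption; adjust to your model).
    shape_dict = {model.input_names[0]: (1, 3, 32, 32)}
    mod, params = relay.frontend.from_keras(model, shape_dict)

    # Conversion succeeds, but the rewrite passes loop during build and raise:
    # TVMError: Observed 100 rewrite passes, possible conflicting passes?
    relay.build_module.create_executor(
        "graph", mod, tvm.cpu(0), "llvm", params
    ).evaluate()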

The Relay IR is shown in a screenshot in the original PR description (not reproduced here).

This patch fixes the bug!
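For context, a hedged sketch of the kind of guard such a fix adds in the Keras frontend converter; the helper name and exact exception below are illustrative, not the merged diff:

    import math
    import tvm

    # Illustrative only: validate alpha before emitting relay.nn.leaky_relu, so an
    # invalid value fails at conversion time instead of sending the rewrite passes
    # into an endless loop later. The function name is hypothetical.
    def _check_leaky_relu_alpha(keras_layer):
        alpha = getattr(keras_layer, "alpha", None)
        if alpha is None or math.isnan(float(alpha)):
            raise tvm.error.OpAttributeInvalid(
                "LeakyReLU requires a valid (non-NaN) alpha value, got: %s" % alpha
            )
        return float(alpha)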

The stack trace:

    model = relay.build_module.create_executor("graph", mod, tvm.cpu(0), 'llvm', params).evaluate()  # crash
  File "/workplace/software/tvm/tvm/python/tvm/relay/backend/interpreter.py", line 171, in evaluate
    return self._make_executor()
  File "/workplace/software/tvm/tvm/python/tvm/relay/build_module.py", line 519, in _make_executor
    mod = build(self.mod, target=self.target)
  File "/workplace/software/tvm/tvm/python/tvm/relay/build_module.py", line 372, in build
    mod_name=mod_name,
  File "/workplace/software/tvm/tvm/python/tvm/relay/build_module.py", line 169, in build
    mod_name,
  File "/workplace/software/tvm/tvm/python/tvm/_ffi/_ctypes/packed_func.py", line 237, in __call__
    raise get_last_ffi_error()
tvm._ffi.base.TVMError: Traceback (most recent call last):
  14: TVMFuncCall
  13: tvm::relay::backend::RelayBuildModule::GetFunction(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, tvm::runtime::ObjectPtr<tvm::runtime::Object> const&)::{lambda(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)#3}::operator()(tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*) const
  12: tvm::relay::backend::RelayBuildModule::Build(tvm::IRModule, tvm::runtime::Array<tvm::Target, void> const&, tvm::Target const&, tvm::relay::Executor const&, tvm::relay::Runtime const&, tvm::WorkspaceMemoryPools const&, tvm::ConstantMemoryPools const&, tvm::runtime::String)
  11: tvm::relay::backend::RelayBuildModule::BuildRelay(tvm::IRModule, tvm::runtime::String const&)
  10: tvm::relay::backend::RelayBuildModule::OptimizeImpl(tvm::IRModule)
  9: tvm::transform::Pass::operator()(tvm::IRModule) const
  8: tvm::transform::Pass::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
  7: tvm::transform::SequentialNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
  6: tvm::transform::Pass::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
  5: tvm::relay::transform::FunctionPassNode::operator()(tvm::IRModule, tvm::transform::PassContext const&) const
  4: tvm::runtime::PackedFuncObj::Extractor<tvm::runtime::PackedFuncSubObj<tvm::runtime::TypedPackedFunc<tvm::relay::Function (tvm::relay::Function, tvm::IRModule, tvm::transform::PassContext)>::AssignTypedLambda<tvm::relay::transform::SimplifyExpr()::$_0>(tvm::relay::transform::SimplifyExpr()::$_0)::{lambda(tvm::runtime::TVMArgs const&, tvm::runtime::TVMRetValue*)#1}> >::Call(tvm::runtime::PackedFuncObj const*, tvm::runtime::TVMArgs, tvm::runtime::TVMRetValue*)
  3: tvm::relay::SimplifyExpr(tvm::RelayExpr const&, tvm::IRModule const&)
  2: tvm::relay::RewritePatterns(tvm::runtime::Array<tvm::relay::DFPatternCallback, void>, tvm::RelayExpr, tvm::IRModule)
  1: tvm::relay::PatternRewriter::Rewrite(tvm::runtime::Array<tvm::relay::DFPatternCallback, void> const&, tvm::RelayExpr const&)
  0: _ZN3tvm7runtime6detail
  File "/workplace/software/tvm/tvm/src/relay/ir/dataflow_matcher.cc", line 829
TVMError: Observed 100 rewrite passes, possible conflicting passes?

@tvm-bot (Collaborator) commented on Apr 23, 2023:

Thanks for contributing to TVM! Please refer to the contributing guidelines https://tvm.apache.org/docs/contribute/ for useful information and tips. Please request code reviews from Reviewers by @-ing them in a comment.

  • No users to tag found in teams: keras, bugfix. See #10317 for details.

Generated by tvm-bot

@jikechao (Contributor, Author) commented:

@yongwww Would you be willing to review this PR? Many thanks!

@Hzfengsy (Member) commented:

Please add a regression test for it.

@AndrewZhaoLuo (Contributor) left a comment:

I think it looks fine. Thanks for fixing this.

Please add a comment to indicate to people why things are gated on version revision.
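For readers following along, a hedged sketch of what such a regression test could look like, assuming the fix rejects an invalid alpha at conversion time as sketched above; the test name and exception type are illustrative, and the merged test in tests/python/frontend/keras/test_forward.py appears to be additionally gated on a version check (hence the request above for an explanatory comment), which this sketch omits:

    import pytest
    import tvm
    from tvm import relay
    from tensorflow import keras

    def test_forward_leaky_relu_invalid_alpha():
        # alpha=nan should now be rejected during conversion rather than
        # producing a module that hangs the rewrite passes during build.
        inp = keras.layers.Input(shape=(8,))
        out = keras.layers.LeakyReLU(alpha=float("nan"))(inp)
        model = keras.models.Model(inp, out)
        with pytest.raises(tvm.error.OpAttributeInvalid):
            relay.frontend.from_keras(model, {model.input_names[0]: (1, 8)})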

Review thread on tests/python/frontend/keras/test_forward.py (outdated, resolved).
@echuraev (Contributor) left a comment:

LGTM. Thank you for this PR and fixing this issue.

@echuraev (Contributor) commented:

@tvm-bot rerun

(3 similar "@tvm-bot rerun" comments followed.)

@echuraev merged commit f9ae487 into apache:main on Apr 28, 2023.
@jikechao deleted the main branch on May 11, 2023 at 03:16.
@jikechao restored the main branch on May 22, 2023 at 04:45.
@jikechao deleted the main branch on May 22, 2023 at 04:45.