[Custom Extension] Fix custom double_grad backward=None #49224
Conversation
Your PR was submitted successfully. Thank you for your contribution to the open source project!
One comment:
@@ -153,7 +153,8 @@ def custom_relu_double_grad_dynamic(func, device, dtype, np_x, use_func=True):
    dx = paddle.grad(
        outputs=[out], inputs=[t], create_graph=True, retain_graph=True
    )

    if in_dygraph_mode():
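As context for why `create_graph=True` matters here, a standalone numpy sketch (not Paddle code) of the relationship the double-grad test needs to exercise: relu's first-order gradient passes the upstream gradient through a 0/1 mask, and the second-order pass propagates its own upstream gradient through that same mask, which is what the second call to `paddle.grad` on `dx` actually reaches.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def relu_grad(x, grad_out):
    # first-order gradient: pass grad_out only where x > 0
    return grad_out * (x > 0)

x = np.array([-1.0, 0.5, 2.0])
grad_out = np.ones_like(x)
dx = relu_grad(x, grad_out)    # first-order grad: [0., 1., 1.]

# second-order pass: the grad of dx w.r.t. grad_out is the same mask,
# so the upstream grad ddx flows through (x > 0) again
ddx = np.ones_like(x)
ddout = relu_grad(x, ddx)      # [0., 1., 1.]
```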
Maybe fix this test, since it doesn't test double grad in the right way.
Already fixed this, please review it again, thanks~
LGTM
…#49224)
* fix custom double_grad backward=None
* fix custom_relu.cu bug && polish testcase of double_grad
* remove old dynamic graph test

* [Release2.4] Revert python link prs (#48573)
  * Revert "Fix mac link python (#48017)". This reverts commit 3fa7a73.
  * Revert "[Cherry-pick] Fix python link error (#47811)". This reverts commit ff642c6.
  * Update config.go
* fix custom operator backward=None (#48656)
* [Custom Extension] Fix custom double_grad backward=None (#49224)
  * fix custom double_grad backward=None
  * fix custom_relu.cu bug && polish testcase of double_grad
  * remove old dynamic graph test
  * add import fluid
* add import fluid

Co-authored-by: Chen Weihang <chenweihang@baidu.com>
PR types
Bug fixes
PR changes
Others
Describe
The pre-PR fixed the custom op bug of first-order backward: [Custom operator] Fix custom operator backward=None bug #48656
This PR fixes the custom op bug of double backward (double_grad), following the same idea as the pre-PR.
Other change:
- Polished the `double_grad_dynamic` test case; the original test could not reach the second-order backward op under the new dynamic graph (eager) mode.
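To illustrate the convention this fix targets, here is a minimal, hypothetical numpy sketch (the names `custom_relu_double_grad` and `fill_none_with_zeros` are illustrative, not Paddle APIs): a custom double_grad kernel may return `None` for a gradient that is identically zero (relu'' == 0), and the framework side must substitute a zero tensor rather than dereference `None`.

```python
import numpy as np

def custom_relu_double_grad(x, ddx):
    # second-order kernel of a custom relu:
    # grad flowing to the output grad passes through the (x > 0) mask,
    # while the grad w.r.t. x itself is identically zero, returned as None
    ddout = ddx * (x > 0)
    grad_x = None
    return ddout, grad_x

def fill_none_with_zeros(grads, like):
    # framework-side handling: replace None gradients with zero tensors
    return [g if g is not None else np.zeros_like(like) for g in grads]

x = np.array([-2.0, 1.0, 3.0])
ddx = np.ones_like(x)
ddout, grad_x = custom_relu_double_grad(x, ddx)
grads = fill_none_with_zeros([ddout, grad_x], x)
```

The design choice mirrored here is that returning `None` is cheaper than materializing an all-zero tensor inside every custom kernel; the cost is that every caller in the framework must handle the `None` case, which is exactly what the `backward=None` bug was about.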