Modified python/paddle/amp/auto_cast.py to fix bug #55364
Conversation
Your PR has been submitted successfully. Thank you for your contribution to the open-source project!
Sorry to inform you that ea34f38's CIs have passed for more than 7 days. To prevent PR conflicts, you need to re-run all CIs manually.
python/paddle/amp/auto_cast.py
Outdated
@@ -402,6 +404,7 @@ def amp_guard(
     if not enable:
         amp_level = AMP_LEVEL.O0
         amp_dtype = "float32"
+        # amp_global_state().amp_dtype = amp_dtype
Remember to delete this commented-out line.
Deleted.
python/paddle/amp/auto_cast.py
Outdated
@@ -477,6 +480,7 @@ def _set_multi_precision(optimizer, multi_precision):
     )

     optimizer = (
+        # optimizer._inner_optimizer
Delete this commented-out line.
Deleted.
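For context, the hunk above suggests that `_set_multi_precision` now unwraps a wrapped optimizer before toggling its precision flag. A minimal sketch of that idea, assuming an `_inner_optimizer` attribute on wrapper-style optimizers (the attribute name is taken from the commented line in the diff, not confirmed from the merged code):

```python
def _set_multi_precision(optimizer, multi_precision):
    # Sketch only: if the optimizer is a wrapper that exposes the real
    # optimizer via `_inner_optimizer`, unwrap it first so the
    # multi_precision flag lands on the optimizer that actually owns
    # the parameters.
    inner = getattr(optimizer, "_inner_optimizer", None)
    if inner is not None:
        optimizer = inner
    if hasattr(optimizer, "_multi_precision"):
        optimizer._multi_precision = multi_precision
```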
PR types
Bug fixes
PR changes
APIs
Description
card-70458
Fixes the issue where pr10119 caused multiple fp16 models to fail to run, by modifying the amp_decorate() and amp_guard() functions in python/paddle/amp/auto_cast.py.
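The affected entry points are reached through Paddle's public AMP APIs: `paddle.amp.auto_cast` wraps `amp_guard`, and `paddle.amp.decorate` wraps `amp_decorate`. A minimal usage sketch of how an fp16 model typically exercises these code paths (illustrative only, not taken from this PR; assumes a GPU with float16 support):

```python
import paddle

model = paddle.nn.Linear(4, 4)
optimizer = paddle.optimizer.AdamW(parameters=model.parameters())

# decorate() casts the model/optimizer state for O2; auto_cast() enables
# the autocast region handled by amp_guard.
model, optimizer = paddle.amp.decorate(
    models=model, optimizers=optimizer, level='O2'
)
scaler = paddle.amp.GradScaler(init_loss_scaling=1024)

data = paddle.rand([2, 4])
with paddle.amp.auto_cast(level='O2'):
    loss = model(data).mean()

scaled = scaler.scale(loss)   # scale the loss to avoid fp16 underflow
scaled.backward()
scaler.step(optimizer)        # unscale gradients and run the optimizer step
scaler.update()
optimizer.clear_grad()
```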