
Add master_grad and excluded_layers parameters to amp.decorate #5792

Merged
5 commits merged into PaddlePaddle:develop
Apr 18, 2023

Conversation

zhangting2020 (Contributor) commented Apr 10, 2023

Add the master_grad parameter to amp.decorate.
(screenshot of the updated documentation attached)


paddle-bot bot commented Apr 10, 2023

Thanks for contributing to the PaddlePaddle docs! The documentation preview is being built; it will be available once the Docs-New job finishes. Preview link: http://preview-pr-5792.paddle-docs-preview.paddlepaddle.org.cn/documentation/docs/zh/api/index_cn.html
For more details on the preview tool, see the PaddlePaddle docs preview tool guide.

paddle-bot bot commented Apr 10, 2023

❌ The PR's message can't be empty.

@@ -20,6 +20,7 @@ decorate
- **dtype** (str, optional) - The data type used for mixed-precision training, either float16 or bfloat16. Defaults to float16.
- **master_weight** (bool|None, optional) - Whether to use the master weight strategy. Optimizers that support it include ``adam``, ``adamW``, and ``momentum``. Defaults to None, in which case master weight is used in ``O2`` mode.
- **save_dtype** (str|None, optional) - The storage dtype of the network, one of float16, bfloat16, float32, or float64. ``save_dtype`` specifies the dtype of the network parameters stored via ``paddle.save`` and ``paddle.jit.save``. Defaults to None, meaning parameters are stored with their existing dtype.
- **master_grad** (bool, optional) - Whether to use float32 weight gradients in ``O2`` mode for computations such as gradient clipping, weight decay, and weight updates. If enabled, the weight gradients will be float32 after backpropagation. Default: False.

Collaborator review comment: please also explain what the default value means.
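For readers of this doc change, here is a minimal sketch of how the new parameter could be used. It assumes ``paddle.amp.decorate`` accepts a ``master_grad`` keyword as documented in this PR; the toy model, optimizer, and training step are purely illustrative, not part of the PR itself.

```python
import paddle

# Toy model and optimizer for illustration only.
model = paddle.nn.Linear(10, 10)
optimizer = paddle.optimizer.AdamW(learning_rate=1e-3, parameters=model.parameters())

# Decorate for O2 (pure float16) training. master_grad=True (the keyword
# documented in this PR) keeps float32 weight gradients for gradient
# clipping, weight decay, and parameter updates.
model, optimizer = paddle.amp.decorate(
    models=model,
    optimizers=optimizer,
    level='O2',
    dtype='float16',
    master_grad=True,
)

scaler = paddle.amp.GradScaler(init_loss_scaling=1024)

data = paddle.rand([4, 10])
with paddle.amp.auto_cast(level='O2', dtype='float16'):
    loss = model(data).mean()

# Scale the loss, run backward, then step the optimizer through the scaler.
scaler.scale(loss).backward()
scaler.step(optimizer)
scaler.update()
optimizer.clear_grad()
```

Per the added description, with ``master_grad=True`` the weight gradients left on the parameters after ``backward()`` would be float32, even though the forward and backward passes run in float16.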

zhangting2020 changed the title from "Add the master_grad parameter to amp.decorate" to "Add master_grad and excluded_layers parameters to amp.decorate" on Apr 18, 2023
sunzhongkai588 (Collaborator) left a comment


LGTM

zhangting2020 merged commit dd744e2 into PaddlePaddle:develop on Apr 18, 2023