[Semi-Auto] Add cross_entropy_with_softmax infer_backward rule #56507
Conversation
Your PR was submitted successfully. Thank you for your contribution to the open-source project!
Sorry to inform you that 516c727's CIs have passed for more than 7 days. To prevent PR conflicts, you need to re-run all CIs manually.
516c727 to 3bf6fc8 (compare)
std::string alphabet =
    "abcdefghijlmnopqrstuvwxyz";  // 'k' is reserved for the softmax_normalize axis
std::string x_axes = GetBroadcastAxes(x_ndim, x_ndim, alphabet);
x_axes[axis] = 'k';
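For readers new to the einsum-like notation these SPMD rules use: each tensor dimension is labeled with one letter, and 'k' is deliberately left out of the alphabet so it can mark the softmax_normalize axis. A minimal, self-contained sketch of the idea, where `GetBroadcastAxesSketch` is a hypothetical stand-in for Paddle's `GetBroadcastAxes`:

```cpp
#include <iostream>
#include <string>

// Hypothetical stand-in for GetBroadcastAxes: label each of the trailing
// `ndim` dimensions of a tensor with one letter from `alphabet`.
std::string GetBroadcastAxesSketch(int out_ndim, int ndim,
                                   const std::string& alphabet) {
  return alphabet.substr(out_ndim - ndim, ndim);
}

int main() {
  // 'k' is skipped in the alphabet; it is reserved for the normalized axis.
  std::string alphabet = "abcdefghijlmnopqrstuvwxyz";
  int x_ndim = 3;
  int axis = 2;  // softmax_normalize axis (the last one here)
  std::string x_axes = GetBroadcastAxesSketch(x_ndim, x_ndim, alphabet);
  x_axes[axis] = 'k';
  std::cout << x_axes << '\n';  // prints "abk"
}
```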
Should check whether softmax_normalize is the last axis; if it is not, that axis cannot be sharded. And even when softmax_normalize is the last axis, it cannot be sharded if soft_label == true.
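A hedged sketch of that suggested check (illustrative names, not Paddle's actual code): sharding the normalized axis is only legal when it is the last axis and soft_label is false.

```cpp
#include <cassert>

// Illustrative helper, not Paddle's API: the softmax_normalize axis may
// only be sharded when it is the last axis and soft_label == false.
bool CanShardNormalizedAxis(int axis, int x_ndim, bool soft_label) {
  const bool is_last_axis = (axis == x_ndim - 1 || axis == -1);
  return is_last_axis && !soft_label;
}

int main() {
  assert(CanShardNormalizedAxis(2, 3, /*soft_label=*/false));   // last axis, hard label
  assert(!CanShardNormalizedAxis(1, 3, /*soft_label=*/false));  // not the last axis
  assert(!CanShardNormalizedAxis(2, 3, /*soft_label=*/true));   // soft label
}
```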
)  # loss

# GPT MP case, shard normalized axis
# [-1, -1, 0], [-1, -1, -1] (outputs) -->
Distinguish the two pairs:
outputs: loss (last dim has size 1) & softmax_out
inputs: label (last dim may have size 1) & logits
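To make that distinction concrete with hypothetical shapes (B, S, C are illustrative GPT-like sizes): for hard labels, loss and label carry a trailing dim of size 1, while softmax_out and logits carry the full class dim, so one axis string cannot simply be copied across all four tensors.

```cpp
#include <cstdint>
#include <iostream>
#include <vector>

int main() {
  const int64_t B = 8, S = 1024, C = 50304;  // hypothetical batch/seq/class sizes
  // Outputs:
  std::vector<int64_t> loss_shape        = {B, S, 1};  // last dim is 1
  std::vector<int64_t> softmax_out_shape = {B, S, C};  // full class dim
  // Inputs:
  std::vector<int64_t> label_shape  = {B, S, 1};  // size 1 for hard labels,
                                                  // C when soft_label == true
  std::vector<int64_t> logits_shape = {B, S, C};
  // Only softmax_out and logits expose a shardable class axis.
  std::cout << loss_shape.back() << ' ' << softmax_out_shape.back() << ' '
            << label_shape.back() << ' ' << logits_shape.back() << '\n';
}
```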
Will update in the next PR.
PR types
Function optimization
PR changes
Others
Description
Pcard-70448
Add an infer_backward rule for cross_entropy_with_softmax that infers the inputs' dims mappings from the outputs'.
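Conceptually, infer_backward reverses the usual forward inference: the outputs' dims mappings are taken as given and consistent mappings are derived for the inputs. The sketch below is hypothetical (names and signature are not Paddle's real SpmdRule interface) and uses the GPT MP case from the test, where -1 means replicated and 0 means sharded on mesh dim 0:

```cpp
#include <cstdint>
#include <iostream>
#include <vector>

// Hypothetical sketch, not Paddle's real API: derive the inputs' dims
// mappings from the outputs' for cross_entropy_with_softmax.
struct BackwardMappings {
  std::vector<int64_t> logits;
  std::vector<int64_t> label;
};

BackwardMappings InferBackwardSketch(const std::vector<int64_t>& softmax_out,
                                     const std::vector<int64_t>& loss) {
  BackwardMappings in;
  // logits is axis-for-axis aligned with softmax_out, so copy its mapping.
  in.logits = softmax_out;
  // label follows loss on the batch axes; its last dim may have size 1
  // (hard labels), so keep it replicated.
  in.label = loss;
  in.label.back() = -1;
  return in;
}

int main() {
  // GPT MP case from the test: softmax_out = [-1, -1, 0], loss = [-1, -1, -1].
  auto in = InferBackwardSketch({-1, -1, 0}, {-1, -1, -1});
  for (int64_t d : in.logits) std::cout << d << ' ';  // -1 -1 0
  std::cout << '\n';
  for (int64_t d : in.label) std::cout << d << ' ';   // -1 -1 -1
  std::cout << '\n';
}
```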