
[XPU] add squared_l2_norm op #59066

Merged: 3 commits into PaddlePaddle:develop · Nov 17, 2023

Conversation

@houj04 (Contributor) commented on Nov 16, 2023

PR types

New features

PR changes

OPs

Description

Add the squared_l2_norm operator and its backward counterpart squared_l2_norm_grad for XPU.

Note 1: In the current version of python/paddle/nn/clip.py, execution falls back to the "basic operator implementation" path when the device in use is XPU. Since this PR adds the squared_l2_norm operator, that dedicated if branch can now be removed.

Note 2: A bfloat16 implementation of the squared_l2_norm operator was only recently added to XDNN and is not yet available in a stable release, so this PR supports only the float and float16 types. bfloat16 support will be added in a follow-up PR.
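
For reference, here is a minimal sketch of roughly what the forward XPU kernel and its registration could look like in the phi kernel style. The include paths, the XPUTypeTrait helper, and in particular the XDNN entry point (written here as xpu::squared_l2_norm) are illustrative assumptions and may not match the code actually added by this PR.

// Sketch only: assumes an XDNN function named xpu::squared_l2_norm exists
// with this signature; the real PR may call a different XDNN primitive.
#include "paddle/phi/backends/xpu/enforce_xpu.h"
#include "paddle/phi/core/kernel_registry.h"

namespace phi {

template <typename T, typename Context>
void SquaredL2NormKernel(const Context& dev_ctx,
                         const DenseTensor& x,
                         DenseTensor* out) {
  using XPUType = typename XPUTypeTrait<T>::Type;
  dev_ctx.template Alloc<T>(out);
  // Hypothetical XDNN call: writes sum(x * x) over all elements into out.
  int r = xpu::squared_l2_norm<XPUType>(
      dev_ctx.x_context(),
      reinterpret_cast<const XPUType*>(x.data<T>()),
      reinterpret_cast<XPUType*>(out->data<T>()),
      x.numel());
  PADDLE_ENFORCE_XDNN_SUCCESS(r, "squared_l2_norm");
}

}  // namespace phi

// Only float and float16 are registered, matching Note 2; bfloat16 can be
// added once XDNN's bfloat16 implementation reaches a stable release.
PD_REGISTER_KERNEL(squared_l2_norm,
                   XPU,
                   ALL_LAYOUT,
                   phi::SquaredL2NormKernel,
                   float,
                   phi::dtype::float16) {}

The backward kernel SquaredL2NormGradKernel follows the same pattern: it takes x and the incoming scalar gradient dout and produces dx = 2 * dout * x.
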

namespace phi {

template <typename T, typename Context>

@houj04 (Contributor, Author) commented:

Not sure why an extra blank line showed up here; I will remove it in the next PR.

@runzhech (Contributor) left a comment:

lgtm

@houj04 houj04 merged commit 98907f9 into PaddlePaddle:develop Nov 17, 2023
28 checks passed
SecretXV pushed a commit to SecretXV/Paddle that referenced this pull request Nov 28, 2023
* [XPU] add squared_l2_norm op

* fix copyright

* add squared_l2_norm_grad.