
[XPU] add some bf16 ops for kl3 #59505

Merged on Dec 1, 2023 (1 commit)

Conversation

@houj04 (Contributor) commented Nov 29, 2023

PR types

New features

PR changes

OPs

Description

While training a certain model, we ran into a batch of operators that need bfloat16 support. This PR adds the corresponding kernel registrations under kl3.
Along the way, a few operators also needed bool or double registrations; those are added as well.
Verified locally: the build also passes with -DWITH_XPU_PLUGIN=ON.
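For readers unfamiliar with how these registrations are organized, here is a minimal, self-contained C++ sketch of the "op name → supported dtypes" pattern that the XPU op lists follow. The type aliases, op names, and function name below are simplified stand-ins for illustration, not the literal contents of this PR's diff.

```cpp
#include <map>
#include <set>
#include <string>

// Stand-ins for the real types used by Paddle's XPU op lists; only the shape
// of the data matters for this sketch.
enum class DataType { FLOAT32, FLOAT16, BFLOAT16, BOOL, FLOAT64 };
using XPUKernelSet = std::set<DataType>;
using XPUOpMap = std::map<std::string, XPUKernelSet>;

// Hypothetical kl3 op list: adding bf16 support for an operator amounts to
// appending BFLOAT16 to the set of dtypes its kernel is registered for.
XPUOpMap MakeKL3OpList() {
  return {
      {"elementwise_add",
       {DataType::FLOAT32, DataType::FLOAT16, DataType::BFLOAT16}},
      // Some operators additionally needed bool or double registrations.
      {"equal",
       {DataType::FLOAT32, DataType::BFLOAT16, DataType::BOOL}},
      {"scale",
       {DataType::FLOAT32, DataType::BFLOAT16, DataType::FLOAT64}},
  };
}
```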

TODO:

  • Some operators are still not bound; a few of them are missing XDNN implementations that need to be added first.
  • Not all unit tests pass yet; fixes are needed.
  • Even among the passing unit tests, not all run as intended. For example, in the bf16 case the data is fed in as uint16 and the input may actually be all zeros; some operator unit tests never truly exercise every supported data type; and so on. See the sketch after this list.
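To make the uint16 point concrete: when no native bf16 dtype is available on the test side, bf16 data is commonly shuttled around as its raw 16-bit pattern, i.e. the upper half of the float32 bits. A minimal sketch of that encoding (truncation, no rounding; the helper name is hypothetical and not from this PR):

```cpp
#include <cstdint>
#include <cstring>
#include <iostream>

// Hypothetical helper: encode a float as the uint16 bit pattern of bfloat16
// by keeping the upper 16 bits of its IEEE-754 float32 representation.
uint16_t FloatToBf16Bits(float x) {
  uint32_t bits;
  std::memcpy(&bits, &x, sizeof(bits));
  return static_cast<uint16_t>(bits >> 16);
}

int main() {
  std::cout << FloatToBf16Bits(1.0f) << "\n";    // 16256 (0x3F80)
  std::cout << FloatToBf16Bits(1e-45f) << "\n";  // 0: a tiny subnormal truncates away
  return 0;
}
```

Whether the real conversion truncates or rounds is a detail of the framework's test helpers; the point here is only that the bits travel as uint16, so a test can look like it is feeding meaningful data while the decoded bf16 values are effectively zero.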

@ZibinGuo (Contributor) left a comment

LGTM

@QingshuChen merged commit b10c4ed into PaddlePaddle:develop on Dec 1, 2023
30 checks passed
@houj04 added the XPU label on Sep 4, 2024
3 participants