
[AMP OP&Test] where support bf16/fp16 #51137

Merged: 13 commits into PaddlePaddle:develop on Mar 9, 2023
Conversation

yangjianfengo1 (Contributor) commented:

PR types

Others

PR changes

OPs

Describe

Added bfloat16 and float16 unit tests.

@unittest.skipIf(
    not core.is_compiled_with_cuda()
    or not core.is_bfloat16_supported(core.CUDAPlace(0)),
    "core is not compiled with CUDA and does not support bfloat16",
)
class TestWhereOpBFloat16(OpTest):
A reviewer (Contributor) commented on this class:
Please rename some of the unit-test classes; the required format is given in the naming convention.
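
For context, a minimal sketch of how such a bfloat16 OpTest is commonly structured in Paddle. The relative op_test import, the use of paddle.where as python_api, and the data shapes (taken from the diff below) are illustrative assumptions, not necessarily this PR's exact code:

import unittest

import numpy as np
import paddle
from op_test import OpTest, convert_float_to_uint16  # helpers inside Paddle's unittest directory
from paddle.fluid import core


@unittest.skipIf(
    not core.is_compiled_with_cuda()
    or not core.is_bfloat16_supported(core.CUDAPlace(0)),
    "core is not compiled with CUDA and does not support bfloat16",
)
class TestWhereOpBFloat16(OpTest):
    def setUp(self):
        self.op_type = 'where'
        self.python_api = paddle.where
        self.dtype = np.uint16  # OpTest represents bf16 tensors as uint16
        self.init_config()
        # Generate data in fp32, then pack it into bf16 (uint16) for the op
        self.inputs = {
            'Condition': self.cond,
            'X': convert_float_to_uint16(self.x),
            'Y': convert_float_to_uint16(self.y),
        }
        self.outputs = {
            'Out': convert_float_to_uint16(np.where(self.cond, self.x, self.y))
        }

    def init_config(self):
        self.x = np.random.uniform(-3, 5, (60, 2)).astype(np.float32)
        self.y = np.random.uniform(-3, 5, (60, 2)).astype(np.float32)
        self.cond = np.ones((60, 2)).astype('bool')

    def test_check_output(self):
        # bf16 kernels only run on CUDA places that support bfloat16
        self.check_output_with_place(core.CUDAPlace(0))

    def test_check_grad(self):
        self.check_grad_with_place(core.CUDAPlace(0), ['X', 'Y'], 'Out')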

@@ -50,6 +50,50 @@ def init_config(self):
self.cond = np.ones((60, 2)).astype('bool')


class TestWhereOpFloat16(TestWhereOp):
A reviewer (Contributor) commented on this class:
Please rename some of the unit-test classes; the required format is given in the naming convention.
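
By contrast, the float16 case can usually just subclass the existing fp32 test and override init_config with fp16 data; a minimal sketch, assuming TestWhereOp's setUp reads self.x, self.y, and self.cond from init_config (shapes taken from the diff above):

class TestWhereOpFloat16(TestWhereOp):
    def init_config(self):
        # Same config as the base fp32 test, but with float16 inputs
        self.x = np.random.uniform(-3, 5, (60, 2)).astype(np.float16)
        self.y = np.random.uniform(-3, 5, (60, 2)).astype(np.float16)
        self.cond = np.ones((60, 2)).astype('bool')

Unlike the bfloat16 variant, no skipIf guard tied to is_bfloat16_supported is needed here; the bf16 test requires it because bfloat16 needs both a CUDA build and hardware support.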

@yangjianfengo1 changed the title from "AMP where_op & test" to "[AMP OP&Where]" on Mar 8, 2023
@ZzSean changed the title from "[AMP OP&Where]" to "[AMP OP&Test] where support bf16/fp16" on Mar 9, 2023

@ZzSean (Contributor) left a comment:
LGTM

@qili93 (Contributor) left a comment:
LGTM for @unittest.skip

@qili93 merged commit 2727ddd into PaddlePaddle:develop on Mar 9, 2023