
【Hackathon No57】add_fp16_bf16_for_dot & bf16_for_cross #52426

Merged — 6 commits merged on Apr 13, 2023

Conversation


@Difers Difers commented Mar 31, 2023

PR types

Others

PR changes

APIs

Describe

Register FP16 and BF16 for dot, and improve the unit tests
Register BF16 for cross, and improve the unit tests

@paddle-bot

paddle-bot bot commented Mar 31, 2023

Your PR has been submitted. Thanks for your contribution!
Please wait for the result of CI firstly. See Paddle CI Manual for details.

@paddle-bot paddle-bot bot added contributor External developers status: proposed labels Mar 31, 2023
@Difers Difers force-pushed the add_fp_bf_for_dot_cross branch 2 times, most recently from f37a82c to ec50629 Compare April 1, 2023 06:08
self.python_api = paddle.cross
self.initTestCase()
self.inputs = {
'X': np.random.random(self.shape).astype(self.dtype),
Contributor:

BF16-typed inputs and outputs need to call the convert_float_to_uint16 function for type conversion
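For context, Paddle's test utility convert_float_to_uint16 stores bf16 values as uint16 bit patterns. A minimal NumPy sketch of that round trip (the round-to-nearest-even rounding mode and the inverse helper's name are assumptions; the real helpers live in Paddle's op test utilities):

```python
import numpy as np

def convert_float_to_uint16(arr):
    # Sketch of Paddle's test helper (assumption: fp32 is rounded to bf16
    # with round-to-nearest-even and the top 16 bits are kept as uint16).
    u32 = np.ascontiguousarray(arr, dtype=np.float32).view(np.uint32)
    rounding_bias = ((u32 >> 16) & 1) + np.uint32(0x7FFF)
    return ((u32 + rounding_bias) >> 16).astype(np.uint16)

def convert_uint16_to_float(arr):
    # Inverse: put the bf16 bits back into the high half of a float32.
    return (arr.astype(np.uint32) << 16).view(np.float32)

x = np.array([1.0, 3.14159, 100.5], dtype=np.float32)
bf16_bits = convert_float_to_uint16(x)
x_roundtrip = convert_uint16_to_float(bf16_bits)
# bf16 keeps 8 mantissa bits, so the round trip agrees to ~2^-8 relative error.
```

This is why both the inputs and the expected outputs in the BF16 test cases must pass through the conversion: the op consumes and produces uint16-encoded bf16 tensors.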

for i in range(1024):
z_list.append(np.cross(x[i], y[i]))
self.outputs = {
'Out': np.array(z_list).astype(np.float32).reshape(self.shape)
Contributor:

BF16-typed inputs and outputs need to call the convert_float_to_uint16 function for type conversion

'X': convert_float_to_uint16(self.x),
'Y': convert_float_to_uint16(self.y),
}
self.outputs = {'Out': self.out}
Contributor:

The output also needs to call convert_float_to_uint16 for type conversion

if core.is_compiled_with_cuda():
place = core.CUDAPlace(0)
if core.is_bfloat16_supported(place):
self.check_grad_with_place(place, ['X', 'Y'], 'Out')
Contributor:

Judging by the mathematical expectation of the rounding error, the forward atol here may need to be set to 0.5
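The suggested atol can be sanity-checked numerically: bf16 keeps only 8 mantissa bits (relative rounding error up to about 2^-9 per input), so a dot product over 121 values drawn from [0.1, 1) sums to roughly 36 and can accumulate an absolute error of a few tenths. A hedged sketch, assuming a round-to-nearest-even bf16 rounding helper and the [121] shape used in this test:

```python
import numpy as np

def round_to_bf16(a):
    # fp32 -> bf16 -> fp32 round trip (round-to-nearest-even assumption).
    u = np.ascontiguousarray(a, dtype=np.float32).view(np.uint32)
    bias = ((u >> 16) & 1) + np.uint32(0x7FFF)
    return (((u + bias) >> 16) << 16).view(np.float32)

rng = np.random.default_rng(0)
x = rng.uniform(0.1, 1.0, 121).astype(np.float32)
y = rng.uniform(0.1, 1.0, 121).astype(np.float32)

exact = float(np.dot(x, y))                                 # fp32 reference
approx = float(np.dot(round_to_bf16(x), round_to_bf16(y)))  # bf16-rounded inputs
abs_err = abs(exact - approx)
# abs_err is typically a few hundredths to tenths here, inside atol=0.5.
```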

Contributor Author:

Done, please review again

@Difers Difers force-pushed the add_fp_bf_for_dot_cross branch from ec50629 to 72d731f Compare April 3, 2023 11:58

Vvsmile commented Apr 6, 2023

LGTM

)

def init_input_output(self):
self.x = np.random.uniform(0.1, 1, [121]).astype(self.dtype)
Contributor:

self.dtype needs to be changed to np.float32; same below

Contributor Author:

While making this change, I found that in dot's bf16 backward computation the values are converted from fp32 to uint16 and then back to fp32 along the way. When the values are above 100 or so, the precision loss after the decimal point can be quite severe (specifically, in get_numeric_gradient, y_pos and y_neg end up identical because of the precision loss, so the numeric difference is 0). The subsequent change therefore uses user_defined_grads instead.
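The vanishing-numeric-gradient effect described above is easy to reproduce: around 100+ the bf16 ulp is 0.5 to 1.0, so a central difference with a small step collapses to zero after rounding. A sketch (the 0.005 step and the rounding helper are illustrative assumptions):

```python
import numpy as np

def round_to_bf16(a):
    # fp32 -> bf16 -> fp32 round trip (round-to-nearest-even assumption).
    u = np.ascontiguousarray(a, dtype=np.float32).view(np.uint32)
    bias = ((u >> 16) & 1) + np.uint32(0x7FFF)
    return (((u + bias) >> 16) << 16).view(np.float32)

x = np.array([128.0], dtype=np.float32)
delta = np.float32(0.005)  # small finite-difference step (assumption)

y_pos = round_to_bf16(x + delta)  # the bf16 ulp at 128 is 1.0, so this is 128.0
y_neg = round_to_bf16(x - delta)  # also rounds to 128.0
numeric_grad = (y_pos - y_neg) / (2 * delta)  # 0.0, although d/dx is 1
```

Since y_pos and y_neg coincide after rounding, the numeric gradient is exactly zero, which is why supplying analytic user_defined_grads is the robust choice here.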

self.initTestCase()
self.inputs = {
'X': convert_float_to_uint16(
np.random.random(self.shape).astype(self.dtype)
Contributor:

self.dtype needs to be changed to np.float32; same below

@Difers Difers force-pushed the add_fp_bf_for_dot_cross branch from 72d731f to fd01944 Compare April 7, 2023 08:54
place,
['X', 'Y'],
'Out',
user_defined_grads=[self.y / 11.0, self.x / 11.0],
Contributor:

user_defined_grads is fine to use, but don't use a magic number like 11.0; derive it from a parameter such as the shape instead
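Following the review suggestion, the divisor can be derived from the test's shape rather than hard-coded as 11.0. A sketch of the analytic gradients for a batched dot, assuming the gradient check differentiates the mean of the outputs (the (11, 11) shape mirrors this test's 121-element case; all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (11, 11)  # 11 pairs of length-11 vectors, mirroring the [121] test
x = rng.uniform(0.1, 1.0, shape).astype(np.float32)
y = rng.uniform(0.1, 1.0, shape).astype(np.float32)

out = np.sum(x * y, axis=-1)  # batched dot: Out[i] = dot(X[i], Y[i])

# d mean(Out) / dX = Y / batch_size, and symmetrically for Y.
# Deriving the divisor from the shape replaces the magic number 11.0:
grad_x = y / shape[0]
grad_y = x / shape[0]
```

Passing grad_x and grad_y as user_defined_grads keeps the check independent of the numeric differentiation that loses precision under bf16.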

@Difers Difers force-pushed the add_fp_bf_for_dot_cross branch from fd01944 to 8bb286d Compare April 10, 2023 06:18

Difers commented Apr 11, 2023

@ZzSean Done, please take another look~

@ZzSean ZzSean left a comment

LGTM

@ZzSean ZzSean merged commit 205094f into PaddlePaddle:develop Apr 13, 2023