
Fix mixed precision output type to original type #11142

Merged: 6 commits merged into apache:main on May 5, 2022

Conversation

gayatripk1
Contributor

Thanks for contributing to TVM! Please refer to the guidelines at https://tvm.apache.org/docs/contribute/ for useful information and tips. After the pull request is submitted, please request code reviews from reviewers by @-mentioning them in the pull request thread.

@gayatripk1
Contributor Author

cc @AndrewZhaoLuo @kparzysz-quic

@yangulei
Contributor

Hi, does this mean the output op can have different dtypes for its inputs, weights, and dst after AMP? If so, I'm afraid my bfloat16 support for the DNNL BYOC (#11111) might conflict with this. I implemented bfloat16 support under the assumption that the dtypes of the inputs, weights, and dst are either all float32 or all bfloat16.
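
For context, the uniformity assumption described above could be checked with a small helper like the sketch below; it is purely illustrative and not part of #11111 or this PR, and it assumes type inference has already been run on the module.

```python
# Illustrative only: check that every call in a Relay function uses one
# uniform dtype (all float32 or all bfloat16) across its arguments and
# its own result. Assumes InferType has already been run.
from tvm import relay


def assert_uniform_dtype(func):
    def visit(expr):
        if isinstance(expr, relay.Call):
            dtypes = {
                arg.checked_type.dtype
                for arg in expr.args
                if isinstance(arg.checked_type, relay.TensorType)
            }
            if isinstance(expr.checked_type, relay.TensorType):
                dtypes.add(expr.checked_type.dtype)
            # The assumption described above: no mixing of dtypes within a
            # single op.
            assert dtypes <= {"float32"} or dtypes <= {"bfloat16"}, dtypes

    relay.analysis.post_order_visit(func, visit)
```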

@comaniac
Contributor

I think this PR only changes the model output dtype, so it shouldn't affect other cases.

In addition, I'd suggest 1) exposing this behavior to users by adding a config option to PassContext, and 2) adding a unit test.
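
For illustration, a minimal sketch of how such a PassContext config could be used from the Python API. The key name "relay.ToMixedPrecision.keep_orig_output_dtype" is assumed here for the example; the actual key is whichever name this PR registers.

```python
# Sketch only: toggling the suggested behavior via a PassContext config.
# The config key name below is an assumption for illustration purposes.
import tvm
from tvm import relay

# A tiny float32 model: a single dense op.
x = relay.var("x", shape=(1, 16), dtype="float32")
w = relay.var("w", shape=(8, 16), dtype="float32")
mod = tvm.IRModule.from_expr(relay.nn.dense(x, w))
mod = relay.transform.InferType()(mod)

with tvm.transform.PassContext(
    opt_level=3,
    config={"relay.ToMixedPrecision.keep_orig_output_dtype": True},
):
    amp_mod = relay.transform.ToMixedPrecision("float16")(mod)

amp_mod = relay.transform.InferType()(amp_mod)
# With the flag enabled, the model output should stay float32 even though
# the intermediate computation was rewritten to float16.
print(amp_mod["main"].checked_type.ret_type)
```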

@yangulei
Contributor

I think this PR only changes the model output dtype, so it shouldn't affect other cases.

OK, thanks for your clarification.

@AndrewZhaoLuo AndrewZhaoLuo self-assigned this Apr 28, 2022
@AndrewZhaoLuo
Contributor

Sorry, I've been busy; I'll take a look tomorrow.

@gayatripk1
Contributor Author

@AndrewZhaoLuo @comaniac, I've exposed this behavior to users by adding a config option to PassContext and added a unit test.
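
As a rough illustration of the kind of unit test described here (not the exact test added in this PR), one could check the output dtype with the config both on and off; the config key name and helper names below are assumptions.

```python
# Rough sketch of a test along the lines described above. The config key
# "relay.ToMixedPrecision.keep_orig_output_dtype" is assumed; the exact
# key and test live in this PR.
import tvm
from tvm import relay


def _run_amp(mod, keep_orig_output_dtype):
    with tvm.transform.PassContext(
        opt_level=3,
        config={
            "relay.ToMixedPrecision.keep_orig_output_dtype": keep_orig_output_dtype
        },
    ):
        mod = relay.transform.InferType()(mod)
        mod = relay.transform.ToMixedPrecision("float16")(mod)
    return relay.transform.InferType()(mod)


def test_output_dtype_follows_config():
    x = relay.var("x", shape=(1, 16), dtype="float32")
    w = relay.var("w", shape=(8, 16), dtype="float32")
    mod = tvm.IRModule.from_expr(relay.nn.dense(x, w))

    # With the flag on, the model output keeps its original float32 dtype.
    kept = _run_amp(mod, keep_orig_output_dtype=True)
    assert kept["main"].checked_type.ret_type.dtype == "float32"

    # With the flag off, the output follows the mixed-precision dtype.
    converted = _run_amp(mod, keep_orig_output_dtype=False)
    assert converted["main"].checked_type.ret_type.dtype == "float16"
```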

@comaniac
Contributor

comaniac left a comment

LGTM. Just nits.

@AndrewZhaoLuo
Contributor

AndrewZhaoLuo left a comment

LGTM, thanks! Please fix CI, though.

@AndrewZhaoLuo
Contributor

Looks like CI is flaky; please push an empty commit to restart CI. @gayatripk1

@gayatripk1
Contributor Author

Looks like CI is flaky; please push an empty commit to restart CI. @gayatripk1

Done

@gayatripk1
Contributor Author

Is this waiting for more reviews?

@comaniac comaniac merged commit eae836c into apache:main May 5, 2022
@comaniac
Contributor

comaniac commented May 5, 2022

Thanks @gayatripk1 @AndrewZhaoLuo

shtinsa pushed a commit to Deelvin/tvm that referenced this pull request May 17, 2022
SebastianBoblest pushed a commit to SebastianBoblest/tvm that referenced this pull request May 27, 2022