[ETHOSN] Per-tensor support for int8 operations #10018

Merged 1 commit into apache:main on Jan 31, 2022

Conversation

leo-blonk (Contributor)

Per-axis quantization to follow

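As context for the change: per-tensor quantization uses a single scale and zero point for the whole tensor, while the per-axis support mentioned above would carry one scale entry per channel. Below is a minimal sketch (not taken from this PR) of how the two cases look in Relay's QNN dialect; the input shape, scale values, and channel axis are illustrative assumptions.

```python
import numpy as np
import tvm
from tvm import relay

data = relay.var("data", shape=(1, 8, 32, 32), dtype="float32")

# Per-tensor: one scalar scale and zero point for the whole tensor
# (the case this PR enables for the Ethos-N integration).
per_tensor = relay.qnn.op.quantize(
    data,
    output_scale=relay.const(np.float32(0.125)),
    output_zero_point=relay.const(np.int32(0)),
    out_dtype="int8",
)

# Per-axis: a 1-D scale with one entry per channel (axis=1 here),
# the follow-up case referred to in the description.
per_axis = relay.qnn.op.quantize(
    data,
    output_scale=relay.const(np.full((8,), 0.125, dtype="float32")),
    output_zero_point=relay.const(np.int32(0)),
    axis=1,
    out_dtype="int8",
)

mod = tvm.IRModule.from_expr(relay.Function([data], per_tensor))
print(mod)
```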
manupak (Contributor) left a comment


LGTM!

manupak merged commit 3de25b8 into apache:main on Jan 31, 2022

manupak commented Jan 31, 2022

Thanks @Leo-arm !

leo-blonk deleted the int8-on-main branch on January 31, 2022 13:24
ylc pushed a commit to ylc/tvm that referenced this pull request Feb 16, 2022