LinearInt8 layer for inference of int8-quantized LLMs and Arm intrinsics #5007
Conversation
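For context on what the PR title describes: a LinearInt8 layer stores weights as int8 with a per-output-row float scale and dequantizes during the matrix-vector product. The sketch below is a minimal reference (scalar, no Arm intrinsics); the function name, argument layout, and per-row scale scheme are illustrative assumptions, not ncnn's actual API.

```cpp
#include <cassert>
#include <cmath>
#include <cstdint>
#include <vector>

// Hypothetical int8 weight-only linear layer:
//   y[o] = scales[o] * sum_i( x[i] * w_q[o * in_features + i] )
// w_q holds quantized int8 weights; scales holds one float per output row.
std::vector<float> linear_int8(const std::vector<float>& x,      // [in_features]
                               const std::vector<int8_t>& w_q,   // [out_features * in_features]
                               const std::vector<float>& scales, // [out_features]
                               int in_features, int out_features)
{
    assert((int)x.size() == in_features);
    assert((int)w_q.size() == in_features * out_features);

    std::vector<float> y(out_features, 0.f);
    for (int o = 0; o < out_features; o++)
    {
        const int8_t* row = w_q.data() + (size_t)o * in_features;
        float sum = 0.f;
        for (int i = 0; i < in_features; i++)
            sum += x[i] * (float)row[i]; // dequantize-on-the-fly accumulate
        y[o] = sum * scales[o];          // one scale multiply per output row
    }
    return y;
}
```

An optimized version would vectorize the inner loop (e.g. with Arm NEON `vdotq` on int8 lanes, accumulating in int32 before scaling), which is the kind of intrinsics work the PR title refers to.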
Codecov Report
Attention: Patch coverage is
Additional details and impacted files:

```diff
@@            Coverage Diff             @@
##           master    #5007      +/-   ##
===========================================
- Coverage   94.72%   89.52%    -5.20%
===========================================
  Files         772      303     -469
  Lines      228777    89191  -139586
===========================================
- Hits       216705    79848  -136857
+ Misses      12072     9343    -2729
```

☔ View full report in Codecov by Sentry.
oh no, it didn't work
Closing following @nihui's instructions.
Force-pushed ae6ab89 to 307e635
Force-pushed 2aadb06 to 68b4b2d
Please provide feedback on how the patch could be improved.