
Add AVX_VNNI support for intel x86 processors #4301

Closed
freebie1101 opened this issue Dec 3, 2023 · 6 comments
Labels: enhancement (New feature or request), stale

Comments

@freebie1101

Prerequisites

Please answer the following questions for yourself before submitting an issue.

  • I am running the latest code. Development is very rapid so there are no tagged versions as of now.
  • I carefully followed the README.md.
  • I searched using keywords relevant to my issue to make sure that I am creating a new issue that is not already open (or closed).
  • I reviewed the Discussions, and have a new bug or useful enhancement to share.

Feature Description

Enable AVX_VNNI (we already have AVX512). Enabling AVX_VNNI support will allow Intel processors to use int8 computation.
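
For illustration only (this code is not from the issue): a minimal sketch of the kind of int8 kernel AVX_VNNI enables, using the 256-bit `_mm256_dpbusd_avx_epi32` intrinsic (the vpdpbusd instruction). It assumes GCC or Clang with `-mavx2 -mavxvnni`; the `dot_u8_s8` helper name is made up for the example.

```c
// Sketch only: u8 x s8 dot product via AVX-VNNI (vpdpbusd), 256-bit vectors.
// Build (assumption): gcc -O2 -mavx2 -mavxvnni example.c
#include <immintrin.h>
#include <stdint.h>
#include <stdio.h>

// n must be a multiple of 32 for this simplified sketch.
static int32_t dot_u8_s8(const uint8_t *a, const int8_t *b, int n) {
    __m256i acc = _mm256_setzero_si256();
    for (int i = 0; i < n; i += 32) {
        __m256i va = _mm256_loadu_si256((const __m256i *)(a + i));
        __m256i vb = _mm256_loadu_si256((const __m256i *)(b + i));
        // vpdpbusd: multiply u8*s8 pairs and accumulate groups of four
        // products into each 32-bit lane of acc.
        acc = _mm256_dpbusd_avx_epi32(acc, va, vb);
    }
    // Horizontal sum of the eight 32-bit lanes.
    __m128i lo = _mm256_castsi256_si128(acc);
    __m128i hi = _mm256_extracti128_si256(acc, 1);
    __m128i s  = _mm_add_epi32(lo, hi);
    s = _mm_hadd_epi32(s, s);
    s = _mm_hadd_epi32(s, s);
    return _mm_cvtsi128_si32(s);
}

int main(void) {
    uint8_t a[32];
    int8_t  b[32];
    for (int i = 0; i < 32; i++) { a[i] = 1; b[i] = 2; }
    printf("%d\n", dot_u8_s8(a, b, 32)); // prints 64
    return 0;
}
```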

Motivation

Intel x86 processors are still very popular in consumer and data center hardware. Enabling AVX_VNNI will improve the performance of llama.cpp on these processors.

freebie1101 added the enhancement (New feature or request) label Dec 3, 2023
@riverzhou

Same question.

@sorasoras

I built the Windows ROCm build of llama.cpp, and it does support VNNI.

@tikikun
Contributor

tikikun commented Dec 4, 2023

> I built the Windows ROCm build of llama.cpp, and it does support VNNI.

Why does the ROCm build support AVX_VNNI?

@ernesst

ernesst commented Dec 21, 2023

@tikikun
Contributor

tikikun commented Dec 22, 2023

^ Based on this comment, I have implemented this in PR #4589.
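
(Not the contents of PR #4589; just a hedged sketch of how code can select an AVX-VNNI path at compile time. GCC and Clang define `__AVXVNNI__` when building with `-mavxvnni`; `dot_u8_s8` refers to the hypothetical helper sketched earlier, assumed here to be declared non-static in a header.)

```c
// Hypothetical compile-time dispatch; not the code from PR #4589.
#include <stdint.h>

// Provided elsewhere (the vpdpbusd-based kernel sketched above).
int32_t dot_u8_s8(const uint8_t *a, const int8_t *b, int n);

int32_t dot_u8_s8_dispatch(const uint8_t *a, const int8_t *b, int n) {
#if defined(__AVXVNNI__)
    // AVX-VNNI path.
    return dot_u8_s8(a, b, n);
#else
    // Portable scalar fallback.
    int32_t sum = 0;
    for (int i = 0; i < n; i++) sum += (int32_t)a[i] * (int32_t)b[i];
    return sum;
#endif
}
```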


github-actions bot commented Apr 3, 2024

This issue was closed because it has been inactive for 14 days since being marked as stale.

github-actions bot closed this as completed Apr 3, 2024