
Add bitsandbytes fp4 support #7320

Status: Closed (wants to merge 1 commit)

Conversation

@thesues (Contributor) commented Aug 9, 2024

With this PR, the bitsandbytes loader can read bnb fp4 models such as PrunaAI/Einstein-v6.1-Llama3-8B-bnb-4bit-smashed.

I could not find a model that is both tiny and fast, so I did not add a new unit test.

This simple test works:

from vllm import LLM

model_id = "PrunaAI/Einstein-v6.1-Llama3-8B-bnb-4bit-smashed"

llm = LLM(
    model=model_id,
    trust_remote_code=True,
    enforce_eager=True,
    quantization="bitsandbytes",
    load_format="bitsandbytes",
    max_model_len=2048,
)
outputs = llm.generate("What is the color of prunes?")
print(outputs[0].outputs[0].text)

Output:

INFO 08-09 00:51:04 model_runner.py:733] Loading model weights took 5.6168 GB
INFO 08-09 00:51:06 gpu_executor.py:102] # GPU blocks: 6598, # CPU blocks: 2048
Processed prompts: 100%|██████████████████████████████████████████████████| 1/1 [00:00<00:00,  1.60it/s, est. speed input: 12.82 toks/s, output: 25.64 toks/s]
 They are actually dark purple to dark brownish black, with reddish hues if
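For context (this is background, not code from the PR): HF-style bitsandbytes checkpoints declare their 4-bit variant in the `quantization_config` section of `config.json`, which is how a loader can tell fp4 from nf4. A minimal sketch of reading that field, assuming the standard `quant_method` / `load_in_4bit` / `bnb_4bit_quant_type` keys; the helper name is hypothetical:

```python
def bnb_4bit_quant_type(config: dict):
    """Hypothetical helper: return the bnb 4-bit quant type declared in a
    checkpoint's config dict ("fp4" or "nf4"), or None if the checkpoint
    is not a bitsandbytes 4-bit model."""
    qc = config.get("quantization_config", {})
    if qc.get("quant_method") != "bitsandbytes":
        return None
    if not qc.get("load_in_4bit", False):
        return None
    return qc.get("bnb_4bit_quant_type")

# Example config fragment shaped like the one in fp4 checkpoints.
example = {
    "quantization_config": {
        "quant_method": "bitsandbytes",
        "load_in_4bit": True,
        "bnb_4bit_quant_type": "fp4",
    }
}
print(bnb_4bit_quant_type(example))
```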

github-actions bot commented Aug 9, 2024

👋 Hi! Thank you for contributing to the vLLM project.
Just a reminder: PRs do not trigger a full CI run by default. Instead, only the fastcheck CI runs, which consists of a small, essential subset of CI tests to quickly catch errors. You can run other CI tests on top of the default ones by unblocking the steps in your fast-check build in the Buildkite UI.

Once the PR is approved and ready to go, please make sure to run full CI as it is required to merge (or just use auto-merge).

To run full CI, you can do one of these:

  • Comment /ready on the PR
  • Add ready label to the PR
  • Enable auto-merge.

🚀

@mgoin (Collaborator) left a comment

Are there other possible types that bnb uses other than fp4 and nf4? What happens with an 8bit or int4 model?
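To illustrate the distinction the question raises (an illustrative sketch, not bitsandbytes' actual kernels): blockwise 4-bit quantization scales each block by its absmax and snaps each scaled value to the nearest entry of a 16-value codebook; fp4 and nf4 differ only in that codebook, while 8-bit and int4 models use different storage paths entirely. The sketch below uses the NF4 codebook values as published in the QLoRA paper:

```python
import numpy as np

# NF4 codebook (QLoRA paper); fp4 would substitute a different 16-value table.
NF4 = np.array([
    -1.0, -0.6961928009986877, -0.5250730514526367, -0.39491748809814453,
    -0.28444138169288635, -0.18477343022823334, -0.09105003625154495, 0.0,
    0.07958029955625534, 0.16093020141124725, 0.24611230194568634,
    0.33791524171829224, 0.44070982933044434, 0.5626170039176941,
    0.7229568362236023, 1.0,
])

def quantize_block(x: np.ndarray, codebook: np.ndarray):
    """Quantize one block: scale by absmax, snap to nearest codebook entry.
    Returns (4-bit indices, absmax scale)."""
    absmax = np.abs(x).max()
    scaled = x / absmax
    idx = np.abs(scaled[:, None] - codebook[None, :]).argmin(axis=1)
    return idx.astype(np.uint8), absmax

def dequantize_block(idx: np.ndarray, absmax: float, codebook: np.ndarray):
    """Invert quantize_block: look up codebook values and rescale."""
    return codebook[idx] * absmax

x = np.array([0.5, -0.25, 1.2, 0.0])
idx, absmax = quantize_block(x, NF4)
x_hat = dequantize_block(idx, absmax, NF4)
print(np.round(x_hat, 3))
```

Because only the codebook changes between fp4 and nf4, supporting both in a loader is mostly a matter of selecting the right lookup table, which matches the shape of this PR and of #7445.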

@thesues (Contributor, Author) commented Aug 14, 2024

Closing this PR; it is duplicated by #7445, which supports fp4.

@thesues thesues closed this Aug 14, 2024
2 participants