
[Misc]: More patch versions #9200

Closed
mikelfried opened this issue Oct 9, 2024 · 6 comments

mikelfried commented Oct 9, 2024

Anything you want to discuss about vllm.

Hi team,

It seems that vLLM 0.6.2 has some issues when used with models like Qwen2-VL, Pixtral, and others (#9068, #9091...).
Most of these bugs have already been fixed on the current dev branch, but the fixes haven't been included in a new release yet, which makes it a bit tricky to use vLLM in a production environment.

I'm really grateful for all the hard work you're putting into this project! It would be awesome if there could be more frequent version releases, especially patch versions that address bug fixes.

Thanks a lot!
Michael

Before submitting a new issue...

  • Make sure you already searched for relevant issues, and asked the chatbot living at the bottom right corner of the documentation page, which can answer lots of frequently asked questions.
mikelfried added the misc label on Oct 9, 2024

simon-mo (Collaborator) commented Oct 9, 2024

We aim to have a release every two weeks, but it is sometimes blocked by major features or model releases. We believe our nightly versions should be fairly stable; would that suffice?

russellb (Collaborator) commented Oct 9, 2024

You can find instructions on grabbing the nightly releases in a Note on this page: https://docs.vllm.ai/en/stable/getting_started/installation.html
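
For reference, a minimal sketch of what that Note describes, i.e. installing a wheel built from a specific main-branch commit. The S3 bucket URL and the wheel filename pattern below are assumptions, so copy the exact command from the linked docs page:

export VLLM_COMMIT=07c11cf4d4b9a913fa52142fe134849f1e25e393  # full commit hash from the main branch
# Sketch only: the bucket and wheel filename pattern are assumptions; check the docs Note for the exact URL.
pip install https://vllm-wheels.s3.us-west-2.amazonaws.com/${VLLM_COMMIT}/vllm-1.0.0.dev-cp38-abi3-manylinux1_x86_64.whl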

mikelfried (Author) commented

Oh, I just noticed the release of the nightly versions—thanks for that! I suppose there aren’t any nightly Docker images available, right?

russellb (Collaborator) commented Oct 9, 2024

> Oh, I just noticed the release of the nightly versions—thanks for that! I suppose there aren’t any nightly Docker images available, right?

right - I haven't seen any

ycool (Contributor) commented Oct 10, 2024

> Oh, I just noticed the release of the nightly versions—thanks for that! I suppose there aren’t any nightly Docker images available, right?

export VLLM_COMMIT=07c11cf4d4b9a913fa52142fe134849f1e25e393 # use full commit hash from the main branch
docker pull public.ecr.aws/q9t5s3a7/vllm-ci-test-repo:${VLLM_COMMIT}
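
As a usage sketch for the image above (not from the thread): the CI test image may not set the same entrypoint as the official vllm/vllm-openai image, so the OpenAI-compatible server is started explicitly here, and the model name is only an illustrative placeholder.

# Sketch only: the explicit entrypoint and the model name are assumptions, not from this thread.
docker run --gpus all -p 8000:8000 \
    public.ecr.aws/q9t5s3a7/vllm-ci-test-repo:${VLLM_COMMIT} \
    python3 -m vllm.entrypoints.openai.api_server --model Qwen/Qwen2-VL-7B-Instruct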

mikelfried (Author) commented

@ycool Thank you so much! Really appreciate it!
