
Fix nvcc not found in vllm-openai image #2781

Merged: 1 commit into vllm-project:main on Feb 22, 2024

Conversation

zhaoyang-star
Contributor

Fix #2778

@zhaoyang-star zhaoyang-star changed the title Fix nvcc not found in vlm-openai image Fix nvcc not found in vllm-openai image Feb 6, 2024
@zhaoyang-star zhaoyang-star marked this pull request as ready for review February 6, 2024 06:33
@zhuohan123 zhuohan123 requested a review from simon-mo February 19, 2024 05:09
@zhuohan123
Member

@simon-mo Could you help take a look into this?

@simon-mo simon-mo merged commit 57f0449 into vllm-project:main Feb 22, 2024
17 checks passed
@zhaoyang-star zhaoyang-star deleted the fix_nvcc branch February 23, 2024 00:43
xjpang pushed a commit to xjpang/vllm that referenced this pull request Mar 4, 2024
@chiragjn
Contributor

This seems to have regressed on 0.4.1.
Just curious: do we really need to check the nvcc version, or just the CUDA runtime version?

@youkaichao
Member

Just curious: do we really need to check the nvcc version, or just the CUDA runtime version?

nvcc is required at compile time, not at runtime. I think it is not a problem if the vllm-openai image has no nvcc.
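To illustrate the distinction, here is a minimal sketch of probing for the nvcc compiler at runtime; the helper name `nvcc_version` is hypothetical and not part of vLLM:

```python
import shutil
import subprocess

def nvcc_version():
    """Return the nvcc release string (e.g. '12.1') if nvcc is on PATH, else None."""
    nvcc = shutil.which("nvcc")
    if nvcc is None:
        return None  # compiler absent: fine for runtime-only images
    out = subprocess.run([nvcc, "--version"], capture_output=True, text=True).stdout
    for line in out.splitlines():
        # nvcc prints e.g. 'Cuda compilation tools, release 12.1, V12.1.105'
        if "release" in line:
            return line.split("release")[-1].split(",")[0].strip()
    return None
```

In an image without the CUDA toolkit this returns `None` rather than raising, which is the behavior a runtime-only container would want.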

@chiragjn
Contributor

Then --kv_cache_dtype fp8 should not check for nvcc; currently it is not usable in the 0.4.1 openai image.

@zhaoyang-star
Contributor Author

I think it is not a problem if vllm-openai image has no nvcc.

Yes. We could disable the cuda version check.
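One way such a check could be made non-fatal is to downgrade it to a warning when nvcc is absent. This is a hedged sketch under the assumption of a hypothetical `check_cuda_version` helper, not vLLM's actual code:

```python
import shutil
import subprocess
import warnings

def check_cuda_version(required_major=11):
    """Hypothetical guard: warn and skip, instead of failing, when nvcc is absent."""
    nvcc = shutil.which("nvcc")
    if nvcc is None:
        warnings.warn("nvcc not found; skipping the CUDA compiler version check")
        return False  # check skipped, not failed
    out = subprocess.run([nvcc, "--version"], capture_output=True, text=True).stdout
    # parse the major version from 'release X.Y' in nvcc's output
    for line in out.splitlines():
        if "release" in line:
            major = int(line.split("release")[-1].split(".")[0].strip())
            return major >= required_major
    return False
```

With this shape, a runtime-only image boots cleanly while development images still get the version check.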


Successfully merging this pull request may close these issues.

Error with docker container vllm/vllm-openai:v0.3.0 when --kv-cache-dtype=fp8_e5m2