[Installation]: vLLM build from source errors #8532
Error Messages:
Looks like some CUTLASS error? cc @tlrmchlsmth

The user actually already has CUTLASS. Maybe this caused some conflict?
Thanks for providing some quick insights here. I added a question: could it be due to these lines in the CMakeLists.txt file?

I haven't seen this before, but it looks like something might be going wrong with FetchContent. @Imss27 is your machine connected to the internet? It needs to clone CUTLASS during the build process. Also try commenting out the GIT_SHALLOW call in case it is causing problems. @youkaichao I don't think CUTLASS already being there is what's causing this issue.
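For context, a FetchContent block for CUTLASS typically looks like the sketch below. This is not vLLM's exact CMakeLists.txt; the tag is a placeholder. Commenting out GIT_SHALLOW, as suggested above, forces a full (non-shallow) clone, which can sidestep shallow-fetch failures behind some proxies:

```cmake
# Hedged sketch (not vLLM's actual file): typical FetchContent usage for CUTLASS.
include(FetchContent)
FetchContent_Declare(
  cutlass
  GIT_REPOSITORY https://github.com/NVIDIA/cutlass
  GIT_TAG        v3.5.0        # placeholder; use whatever tag the project pins
  # GIT_SHALLOW  TRUE          # try disabling this if the fetch fails
)
FetchContent_MakeAvailable(cutlass)
```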
@tlrmchlsmth Thank you. Yes, my computer is connected to the internet. I tried ping and curl as follows. Could this be a possible reason?

Can you try to rule out compiler out-of-memory? Either check
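One way to rule out compiler out-of-memory is to cap build parallelism. This sketch assumes roughly 4 GB of RAM per parallel compile job (an assumption, not a vLLM requirement); vLLM's build reads the MAX_JOBS environment variable to limit parallel jobs:

```shell
# Hedged sketch: derive a conservative MAX_JOBS from available memory.
# Assumption: ~4 GB of RAM per parallel nvcc/compile job.
mem_kb=$(awk '/MemTotal/ {print $2}' /proc/meminfo)
jobs=$(( mem_kb / 4194304 ))          # 4 GB expressed in kB
[ "$jobs" -lt 1 ] && jobs=1           # always allow at least one job
export MAX_JOBS=$jobs
echo "MAX_JOBS=$MAX_JOBS"
# then rebuild: pip install -e .
```

If the build stops dying only when MAX_JOBS is small, the failure was almost certainly memory pressure rather than a CUTLASS problem.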
It's a git repo (https://github.com/NVIDIA/cutlass). You should be able to
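Since FetchContent has to clone that repo during the build, it is worth checking that the build host can reach it at all. A minimal connectivity probe, with the network call guarded so it reports rather than aborts:

```shell
# Hedged sketch: probe reachability of the CUTLASS repo without a full clone.
git --version
timeout 15 git ls-remote https://github.com/NVIDIA/cutlass HEAD \
  || echo "cannot reach github.com - check network/proxy (e.g. inside WSL)"
```

If ls-remote fails here, the FetchContent step inside the build will fail the same way.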
Still not sure what's going on here though. Does deleting the following help? (We've run into trouble with it previously, hence the long comment)
Might this be caused by WSL?
Thank you all for the great suggestions and insights! @youkaichao @tlrmchlsmth @zifeitong Previously I tried to set
For future reference: when using WSL, if you encounter similar issues, try a conservative approach like
(Note: this will take an extremely long time to build.) This solved the build issue I encountered, even though the CUTLASS git error still appeared:
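The "conservative approach" described above can be sketched as a fully serialized build. MAX_JOBS and NVCC_THREADS are environment variables the vLLM build honors; setting both to 1 trades speed for memory headroom, which is why the build takes so long:

```shell
# Hedged sketch: a minimal-memory, single-job build (very slow, as noted above).
export MAX_JOBS=1       # one compilation job at a time
export NVCC_THREADS=1   # one thread per nvcc invocation
echo "MAX_JOBS=$MAX_JOBS NVCC_THREADS=$NVCC_THREADS"
# then: pip install -e .
```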
Is there any solution aside from setting

@yxchng you should check https://docs.vllm.ai/en/latest/getting_started/installation.html#python-only-build-without-compilation; then you don't need to compile anything.
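The Python-only build that docs page describes reuses the compiled kernels from a prebuilt wheel and installs only the Python sources, so no local compilation happens. A sketch, assuming the VLLM_USE_PRECOMPILED flag named in those installation docs:

```shell
# Hedged sketch of the python-only build: skip local kernel compilation
# by reusing a precompiled wheel's binaries.
export VLLM_USE_PRECOMPILED=1
echo "VLLM_USE_PRECOMPILED=$VLLM_USE_PRECOMPILED"
# then: pip install -e .
```

This only works when your target commit is binary-compatible with an available wheel; otherwise a full build is still required.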
@youkaichao I want to use the latest commit #10584, which resolved the full context length problem of Gemma. Does this only require the Python-only build?

Then you don't even need to build anything; just use https://docs.vllm.ai/en/latest/getting_started/installation.html#install-the-latest-code:
Your current environment
How you are installing vllm
pip install -e .