
[Installation]: M2 Mac Dependency Torch 2.1.2 (Incompatible) #5457

Closed

velocity33 opened this issue Jun 12, 2024 · 9 comments
Labels
installation Installation problems

Comments

@velocity33

Your current environment

PyTorch version: 2.3.1
Is debug build: False
CUDA used to build PyTorch: None
ROCM used to build PyTorch: N/A

OS: macOS 14.1.1 (arm64)
GCC version: Could not collect
Clang version: 15.0.0 (clang-1500.0.40.1)
CMake version: version 3.29.5
Libc version: N/A

Python version: 3.12.3 | packaged by Anaconda, Inc. | (main, May 6 2024, 14:46:42) [Clang 14.0.6 ] (64-bit runtime)
Python platform: macOS-14.1.1-arm64-arm-64bit
Is CUDA available: False
CUDA runtime version: No CUDA
CUDA_MODULE_LOADING set to: N/A
GPU models and configuration: No CUDA
Nvidia driver version: No CUDA
cuDNN version: No CUDA
HIP runtime version: N/A
MIOpen runtime version: N/A
Is XNNPACK available: True

CPU:
Apple M2 Max

Versions of relevant libraries:
[pip3] numpy==1.26.4
[pip3] torch==2.3.1
[pip3] transformers==4.41.2
[conda] numpy 1.26.4 pypi_0 pypi
[conda] torch 2.3.1 pypi_0 pypi
[conda] transformers 4.41.2 pypi_0 pypi
ROCM Version: Could not collect
Neuron SDK Version: N/A
vLLM Version: N/A
vLLM Build Flags:
CUDA Archs: Not Set; ROCm: Disabled; Neuron: Disabled
GPU Topology:
Could not collect

How you are installing vllm

git clone https://github.com/vllm-project/vllm.git
cd vllm
git fetch origin pull/2244/head:pr-2244
git checkout pr-2244

VLLM_TARGET_DEVICE=cpu python3 setup.py build
VLLM_TARGET_DEVICE=cpu python3 setup.py install
# Seems to give an error the first time.
# error: CUDA_HOME environment variable is not set. Please set it to your CUDA install root.

# I can't reproduce the error if I run it again. It does seem to run.
VLLM_TARGET_DEVICE=cpu python3 setup.py install

# note that the following error message occurs

ERROR: Could not find a version that satisfies the requirement torch==2.1.2 (from versions: 2.2.0, 2.2.1, 2.2.2, 2.3.0, 2.3.1)
ERROR: No matching distribution found for torch==2.1.2

# I have also found issues installing triton, so I built it from source, but I similarly run into issues with "sentencepiece" requiring "ray", which requires "torch==2.1.2" (see the version check below).

I am installing vLLM to install SGLang.
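
For reference, a rough way to check which torch wheels pip can actually resolve for this interpreter (pip's index subcommand is experimental, so the output format may vary):

# Show the interpreter version and architecture pip is resolving for
python3 -c "import sys, platform; print(sys.version.split()[0], platform.machine())"

# List the torch versions visible to pip on this platform (experimental subcommand)
python3 -m pip index versions torch

With Python 3.12 there are no torch 2.1.x wheels at all (Python 3.12 support started in torch 2.2), which would explain why pip only offers 2.2.0 and newer here and refuses torch==2.1.2.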
@velocity33 velocity33 added the installation Installation problems label Jun 12, 2024
@DarkLight1337
Member

DarkLight1337 commented Jun 12, 2024

According to the logs, you currently have torch=2.3.1 installed. The latest version of vLLM (v0.5.0) requires torch==2.3.0, so you'll have to downgrade PyTorch to that version.

Edit: I see that you're trying to use vLLM at #2244 specifically. In that case you can check the required PyTorch version in pyproject.toml.
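
For example, something along these lines should show the pin on your checkout (exact file names can differ between branches):

# Look for the pinned torch version in the build metadata
grep -n "torch" pyproject.toml requirements*.txt

# Then install that exact version before building, e.g.
pip install "torch==<version pinned above>"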

@velocity33
Author

Thank you for this suggestion. I have downgraded my Python version to 3.9 and am working through the installation again (see the sketch below). I will see if any other issues pop up.
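
For context, I'm doing this in a conda environment, roughly like so (the environment name below is just a placeholder):

# create and activate a Python 3.9 environment; the name is arbitrary
conda create -n vllm-cpu python=3.9
conda activate vllm-cpu
python -m pip install --upgrade pip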

@velocity33
Author

velocity33 commented Jun 12, 2024

Following the downgrade, I have encountered a new issue regarding packaging. Is this a problem with my environment?

Traceback (most recent call last):
  File "/Users/velocity/Documents/Holder/Business/_____/vllm/setup.py", line 12, in <module>
    from packaging.version import parse, Version
ModuleNotFoundError: No module named 'packaging'

@DarkLight1337
Member

DarkLight1337 commented Jun 12, 2024

Try downgrading setuptools to setuptools<70.0.0.
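
i.e. something like:

pip install "setuptools<70.0.0"

(The quotes stop the shell from treating < as a redirect.)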

@velocity33
Author

I'm running setuptools 69.0.3 within my conda environment now. Unfortunately, I get the same error.

@DarkLight1337
Member

Try updating your pip version

python -m pip install --upgrade pip

@velocity33
Author

I'm running the latest pip version, 24.0. What is odd is that this was not an issue prior to downgrading Python.

@velocity33
Author

velocity33 commented Jun 12, 2024

Downgrading to setuptools==69.5.1 specifically solved this issue.
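
For anyone who lands here with the same problem, the rough sequence that worked in my environment was approximately the following (your checkout may still need other build dependencies):

pip install setuptools==69.5.1
VLLM_TARGET_DEVICE=cpu python3 setup.py install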

@DarkLight1337
Member

Odd that you had to use this specific version. In any case, glad that it has been solved.
