
ERROR: Could not build wheels for flash_attn, which is required to install pyproject.toml-based projects #875

Open
lvsh2012 opened this issue Mar 7, 2024 · 2 comments

Comments

lvsh2012 commented Mar 7, 2024

Building wheels for collected packages: flash_attn
Building wheel for flash_attn (setup.py) ... error
error: subprocess-exited-with-error

× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [19 lines of output]

  torch.__version__  = 2.2.1+cu121


  /mnt/conda/envs/qwen/lib/python3.10/site-packages/setuptools/__init__.py:80: _DeprecatedInstaller: setuptools.installer and fetch_build_eggs are deprecated.
  !!

          ********************************************************************************
          Requirements should be satisfied by a PEP 517 installer.
          If you are using pip, you can try `pip install --use-pep517`.
          ********************************************************************************

  !!
    dist.fetch_build_eggs(dist.setup_requires)
  running bdist_wheel
  Guessing wheel URL:  https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.6/flash_attn-2.5.6+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
  Raw wheel path /tmp/pip-wheel-28o_d5yx/flash_attn-2.5.6-cp310-cp310-linux_x86_64.whl
  error: [Errno 18] Invalid cross-device link: 'flash_attn-2.5.6+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl' -> '/tmp/pip-wheel-28o_d5yx/flash_attn-2.5.6-cp310-cp310-linux_x86_64.whl'
  [end of output]

note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for flash_attn
Running setup.py clean for flash_attn
Failed to build flash_attn
ERROR: Could not build wheels for flash_attn, which is required to install pyproject.toml-based projects
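For context on the [19 lines of output] above: flash_attn's setup.py first tries to download a prebuilt wheel (the "Guessing wheel URL:" line) and then moves it into pip's temporary wheel directory. Errno 18 is EXDEV ("Invalid cross-device link"), which os.rename raises whenever source and destination sit on different filesystems, e.g. a build directory on one mount and /tmp on another. A minimal sketch of the failure and the usual workaround, with the paths taken from the log above (both locations must exist for it to run):

    import errno
    import os
    import shutil

    src = "flash_attn-2.5.6+cu122torch2.2cxx11abiFALSE-cp310-cp310-linux_x86_64.whl"
    dst = "/tmp/pip-wheel-28o_d5yx/flash_attn-2.5.6-cp310-cp310-linux_x86_64.whl"

    try:
        # os.rename() is a thin wrapper over rename(2); it cannot move a file
        # between filesystems and fails with OSError, errno 18 (EXDEV).
        os.rename(src, dst)
    except OSError as exc:
        if exc.errno != errno.EXDEV:
            raise
        # shutil.move() detects the cross-device case and falls back to
        # copy-then-delete, so it succeeds across mount points.
        shutil.move(src, dst)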

elkay commented Mar 7, 2024

Same, but for a custom build to run on AWS Linux.

torch.__version__ = 2.3.0a0+git26431db

Everything else works; I just can't get exllamav2 to use flash_attn, even when simply installing it from pip (a non-source install). I was hoping a build from source would fix the issue.

@Revliter


Same error; temporarily fixed by using shutil.move(wheel_filename, wheel_path) instead of os.rename(src, dst) in setup.py, as mentioned by @CliuGeek9229 in #598 (comment).
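For anyone applying that workaround by hand before a release picks it up, a minimal sketch of the change, wrapped in a standalone function; the download step and the function name are illustrative assumptions, and only the shutil.move-for-os.rename swap comes from the comment above:

    import shutil
    import urllib.request

    def fetch_prebuilt_wheel(wheel_url: str, wheel_filename: str, wheel_path: str) -> None:
        # Download the prebuilt wheel, as in the "Guessing wheel URL:" step.
        urllib.request.urlretrieve(wheel_url, wheel_filename)
        # Previously: os.rename(wheel_filename, wheel_path), which fails with
        # Errno 18 (EXDEV) when wheel_path is on a different filesystem.
        # shutil.move() copies then deletes in that case, so it works across
        # mount points.
        shutil.move(wheel_filename, wheel_path)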
