ERROR: Could not build wheels for flash_attn, which is required to install pyproject.toml-based projects #875
Comments
Same here, but for a custom build running on AWS Linux, with torch.__version__ = 2.3.0a0+git26431db. Everything else works; I just can't get exllamav2 to use flash_attn, even when installing from pip (a non-source install). I was hoping a build from source would fix the issue.
Same error. A temporary fix is to use shutil.move(wheel_filename, wheel_path) instead of os.rename(src, dst) in setup.py, as mentioned by @CliuGeek9229 in #598 (comment).
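For context on that workaround: os.rename fails with an OSError ("Invalid cross-device link") when source and destination are on different filesystems, which can happen when pip builds the wheel in a temporary directory on another device; shutil.move falls back to copy-then-delete in that case. Below is a minimal, self-contained sketch of the substitution (the file names here are made up for illustration; in practice you would change the os.rename call inside flash_attn's setup.py):

```python
import os
import shutil
import tempfile

def place_wheel(wheel_filename, wheel_path):
    """Move a built wheel into its destination directory.

    Uses shutil.move rather than os.rename: os.rename raises OSError
    when src and dst are on different filesystems, while shutil.move
    transparently falls back to a copy followed by a delete.
    """
    shutil.move(wheel_filename, wheel_path)

# Illustrative usage with throwaway temp directories:
src_dir = tempfile.mkdtemp()
dst_dir = tempfile.mkdtemp()
src = os.path.join(src_dir, "flash_attn-2.5.6-py3-none-any.whl")
open(src, "w").close()  # stand-in for the freshly built wheel
dst = os.path.join(dst_dir, "flash_attn-2.5.6-py3-none-any.whl")

place_wheel(src, dst)
print(os.path.exists(dst) and not os.path.exists(src))  # True
```

Within one filesystem the two calls behave the same, so the substitution is safe either way.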
Building wheels for collected packages: flash_attn
Building wheel for flash_attn (setup.py) ... error
error: subprocess-exited-with-error
× python setup.py bdist_wheel did not run successfully.
│ exit code: 1
╰─> [19 lines of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
ERROR: Failed building wheel for flash_attn
Running setup.py clean for flash_attn
Failed to build flash_attn
ERROR: Could not build wheels for flash_attn, which is required to install pyproject.toml-based projects