Adding flash attention to one click installer #4015
Comments
I agree, considering the instructions to get flash attention working are vague as hell and assume the user has years of tech school and is a computer guru.
Flash-attention 2 doesn't work on Windows for now. I have tried building it, but no luck so far.
Hey! I recognize Panchovix. If he can't get it to build, I'm giving up right now. I hope someone with a PhD in Python bullshit saves the day.
2023-09-30 12:29:14 WARNING:You are running ExLlamaV2 without flash-attention. This will cause the VRAM usage to be a lot higher than it could be.

Does this manual work on Windows?
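The warning above fires when the `flash_attn` package can't be imported. A minimal sketch of that check, assuming only that the package is installed under the name `flash_attn` (the function name here is illustrative, not the loader's actual code):

```python
# Hedged sketch: detect whether flash-attention is importable,
# which is roughly what a loader would test before warning.
import importlib.util


def flash_attn_available() -> bool:
    """Return True if the flash_attn package can be imported."""
    return importlib.util.find_spec("flash_attn") is not None


if not flash_attn_available():
    # Mirrors the spirit of the ExLlamaV2 warning above.
    print("WARNING: flash-attention not found; VRAM usage will be higher.")
```

Running this in your environment tells you whether the warning will appear, without loading a model first.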
I'm running Windows, and this is the manual I was using for installing Flash Attention 2. After it complained about my CUDA version not matching my PyTorch CUDA version, I installed the correct one (11.7 for me) and uninstalled CUDA 12, but then I ran into a different error. The alternative it gives, cloning the repo and running the setup file, results in a very similar error. Right now it seems it just isn't ready for Windows.
Maybe Oobabooga should either suppress this message or add a clarification, at least on Windows?
Yeah, it doesn't work on Windows right now, only on Linux (and maybe macOS), I think. We'll have to see which comes first: flash attention supporting Windows, or Oobabooga giving a statement.
I managed to build it on Windows, but you will need CUDA 12.1 and torch+cu121, else it won't compile. More info: Dao-AILab/flash-attention#595
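The CUDA build tag the comment above mentions is carried in the local segment of the torch version string (e.g. `2.1.0+cu121`), so you can check your wheel before attempting the build. A small sketch, assuming that convention; the version strings and helper name here are examples, not part of flash-attention:

```python
# Hedged sketch: extract the CUDA build tag from a torch version
# string so you can confirm you have a +cu121 wheel before building
# flash-attention on Windows. Version strings below are illustrative.
from typing import Optional


def cuda_tag(torch_version: str) -> Optional[str]:
    """Return the local tag (e.g. 'cu121') from 'X.Y.Z+cuNNN', else None."""
    _, sep, local = torch_version.partition("+")
    return local if sep else None


print(cuda_tag("2.1.0+cu121"))  # cu121 -> matches the reported requirement
print(cuda_tag("2.0.1+cu117"))  # cu117 -> mismatch; the compile would fail
```

In a real environment you would pass `torch.__version__` instead of a literal string.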
See #4235
This issue has been closed due to inactivity for 6 weeks. If you believe it is still relevant, please leave a comment below. You can tag a developer in your comment.
Description
Adding flash attention to one click installer, for usage with exllamaV2
Additional Context
Other not-so-tech-savvy people and I are having issues installing it manually on Windows.