No operator found for this attention when attn_bias is a torch.Tensor #576
Comments
It works if …
That's correct.
hmm. I'm using …
So this isn't possible with xformers then?
If not, then maybe update the docs to say that …
I can see that FlashAttention has attention bias on their roadmap. Is that the case for xformers too?
Not at the moment. We have a few customers asking for it as well, and it's on our radar, but it likely won't happen for some time. xFormers uses Flash Attention depending on the input configuration, so it will be supported in xFormers if/when Flash Attention implements it.
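In the meantime, one possible workaround (a minimal sketch, not something proposed in this thread, assuming what's needed is an arbitrary dense additive bias) is to fall back to plain PyTorch attention for the calls that require a tensor bias:

```python
import math
import torch

def attention_with_dense_bias(q, k, v, attn_bias):
    # q, k, v:   [batch, heads, seq_len, head_dim]
    # attn_bias: broadcastable to [batch, heads, seq_len, seq_len]
    # Naive attention that accepts an arbitrary additive bias; it materializes
    # the full score matrix, which is exactly what the fused kernels avoid.
    scores = q @ k.transpose(-2, -1) / math.sqrt(q.shape[-1])
    scores = scores + attn_bias
    return torch.softmax(scores, dim=-1) @ v
```

This trades memory for flexibility, so it is only practical for moderate sequence lengths.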
Cool. Shall we keep this open for tracking purposes until a PR that fixes this is merged?
Sure - let me just rename it, though.
(Issue renamed to: No operator found for this attention when attn_bias is a torch.Tensor)
@danthe3rd Hi, is there a solution to this problem, please?
I expect this to work once this PR is merged: …
My team has a similar use case and I've proposed a change; you might be interested in following issue #640.
🐛 Bug
I get the error:
No operator found for this attention
Command
Run the code below
To Reproduce
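A hypothetical minimal example of the kind of call that triggers the error (the shapes, dtypes, and bias layout below are illustrative assumptions, not the original reproduction):

```python
import torch
import xformers.ops as xops

B, H, M, K = 2, 8, 128, 64  # batch, heads, sequence length, head dim (illustrative)

q = torch.randn(B, M, H, K, device="cuda", dtype=torch.float16)
k = torch.randn(B, M, H, K, device="cuda", dtype=torch.float16)
v = torch.randn(B, M, H, K, device="cuda", dtype=torch.float16)

# A dense additive bias passed as a plain torch.Tensor
# (e.g. a relative-position or ALiBi-style bias).
attn_bias = torch.randn(B * H, M, M, device="cuda", dtype=torch.float16)

# Fails with "No operator found for this attention" when no backend
# supports a tensor-valued attn_bias for these inputs.
out = xops.memory_efficient_attention(q, k, v, attn_bias=attn_bias)
```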
Expected behavior
I expect this to work.
Environment