```
Error occurred when executing CCSR_Upscale:

No operator found for `memory_efficient_attention_forward` with inputs:
    query : shape=(5, 4096, 1, 64) (torch.float16)
    key   : shape=(5, 4096, 1, 64) (torch.float16)
    value : shape=(5, 4096, 1, 64) (torch.float16)
    attn_bias :
    p : 0.0
`decoderF` is not supported because:
    xFormers wasn't build with CUDA support
    attn_bias type is
    operator wasn't built - see `python -m xformers.info` for more info
`flshattF@0.0.0` is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see `python -m xformers.info` for more info
`tritonflashattF` is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see `python -m xformers.info` for more info
    triton is not available
`cutlassF` is not supported because:
    xFormers wasn't build with CUDA support
    operator wasn't built - see `python -m xformers.info` for more info
`smallkF` is not supported because:
    max(query.shape[-1] != value.shape[-1]) > 32
    xFormers wasn't build with CUDA support
    dtype=torch.float16 (supported: {torch.float32})
    operator wasn't built - see `python -m xformers.info` for more info
    unsupported embed per head: 64
```
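Every backend in the error fails for the same reason: the installed xFormers wheel was built without CUDA support. Before reinstalling anything, it helps to confirm what the environment actually has. This is a minimal check in plain PyTorch (no xFormers needed); it only reports versions and GPU visibility, it doesn't change anything:

```python
import torch

# The error says "xFormers wasn't build with CUDA support": either the
# installed xFormers wheel is a CPU-only build, or PyTorch itself cannot
# see a CUDA device. This prints what the environment actually provides.
print("torch version :", torch.__version__)
print("CUDA available:", torch.cuda.is_available())
if torch.cuda.is_available():
    print("CUDA device   :", torch.cuda.get_device_name(0))
```

If `CUDA available` is `True` here but xFormers still fails, the xFormers wheel itself is the CPU-only build; `python -m xformers.info` (the command the error message points to) lists exactly which operators were compiled in. Reinstalling xFormers with a wheel that matches your PyTorch/CUDA version typically resolves it.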
What should I do? I don't have much experience with Python or installing things from the console, only basic commands for git and such, so I don't understand this error at all 🥲
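If reinstalling xFormers isn't an option, note that PyTorch 2.0+ ships the same computation built in as `torch.nn.functional.scaled_dot_product_attention`, which needs no xFormers at all. The sketch below is illustrative only (random tensors, sequence length shortened from 4096 to 128 and float32 instead of float16 so it runs on CPU), but it uses the exact tensor layout from the error:

```python
import torch
import torch.nn.functional as F

# Shapes from the error: (batch=5, seq=4096, heads=1, head_dim=64).
# Shortened seq and float32 here so the example runs on CPU.
q = torch.randn(5, 128, 1, 64)
k = torch.randn(5, 128, 1, 64)
v = torch.randn(5, 128, 1, 64)

# xFormers uses (batch, seq, heads, dim); PyTorch's SDPA expects
# (batch, heads, seq, dim), so transpose in and back out.
out = F.scaled_dot_product_attention(
    q.transpose(1, 2),
    k.transpose(1, 2),
    v.transpose(1, 2),
    dropout_p=0.0,  # matches "p : 0.0" in the error
).transpose(1, 2)

print(out.shape)  # torch.Size([5, 128, 1, 64])
```

Whether the node that raised the error can actually use this fallback depends on the extension's code; the point is that the attention operation itself does not require a CUDA build of xFormers.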