[Pallas] Allow setting FlashAttention's causal mask #6480

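The PR title indicates a user-facing switch for enabling a causal mask in the Pallas FlashAttention kernel exposed through torch_xla. A minimal usage sketch, assuming the experimental `flash_attention` wrapper in `torch_xla.experimental.custom_kernel` accepts a boolean `causal` argument (parameter name and shapes assumed, not taken from this page):

```python
# Minimal sketch: calling the Pallas FlashAttention kernel with a causal mask.
# Assumes torch_xla's experimental `flash_attention` wrapper accepts a boolean
# `causal` argument (assumption based on the PR title).
import torch
import torch_xla.core.xla_model as xm
from torch_xla.experimental.custom_kernel import flash_attention

device = xm.xla_device()

# (batch, num_heads, seq_len, head_dim) query/key/value tensors on the XLA device.
q = torch.randn(1, 4, 128, 64, device=device)
k = torch.randn(1, 4, 128, 64, device=device)
v = torch.randn(1, 4, 128, 64, device=device)

# With causal=True the kernel masks attention to future positions,
# so position i only attends to positions <= i.
out = flash_attention(q, k, v, causal=True)
print(out.shape)  # torch.Size([1, 4, 128, 64])
```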
Annotations: 1 warning

GPU tests / test (python_tests, torch_mp_op): succeeded Mar 21, 2024 in 2h 5m 58s