Releases: lucidrains/flash-attention-jax

0.3.1

18 Jul 02:44
small simplification

0.3.0

18 Jun 17:52
make sure causal flash attention can have batch and head dimensions
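A minimal O(n²) reference for what this release note describes — causal attention applied over inputs that carry explicit batch and head dimensions. This is an illustrative sketch, not this repo's API: the function name, shapes, and use of a plain materialized attention matrix (rather than the memory-efficient flash recurrence) are all assumptions for clarity.

```python
import jax
import jax.numpy as jnp

def causal_attention_ref(q, k, v):
    # q, k, v: (batch, heads, seq, dim) -- plain quadratic reference,
    # not the tiled flash-attention kernel
    scale = q.shape[-1] ** -0.5
    sim = jnp.einsum('b h i d, b h j d -> b h i j', q, k) * scale

    # causal mask: position i may only attend to positions j <= i
    i, j = q.shape[-2], k.shape[-2]
    mask = jnp.tril(jnp.ones((i, j), dtype=bool))
    sim = jnp.where(mask, sim, -jnp.inf)

    attn = jax.nn.softmax(sim, axis=-1)
    return jnp.einsum('b h i j, b h j d -> b h i d', attn, v)

keys = jax.random.split(jax.random.PRNGKey(0), 3)
q, k, v = (jax.random.normal(kk, (2, 4, 8, 16)) for kk in keys)
out = causal_attention_ref(q, k, v)
print(out.shape)  # (2, 4, 8, 16) -- batch and head dims pass through
```

The flash variant computes the same result blockwise without ever materializing the full `(i, j)` similarity matrix; a dense reference like this is still useful for correctness checks.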

0.2.0

01 Nov 18:08

0.1.0

29 Sep 22:13
release 0.1.0

0.0.10

21 Sep 16:29
add batch and multihead support for non-causal flash attention, for @…

0.0.9

21 Aug 00:38
fix cosine sim attention precision

0.0.8

20 Aug 17:15
fix for cosine sim attention

0.0.7

27 Jul 17:29
offer cosine sim flash attention variant
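The cosine-sim variant replaces the usual `dim ** -0.5` scaled dot product with the cosine similarity of l2-normalized queries and keys, multiplied by a fixed scale. A sketch of the idea, with assumed details: the function names and the scale value `16.0` are illustrative, not necessarily what this repo uses.

```python
import jax
import jax.numpy as jnp

def l2norm(t, eps=1e-12):
    # normalize along the feature dimension
    return t / (jnp.linalg.norm(t, axis=-1, keepdims=True) + eps)

def cosine_sim_attention_ref(q, k, v, scale=16.0):
    # after normalization, q . k is a cosine similarity in [-1, 1],
    # so logits are bounded by the fixed scale regardless of dim
    q, k = l2norm(q), l2norm(k)
    sim = jnp.einsum('... i d, ... j d -> ... i j', q, k) * scale
    attn = jax.nn.softmax(sim, axis=-1)
    return jnp.einsum('... i j, ... j d -> ... i d', attn, v)
```

Because the logits are bounded, this variant is less prone to overflow than raw dot-product attention, which makes it attractive for lower-precision accumulation.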

0.0.6

23 Jul 18:07
771422b

v0.0.5

22 Jul 22:25
make sure causal flash attention works even if key and query lengths …