Releases: lucidrains/x-transformers

0.4.2 (28 Dec 03:25)
fix bug with residual attention

0.4.1 (28 Dec 03:18)
for post-normalization, let the wrapper take care of the final normalization
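
A minimal sketch of the post-norm configuration this release touches, assuming it is selected with a `pre_norm` flag on the attention layers (the flag name is an assumption); the wrapper now applies the final normalization itself:

```python
import torch
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8,
        pre_norm = False   # post-norm residual blocks instead of the default pre-norm (assumed flag)
    )
)

x = torch.randint(0, 20000, (1, 1024))
logits = model(x)  # (1, 1024, 20000)
```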

0.4.0 (28 Dec 00:10)
add residual attention, from the RealFormer paper
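
A usage sketch for residual attention (RealFormer), assuming it is enabled with the `residual_attn` keyword documented for the attention layers:

```python
import torch
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8,
        residual_attn = True   # add a residual connection on the pre-softmax attention scores across layers
    )
)

x = torch.randint(0, 20000, (1, 1024))
logits = model(x)  # (1, 1024, 20000)
```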

0.3.5 (15 Dec 00:21)
apply GLU gating to the attention layer output, without the queries
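
This looks like the attention-on-attention style output gating; a rough sketch assuming it is surfaced through the `attn_on_attn` flag (mapping this release to that flag is an assumption):

```python
import torch
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8,
        attn_on_attn = True   # project attention output to 2x dim and gate with a GLU (no query concat, per this release)
    )
)

x = torch.randint(0, 20000, (1, 1024))
logits = model(x)  # (1, 1024, 20000)
```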

0.3.4 (14 Dec 07:35)
add a cross-attention-only attention layer (CrossAttender)
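
A CrossAttender usage sketch, close to the README example; shapes are illustrative:

```python
import torch
from x_transformers import CrossAttender

model = CrossAttender(
    dim = 512,
    depth = 6
)

nodes = torch.randn(1, 1, 512)           # queries
neighbors = torch.randn(1, 5, 512)       # context to cross attend to
neighbor_mask = torch.ones(1, 5).bool()  # mask over the context

out = model(nodes, context = neighbors, context_mask = neighbor_mask)  # (1, 1, 512)
```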

0.3.3 (14 Dec 07:29)
allow attention layers to use only cross attention
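
CrossAttender (0.3.4 above) is a convenience wrapper over this capability; a sketch assuming it is exposed through `cross_attend` and `only_cross` kwargs on the attention layers (the `only_cross` name is an assumption):

```python
import torch
from x_transformers import Encoder

layers = Encoder(
    dim = 512,
    depth = 4,
    heads = 8,
    cross_attend = True,  # add cross-attention blocks
    only_cross = True     # drop self-attention, keeping only cross attention + feedforward
)

x = torch.randn(1, 16, 512)
context = torch.randn(1, 64, 512)
out = layers(x, context = context)  # (1, 16, 512)
```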

0.3.2 (14 Dec 07:04)
allow encoder to cross attend
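
A sketch of an encoder that cross attends to external embeddings, assuming the `cross_attend` flag and that the wrapper forwards `context` through to the attention layers:

```python
import torch
from x_transformers import TransformerWrapper, Encoder

model = TransformerWrapper(
    num_tokens = 256,
    max_seq_len = 1024,
    attn_layers = Encoder(
        dim = 512,
        depth = 6,
        heads = 8,
        cross_attend = True   # interleave cross-attention blocks into the encoder
    )
)

seq = torch.randint(0, 256, (1, 1024))
context = torch.randn(1, 77, 512)   # external embeddings to attend to

logits = model(seq, context = context)  # (1, 1024, 256)
```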

0.3.1 (13 Dec 02:56)
default to top-k sampling
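
With this release, generation filters logits with top-k before sampling by default. A sketch using the autoregressive wrapper (the exact `generate` signature may differ slightly between versions):

```python
import torch
from x_transformers import TransformerWrapper, Decoder
from x_transformers.autoregressive_wrapper import AutoregressiveWrapper

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Decoder(dim = 512, depth = 6, heads = 8)
)

model = AutoregressiveWrapper(model)

start = torch.randint(0, 20000, (1, 1))

# top-k filtering is now the default logits filter; temperature still applies on top
sample = model.generate(start, 256, temperature = 1.0)  # (1, 256) sampled token ids
```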

0.3.0 (11 Dec 23:52)
bump version

0.2.5 (08 Dec 18:33)
fix talking heads
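
Talking-heads attention is the feature being fixed here; enabling it looks roughly like this, using the `attn_talking_heads` flag documented in the README:

```python
import torch
from x_transformers import TransformerWrapper, Decoder

model = TransformerWrapper(
    num_tokens = 20000,
    max_seq_len = 1024,
    attn_layers = Decoder(
        dim = 512,
        depth = 6,
        heads = 8,
        attn_talking_heads = True   # mix information across heads before and after the softmax
    )
)

x = torch.randint(0, 20000, (1, 1024))
logits = model(x)  # (1, 1024, 20000)
```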