Commit

comment
lucidrains authored Jul 27, 2022
1 parent e429d05 commit 665966e
Showing 1 changed file with 1 addition and 1 deletion.
flash_attention_jax/cosine_sim_flash_attention.py (1 addition, 1 deletion)
@@ -12,7 +12,7 @@

 Q_CHUNK_SIZE = 1024
 K_CHUNK_SIZE = 1024
-COSINE_SIM_SCALE = 16
+COSINE_SIM_SCALE = 16 # this may need to be a function of log(sequence length), but 16 was sufficient for 2048 and 4096 in my tests

 # flash attention
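For context, the constant being annotated here is the fixed softmax scale used in cosine-similarity attention: queries and keys are l2-normalized before the dot product, so the similarities are bounded in [-1, 1] and a fixed scale (here 16) replaces the usual 1/sqrt(dim) scaling. The sketch below is a minimal NumPy illustration of that idea, not the repository's chunked flash-attention JAX implementation; the function name and shapes are illustrative:

```python
import numpy as np

COSINE_SIM_SCALE = 16  # fixed scale from the diff above

def cosine_sim_attention(q, k, v, scale=COSINE_SIM_SCALE, eps=1e-8):
    # l2-normalize queries and keys along the feature dimension,
    # so q @ k.T is a cosine similarity bounded in [-1, 1]
    q = q / (np.linalg.norm(q, axis=-1, keepdims=True) + eps)
    k = k / (np.linalg.norm(k, axis=-1, keepdims=True) + eps)
    # a fixed scale replaces the usual 1/sqrt(dim) attention scaling
    sim = scale * (q @ k.T)
    # numerically stable softmax over the key dimension
    sim = sim - sim.max(axis=-1, keepdims=True)
    attn = np.exp(sim)
    attn = attn / attn.sum(axis=-1, keepdims=True)
    return attn @ v

q = np.random.randn(4, 64)   # 4 query positions, dim 64
k = np.random.randn(8, 64)   # 8 key positions
v = np.random.randn(8, 64)
out = cosine_sim_attention(q, k, v)
```

Because the pre-softmax logits are bounded by the scale, this variant avoids the unbounded attention logits that motivate the max-subtraction bookkeeping in standard flash attention.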
