Correct alignment in the seq_len diagram.
Liqian Chen committed Jun 17, 2024
1 parent e2b85cf, commit e9bd114
Showing 1 changed file with 1 addition and 1 deletion.
vllm/attention/backends/flash_attn.py (2 changes: 1 addition & 1 deletion)
@@ -83,7 +83,7 @@ class FlashAttentionMetadata(AttentionMetadata):
     # |---------------- N iteration ---------------------|
     # |- tokenA -|......................|-- newTokens ---|
     # |---------- context_len ----------|
-    # |-------------------- seq_len ----------------------|
+    # |-------------------- seq_len ---------------------|
     #                                   |-- query_len ---|

     # Maximum query length in the batch. None for decoding.
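For context, the diagram documents the per-sequence invariant that seq_len equals context_len plus query_len: context_len counts tokens already processed in earlier iterations ("tokenA" and the dotted region), while query_len counts the new tokens attended to in this step ("newTokens"). Below is a minimal sketch of that relationship; the check_lengths helper is hypothetical and written only for illustration, with just the field names taken from FlashAttentionMetadata.

# Hypothetical sketch of the length invariant shown in the diagram above;
# not part of vLLM. Only the names context_len / query_len / seq_len come
# from FlashAttentionMetadata.

def check_lengths(context_len: int, query_len: int, seq_len: int) -> None:
    # seq_len spans the whole sequence: the cached context followed by
    # the new tokens being processed in the current iteration.
    assert seq_len == context_len + query_len

# Example: 18 cached context tokens plus 16 new query tokens.
check_lengths(context_len=18, query_len=16, seq_len=34)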
