build(deps): bump flash-attn from 2.6.1 to 2.6.3
Bumps [flash-attn](https://github.com/Dao-AILab/flash-attention) from 2.6.1 to 2.6.3.
- [Release notes](https://github.com/Dao-AILab/flash-attention/releases)
- [Commits](https://github.com/Dao-AILab/flash-attention/compare/v2.6.1...v2.6.3)

---
updated-dependencies:
- dependency-name: flash-attn
  dependency-type: direct:production
  update-type: version-update:semver-patch
...

Signed-off-by: dependabot[bot] <support@github.com>
dependabot[bot] authored and dtrifiro committed Aug 7, 2024
1 parent 47869cc · commit b1af6ec
Showing 1 changed file with 1 addition and 1 deletion: pyproject.toml
@@ -64,7 +64,7 @@ dev = [
 flash_attn = [
     # it's easier to install flash-attn from wheel rather than like this as extra
     # "https://github.com/Dao-AILab/flash-attention/releases/download/v2.5.6/flash_attn-2.5.6+cu118torch2.0cxx11abiFALSE-cp311-cp311-linux_x86_64.whl",
-    "flash-attn==2.6.1",
+    "flash-attn==2.6.3",
     "packaging", # FIXME: temporary, until https://github.com/Dao-AILab/flash-attention/pull/937 is released
     "ninja"
 ]
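
For context, a minimal sketch of how this extras group is typically consumed; the exact flags are an assumption, not part of this commit. flash-attn's build imports torch and packaging at build time, so with --no-build-isolation they must already be installed, which is consistent with the FIXME above.

    # hypothetical install flow from the repo root; extra name follows the pyproject above
    pip install torch packaging ninja                  # build-time requirements for flash-attn
    pip install ".[flash_attn]" --no-build-isolation   # builds flash-attn against the installed torch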
