Fix a bug in flash attention where kv_seq_len should divide block_k_major. #10961

This workflow is awaiting approval from a maintainer in #8671
Triggered via pull request February 4, 2025 10:26
Status: Action required
build_and_test.yml

on: pull_request
get-torch-commit
Build PyTorch/XLA / build
Build docs / build-docs
TPU tests / tpu-test
Matrix: CPU tests / test
Waiting for pending jobs
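
For context, the constraint named in the PR title can be sketched as follows. This is a hypothetical illustration, not the actual PyTorch/XLA flash-attention code: blocked (flash) attention walks the KV sequence in chunks of `block_k_major`, so `block_k_major` must evenly divide `kv_seq_len`, or the final block would run past the end of the KV tensors. The helper name `check_kv_block_size` and its fallback strategy are assumptions for the sketch.

```python
def check_kv_block_size(kv_seq_len: int, block_k_major: int) -> int:
    """Return a KV block size that evenly divides kv_seq_len.

    Hypothetical helper (not the PR's implementation): if the
    requested block size does not divide the sequence length, fall
    back to the largest divisor of kv_seq_len that is <= block_k_major
    (1 always works as a last resort).
    """
    if kv_seq_len % block_k_major == 0:
        return block_k_major
    for candidate in range(min(block_k_major, kv_seq_len), 0, -1):
        if kv_seq_len % candidate == 0:
            return candidate
    return 1


print(check_kv_block_size(2048, 512))  # 512 already divides 2048
print(check_kv_block_size(1000, 512))  # falls back to 500
```

A fix like #10961 might instead raise an error or pad the KV sequence; the point of the sketch is only the divisibility requirement itself.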