
[sp] : fix the attention kernel for sp #6064

Merged (1 commit into hpcaitech:main on Sep 18, 2024)
Conversation

@wangbluo (Contributor) commented Sep 16, 2024

🚨 Issue number

Fixed #6061

📝 What does this PR do?

Add a lazy-load condition for the Dao flash attention kernel.
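For context on the fix: importing the Dao flash-attn extension eagerly at module import time fails when the package is absent or built for a different environment, so the kernel should only be imported at the point where it is actually used. Below is a minimal sketch of such a lazy-load condition; it assumes the real `flash_attn` package, while every other name (`_FLASH_ATTN_FUNC`, `_load_flash_attn`, `attention`) is hypothetical and not the actual code from this PR:

```python
# A minimal sketch of a lazy-load guard for the Dao flash-attn kernel.
# Names here are illustrative assumptions, not this PR's actual code.
_FLASH_ATTN_FUNC = None


def _load_flash_attn():
    """Import flash-attn on first use instead of at module import time."""
    global _FLASH_ATTN_FUNC
    if _FLASH_ATTN_FUNC is None:
        try:
            from flash_attn import flash_attn_func  # Dao flash attention
            _FLASH_ATTN_FUNC = flash_attn_func
        except ImportError:
            _FLASH_ATTN_FUNC = False  # mark as unavailable; don't retry


    return _FLASH_ATTN_FUNC


def attention(q, k, v, dropout_p=0.0):
    """Dispatch to the flash-attn kernel only when it is importable."""
    kernel = _load_flash_attn()
    if kernel:
        return kernel(q, k, v, dropout_p=dropout_p)
    raise RuntimeError(
        "flash-attn is not installed; use the default attention path instead"
    )
```

The key design point is that the `try`/`except ImportError` runs on first call rather than at import, so environments without flash-attn can still import the module and take the fallback path.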

@wangbluo wangbluo requested a review from a team as a code owner September 16, 2024 05:46
@wangbluo wangbluo changed the title from "fix" to "[sp] : fix the attention kernel for sp" Sep 16, 2024
@wangbluo wangbluo enabled auto-merge September 16, 2024 09:08
@wangbluo wangbluo disabled auto-merge September 16, 2024 09:09
@wangbluo wangbluo enabled auto-merge September 16, 2024 09:09
@wangbluo wangbluo merged commit 63314ce into hpcaitech:main Sep 18, 2024
4 checks passed
@wangbluo wangbluo deleted the fix_attn branch September 26, 2024 10:05