
Fix usage of unpad_input function #35925

Merged
ArthurZucker merged 1 commit into huggingface:main on Feb 6, 2025

Conversation

@pavelgein (Contributor)

What does this PR do?

In [this flash-attention commit](Dao-AILab/flash-attention@cdbbe84), the return type of `unpad_input` was changed. The code now supports both older and newer versions.

Fixes #35899
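
For context, here is a minimal sketch of the compatibility pattern this PR describes. The wrapper name `unpad_input_compat` is hypothetical; the actual change lives in transformers' internal flash-attention utilities and may unpack the result differently.

```python
import torch
from flash_attn.bert_padding import unpad_input  # requires flash-attn


def unpad_input_compat(hidden_states: torch.Tensor, attention_mask: torch.Tensor):
    """Call unpad_input and return its first four values on any flash-attn version.

    Older releases return a 4-tuple:
        (unpadded_states, indices, cu_seqlens, max_seqlen_in_batch)
    After Dao-AILab/flash-attention@cdbbe84, an extra value is appended.
    """
    outputs = unpad_input(hidden_states, attention_mask)
    # Slicing tolerates both return signatures; any extra trailing
    # element from newer versions is simply ignored.
    return outputs[:4]
```

Slicing the returned tuple keeps a single call site compatible with both return signatures instead of branching on the installed flash-attn version.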

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline,
    Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link
    to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the
    documentation guidelines, and
    here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

@ArthurZucker

@Rocketknight1 (Member) left a comment
LGTM, but I'll let @ArthurZucker do the final review!

See huggingface#35899

In the [commit](Dao-AILab/flash-attention@cdbbe84), the return type of `unpad_input` was changed.
The code now supports both older and newer versions.
@pavelgein (Contributor, Author)

@ArthurZucker Hi, could you please have a look?

@Rocketknight1 (Member)

cc @Cyrilvallez for final review if Arthur is busy!

@ArthurZucker (Collaborator) left a comment

Thanks for the fix!

@ArthurZucker ArthurZucker merged commit ed98ad3 into huggingface:main Feb 6, 2025
23 checks passed
MekkCyber pushed a commit that referenced this pull request on Feb 7, 2025
Fix usage of unpad_function

See #35899

In the [commit](Dao-AILab/flash-attention@cdbbe84), the return type of `unpad_input` was changed.
The code now supports both older and newer versions.

Co-authored-by: Pavel Gein <pavel.gein@gmail.com>
elvircrn pushed a commit to elvircrn/transformers that referenced this pull request on Feb 13, 2025
Fix usage of unpad_function
sbucaille pushed a commit to sbucaille/transformers that referenced this pull request on Feb 16, 2025
Fix usage of unpad_function
Labels: None yet
Projects: None yet

Development
Successfully merging this pull request may close these issues:
Transformers is incompatible with flash attention version 2.7.3 (#35899)

3 participants