Add Ascend NPU accelerator support #1676
Conversation
The documentation is not available anymore as the PR was closed or merged.
@sgugger Good day. Could you please review this PR?
This is looking great! @muellerzr can you also have a second look and make sure all slow tests pass as well (we don't have a way to test on NPUs but want to make sure this doesn't break existing stuff).
@statelesshz can you solve the merge conflict please? :) Otherwise I'm running through the slow tests now; if those all pass and the merge conflict is resolved, we're good! ✔️ Edit: can confirm that the tests pass, so let's go ahead and fix that merge conflict. Great job!
Force-pushed from 29b4ca4 to bed1578
@muellerzr Thanks for your reply. I have rebased my commits onto master HEAD to resolve the merge conflict.
Great work! Thanks!
What does this PR do?
According to the review of the previous PR (see), if I want to use Ascend NPUs to train 🤗 Transformers models, the support should be added to Accelerate first; it will then come to the Trainer for free.
This PR adds support for the Ascend NPU accelerator:
- run nlp_example.py with NPUs: `time accelerate launch nlp_example.py`
- compare with A100

Below are the output logs:
Ascend NPU is an AI processor that supports AI frameworks like PyTorch, TensorFlow, etc., so I think it is possible to run Transformers/Accelerate on NPUs to train foundation models. Their website: https://www.hiascend.com/en/
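To illustrate the kind of change such support involves, here is a minimal, hypothetical sketch of backend detection with graceful fallback. The names `torch_npu` and `torch.npu.is_available()` are assumptions based on the Ascend PyTorch adapter; this is not the actual code from the PR.

```python
# Hypothetical sketch of accelerator backend detection, in the spirit of
# this PR. "torch_npu" (the Ascend plugin) and the patched
# "torch.npu.is_available()" call are assumptions, not code from the PR.
def detect_accelerator() -> str:
    """Return the best available backend name: 'npu', 'cuda', or 'cpu'."""
    try:
        import torch
    except ImportError:
        return "cpu"  # no PyTorch at all, nothing to accelerate
    try:
        import torch_npu  # noqa: F401  (importing registers the NPU backend)
        if torch.npu.is_available():
            return "npu"
    except (ImportError, AttributeError):
        pass  # Ascend plugin not installed; fall through to CUDA/CPU
    if torch.cuda.is_available():
        return "cuda"
    return "cpu"

print(detect_accelerator())
```

Training code that selects its device through a check like this (as Accelerate does internally) stays device-agnostic, which is why, per the review above, the Trainer gets NPU support "for free" once Accelerate has it.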