Add RoCBert support for BetterTransformer #542
Conversation
The documentation is not available anymore as the PR was closed or merged.
I got the following error, but I don't understand it: https://github.com/huggingface/optimum/actions/runs/3608885843/jobs/6081897113
Thanks so much for adding BetterTransformer support for this new architecture! Since RoCBert copies exactly the same structure as Bert, we can leverage that and directly use BertLayerBetterTransformer as the converted layer :D
Regarding the failing test, let me dig a little bit and get back to you
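For readers following along, here is a minimal sketch of what the suggested change amounts to. The dictionary and module names are assumptions about optimum's internals at the time and may not match the merged code exactly:

```python
# Sketch of registering RoCBert for BetterTransformer conversion.
# Assumption: optimum keeps a layer-name -> converted-layer mapping in
# optimum/bettertransformer/models/__init__.py; the dict name here is illustrative.
from optimum.bettertransformer.models.encoder_models import BertLayerBetterTransformer

LAYERS_MAPPING = {
    "BertLayer": BertLayerBetterTransformer,
    # ... other supported architectures ...
    # RoCBert's encoder layer is structurally identical to Bert's,
    # so it can reuse the Bert converted layer directly:
    "RoCBertLayer": BertLayerBetterTransformer,
}
```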
Hi @younesbelkada
Hi @shogohida
There is another error that I don't understand...
@shogohida Thanks for the work! This is a flaky test, fixed in #564. I reran the workflow and it should be fine.
Yet another test is failing:
This failing test has to do with autocast and RoCBert. I think the fix should go on the transformers side, or we can just skip this test for RoCBert.
Thanks for your comments, guys! So what should I do? Can we skip the failing test as Younes said?
Hi @shogohida
@younesbelkada I'll let you know if I get stuck!
This is correct, please proceed as suggested ;-)
Force-pushed from 74af5d4 to 92f024f
@younesbelkada
Hi @shogohida, thanks for working on it! Maybe

    class BetterTransformersRoCBertTest(BetterTransformersEncoderTest):
        all_models_to_test = ["path-to-tiny-rocbert-model-here"]

        # unrelated issue with torch.amp.autocast with rocbert (expected scalar type BFloat16 but found Float)
        def test_raise_autocast(self):
            pass

would work
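As a side note, an equivalent and slightly more self-documenting way to skip the inherited test would be unittest's skip decorator; a sketch, assuming the suite is unittest-based:

```python
import unittest

class BetterTransformersRoCBertTest(BetterTransformersEncoderTest):
    all_models_to_test = ["path-to-tiny-rocbert-model-here"]

    # Skipped: torch.amp.autocast issue with RoCBert
    # (expected scalar type BFloat16 but found Float)
    @unittest.skip("autocast incompatibility with RoCBert")
    def test_raise_autocast(self):
        pass
```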
Still facing the same error: https://github.com/huggingface/optimum/actions/runs/3882552677/jobs/6630181175
Thank you for your contribution!
Thanks for your review! It took time, but it was merged in the end. This was my first issue, so I hope to contribute more!
What does this PR do?
Adds RoCBert support for BetterTransformer
Fixes huggingface/transformers#20372
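Once merged, conversion follows the standard BetterTransformer flow. A minimal usage sketch (the checkpoint name is an assumption; any RoCBert checkpoint on the Hub should work):

```python
from transformers import AutoModel
from optimum.bettertransformer import BetterTransformer

# Assumption: a public RoCBert checkpoint on the Hugging Face Hub
model = AutoModel.from_pretrained("weiweishi/roc-bert-base-zh")

# Swap the model's encoder layers for BetterTransformer fastpath layers
model = BetterTransformer.transform(model)
```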
Before submitting