
[BetterTransformer] Add MobileBERT support for BetterTransformer #506

Open
wants to merge 10 commits into base: main

Conversation

raghavanone

What does this PR do?

Adds mobilebertlayer

Contributor

@younesbelkada left a comment

Thanks a lot for your great work @raghavanone and for starting to add BetterTransformer support for MobileBERT!
I left a few comments; several points need to be addressed before merging, starting with the naming of the new class. We prefer to name new classes xxxLayerBetterTransformer so that we can correctly trace whether the conversion was successful.
Also, could you make sure this implementation is tested by adding hf-internal-testing/tiny-random-MobileBertModel here?
Let us know if you run into any issues here!

Files reviewed:
- optimum/bettertransformer/models/__init__.py
- optimum/bettertransformer/models/encoder_models.py
@younesbelkada younesbelkada changed the title Add mobilebertlayer [BetterTransformer] Add MobileBERT support for BetterTransformer Nov 23, 2022
@younesbelkada
Contributor

Hey @raghavanone!
Do you need any help converting the model? It seems you forgot to add hf-internal-testing/tiny-random-MobileBertModel to the testing suite.

@raghavanone
Author

> hey @raghavanone ! do you need any help converting the model? It seems that you did forget to add hf-internal-testing/tiny-random-MobileBertModel in the testing suite?

Done


self.validate_bettertransformer()

def forward(self, hidden_states, attention_mask, *_):
Member

What is the purpose of *_ at the end of the arguments?

Author

I was not sure; I saw *_ in every other forward definition.

Member

Alright, maybe we should also add **__ or something to handle potential keyword arguments?
@younesbelkada

Author

I saw some forward definitions having both *_ and **_. We should probably add more information about this in the published guide.

Member

Yes, basically it is there to "handle" potential arguments that were passed to the original forward function: this forward function accepts them, but ignores them right away.
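A minimal sketch of the pattern being discussed (this is an illustration, not the actual optimum implementation): `*_` swallows any extra positional arguments and `**__` any extra keyword arguments, so a simplified replacement forward stays call-compatible with the original layer.

```python
class LayerBetterTransformerSketch:
    """Hypothetical replacement layer used only to illustrate the signature."""

    def forward(self, hidden_states, attention_mask, *_, **__):
        # `*_` collects extra positional args, `**__` collects extra keyword
        # args; both are deliberately ignored so that call sites written for
        # the original layer (passing e.g. head_mask, output_attentions)
        # still work without modification.
        return hidden_states


layer = LayerBetterTransformerSketch()
# Extra arguments are accepted and silently dropped:
out = layer.forward("hs", None, "head_mask", output_attentions=False)
```

The underscore names signal "intentionally unused" to readers and linters, which is why they are preferred over named-but-ignored parameters here.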

Contributor

Great remark, the guide will be updated ;)

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint.

@younesbelkada
Contributor

younesbelkada commented Nov 24, 2022

The test is failing for a few reasons. Firstly, MobileBERT uses special mechanisms called bottleneck_attention and bottleneck; if these are enabled, the conversion criteria are not met. Therefore I made a special model, ybelkada/tiny-mobilebertmodel, that you can use instead in the testing suite, and hopefully the test will pass!

Also, it would be cool to add a sanity check in the __init__ of the MobileBertBetterTransformerLayer to check that these values are correctly set in the config file.
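A hypothetical sketch of the suggested sanity check: fail fast in `__init__` if the bottleneck mechanisms are enabled in the config. The class name and the attribute names (`use_bottleneck`, `use_bottleneck_attention`) are assumptions for illustration, not confirmed against the optimum or transformers source.

```python
class MobileBertBetterTransformerLayerSketch:
    """Illustrative only; attribute names on the config are assumed."""

    def __init__(self, config):
        # Conversion is only valid when MobileBERT's bottleneck features
        # are disabled, so reject incompatible configs up front with a
        # clear error instead of failing later during conversion.
        if getattr(config, "use_bottleneck", False) or getattr(
            config, "use_bottleneck_attention", False
        ):
            raise ValueError(
                "BetterTransformer conversion does not support MobileBERT's "
                "bottleneck mechanisms; please disable them in the config."
            )
```

Raising in `__init__` surfaces the incompatibility at conversion time, which is when the user can still act on it.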

@raghavanone
Author

@younesbelkada let me know what needs to be done to merge this.

@michaelbenayoun
Member

There are conflicts; you need to fetch the main branch and then merge or rebase this branch onto it.
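One way to carry that out, assuming `origin` is the upstream remote and the PR branch is currently checked out locally:

```shell
git fetch origin main           # get the latest upstream main
git rebase origin/main          # or: git merge origin/main
# resolve any conflicts, continue the rebase, then update the PR:
git push --force-with-lease     # force push is needed after a rebase
```

`--force-with-lease` is safer than a plain `--force` because it refuses to overwrite work pushed by someone else in the meantime.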

5 participants