BertForSequenceClassification does not support 'device_map':"auto" yet #25296
Hi @goodaytar, thanks for raising this issue! Yes, the BERT models don't support the use of `device_map="auto"` yet. In the full error message, you should have seen:

`BertForSequenceClassification does not support device_map="auto". To implement support, the model class needs to implement the _no_split_modules attribute.`

In order to enable this, the `_no_split_modules` attribute needs to be implemented for the model. If you or anyone else in the community would like to open a PR to add this, we'd be very happy to review!
Thanks for the reply, Amy. If you could give me a little bit more info on what needs adding, I'd be happy to.
In order to know how to properly place the model onto different devices, the models need to have `_no_split_modules` defined. For some modules, it's necessary to place all of the weights on the same device, e.g. modules containing skip connections. In order to add this, it'll be a case of iterating to find the modules that should be split or not. Once implemented, the accelerate tests should be run and pass. This should be tested with 1 and 2 GPUs.
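The placement logic described here can be sketched with a toy example. This is not the accelerate implementation, just a self-contained illustration of how a greedy device map might keep modules listed in `_no_split_modules` whole while letting everything else be split across devices (all module names and sizes below are made up):

```python
def toy_device_map(model, no_split, capacity):
    """Toy greedy placement, loosely in the spirit of infer_auto_device_map.

    model: list of (class_name, [(submodule_name, size), ...]) pairs.
    no_split: set of class names whose weights must stay on one device.
    capacity: size budget per device.
    Returns a dict mapping placed module names to device indices.
    """
    device, used = 0, 0
    placement = {}

    def place(name, size):
        nonlocal device, used
        if used + size > capacity:      # spill over to the next device
            device, used = device + 1, 0
        placement[name] = device
        used += size

    for i, (cls, subs) in enumerate(model):
        if cls in no_split:
            # keep the whole module (all of its weights) on one device
            place(f"{cls}.{i}", sum(size for _, size in subs))
        else:
            # splittable: each submodule may land on a different device
            for sub, size in subs:
                place(f"{cls}.{i}.{sub}", size)
    return placement
```

With a small capacity and an empty `no_split` set, a single layer's weights can end up torn across two devices; listing the layer class in `no_split` forces each layer onto one device, which is exactly what `_no_split_modules` is for.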
And how do I find the modules that should be split or not?
@goodaytar You'll need to experiment with the model to find out which modules should be split. I suggest starting with an empty list and looking at similar models to see how they set `_no_split_modules`. You can inspect where the layers are allocated by using `device_map = infer_auto_device_map(model, no_split_module_classes=[])`. The modules that can be added will be the layers defined in the modeling file. Once set, you can try running the accelerate tests (with GPUs!) to confirm the mapping works. If not, then inspect the device map.
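As a concrete (hypothetical) sketch of the change under discussion: `_no_split_modules` is just a class attribute listing module class names, so the edit itself is tiny; the real work is the experimenting described above to choose the right entries. The `BertLayer` entry here is an assumption for illustration, not a confirmed final list:

```python
# Hypothetical sketch only: the entry below is an assumption, not the
# reviewed list that would eventually land in transformers.
class BertPreTrainedModelSketch:
    # Module class names that accelerate must never split across devices.
    _no_split_modules = ["BertLayer"]
```

Consumers such as accelerate read this attribute from the model class and pass it along as `no_split_module_classes` when computing the device map.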
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Hi @amyeroberts, I would like to add the `_no_split_modules` attribute for this model and work on this issue!
@tanaymeh Great! From next week, I'll be off for a few weeks. Please ping @younesbelkada for review during that time.
@tanaymeh that would be really great! In a few words, you just need to make sure to add the module names that contain any skip connection, to avoid potential device mismatch issues.
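The device-mismatch failure mode being described can be illustrated with a toy model in which devices are simulated as plain strings (no GPUs or torch involved; the class names and behavior here are made up for the sketch). If a block containing a skip connection is split across devices, the residual input and the sublayer output end up on different devices and the addition fails:

```python
class Linear:
    """Toy layer: accelerate-style hooks move the input to the weight's
    device, so the output always lives on self.device."""
    def __init__(self, device):
        self.device = device

    def forward(self, x_device):
        return self.device

class ResidualBlock:
    """Toy transformer-style block with a skip connection."""
    def __init__(self, dev_a, dev_b):
        self.fc1, self.fc2 = Linear(dev_a), Linear(dev_b)

    def forward(self, x_device):
        h = self.fc2.forward(self.fc1.forward(x_device))
        # Skip connection: out = x + h, so both operands must share a device.
        if h != x_device:
            raise RuntimeError(
                f"skip connection mismatch: x on {x_device}, h on {h}")
        return x_device
```

Keeping the whole block on one device (by listing its class in `_no_split_modules`) guarantees the residual addition always sees matching devices.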
That makes sense @younesbelkada! Will create a PR for this.
Hi @tanaymeh,
@younesbelkada |
@younesbelkada Any updates? We can't wait to use this great feature. |
@Hambaobao I am working on the PR for this feature but waiting for a reply from @younesbelkada!
Any update on this issue? Or has anyone fixed it?
Any update on this issue?
ValueError: SiglipVisionModel does not support `device_map="auto"` yet. Same for Siglip?
@lucasjinreal There are many models which don't yet have this enabled. I've opened a feature request to add this for vision and multimodal models which could have this added: #29786 |
System Info
I have trained a model and am now trying to load and quantise it but getting the error:
BertForSequenceClassification does not support 'device_map':"auto" yet
Code for loading is simply:
model = AutoModelForSequenceClassification.from_pretrained(model_dir, device_map='auto', load_in_8bit=True)
Help would be greatly appreciated!
Thanks,
Lee
Who can help?
No response
Information
Tasks
examples folder (such as GLUE/SQuAD, ...)
Reproduction
model = AutoModelForSequenceClassification.from_pretrained(model_dir, device_map='auto', load_in_8bit=True)
Expected behavior
The model would load and be usable.