tokenizer save_pretrained can not handle non-string value in dtype #33304
Comments
Hello! Is the …
Hello, I took a deeper look; it seems there is no argument named … . I found it is possible to avoid such a TypeError by modifying the original … . Maybe you have smarter ways to fix it.
This issue has been automatically marked as stale because it has not had recent activity. If you think this still needs to be addressed please comment on this thread. Please note that issues that do not follow the contributing guidelines are likely to be ignored.
Is this related to the DPO trainer in trl? In that case the PR should go there! Closing, as this is a wrong usage of the …
System Info
python3.10
transformers 4.36.2
torch 2.1.2
torchaudio 2.1.2
torchvision 0.16.2
Who can help?
No response
Information
Tasks
An officially supported task in the examples folder (such as GLUE/SQuAD, ...)
Reproduction
Expected behavior
Explanation
My conjecture is that, when I load the tokenizer with bfloat16, tokenizer.dtype is assigned torch.bfloat16. When saving the tokenizer, the dtype is not handled, so serializing the tokenizer config fails.
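The failure mode described above can be sketched in plain Python: the tokenizer config that save_pretrained writes out is serialized as JSON, and json.dumps raises a TypeError on any value that is not a basic JSON type, such as a torch.dtype object. The snippet below is a minimal illustration, not the library's actual code; it uses a FakeDtype stand-in for torch.bfloat16 (an assumption, to avoid requiring torch), and the stringifying workaround at the end is one possible fix, not the official one.

```python
import json

class FakeDtype:
    """Stand-in for a torch.dtype such as torch.bfloat16.
    Like the real thing, it is not JSON serializable."""
    def __str__(self):
        return "torch.bfloat16"

# Hypothetical tokenizer config dict containing a non-string dtype value,
# mirroring what the issue describes ending up in the saved config.
config = {"tokenizer_class": "SomeTokenizer", "dtype": FakeDtype()}

# This mirrors the serialization step inside save_pretrained and fails:
try:
    json.dumps(config)
    error = None
except TypeError:
    error = "TypeError"  # non-string dtype breaks JSON serialization

# One possible workaround (an assumption, not the library's fix):
# convert non-serializable values to strings before dumping.
safe_config = {
    key: (str(value) if isinstance(value, FakeDtype) else value)
    for key, value in config.items()
}
serialized = json.dumps(safe_config)
```

An equivalent one-liner fix is passing default=str to json.dumps, which stringifies any value the encoder cannot handle; either way the saved config stores "torch.bfloat16" as a plain string.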