"Token pattern not found in the list" error #24
Comments
Have you updated transformers to the latest version?
I was using transformers 4.41.2, but the error persists with the newest version. To add some context, I was using the default tokenizer that comes with the model,
and I was trying to fine-tune the gating layer of the model with LoRA.
Configs:
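For reference, a minimal sketch of what such a LoRA setup might look like with peft; the model path and the "gating" target-module name are placeholders rather than the configs actually used in this thread:

```python
import torch
from transformers import AutoModelForSequenceClassification
from peft import LoraConfig, get_peft_model

# Placeholder model path; not a checkpoint confirmed by this thread.
model = AutoModelForSequenceClassification.from_pretrained(
    "path/to/reward-model",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16,
)

lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    # Assumed name of the gating layer; it must resolve to supported submodules
    # (e.g. nn.Linear), so adjust to the actual module names in the checkpoint.
    target_modules=["gating"],
    task_type="SEQ_CLS",
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```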
Same error. I believe the immediate cause is the token-pattern search function defined in the model code, but the deeper reason lies on the dataset side: I find the code always fails on a sample whose "response" is null. For example, the sample with prompt_id … has a null response. I believe the solution is to remove these bad samples.
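If the root cause really is malformed samples, one way to drop them is a simple pre-filter over the dataset; the dataset path and the "response" column name below are assumptions based on this comment, not a confirmed schema:

```python
from datasets import load_dataset

# Placeholder dataset path; adjust the split and column name to your data.
ds = load_dataset("path/to/preference-dataset", split="train")

def has_valid_response(example):
    response = example.get("response")
    if response is None:
        return False
    # Treat empty strings/lists as invalid as well.
    if isinstance(response, (str, list)) and len(response) == 0:
        return False
    return True

clean_ds = ds.filter(has_valid_response)
print(f"kept {len(clean_ds)} of {len(ds)} samples")
```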
Same error.
Hi there,
I got this "Token pattern not found in the list" error when I tried out the model under a no_grad() context. Would you take a look at this, please? Many thanks! See below for the code and error message:
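The original snippet and traceback did not survive in this thread. As a rough stand-in, here is a minimal sketch of how a reward model is typically scored under torch.no_grad(); the model path is a placeholder and the chat-template usage is an assumption, so this is not the poster's actual code:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_path = "path/to/reward-model"  # placeholder, not the exact checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForSequenceClassification.from_pretrained(
    model_path, trust_remote_code=True, torch_dtype=torch.bfloat16
)
model.eval()

messages = [
    {"role": "user", "content": "What is the capital of France?"},
    {"role": "assistant", "content": "The capital of France is Paris."},
]

# The error message suggests the model searches the tokenized input for a
# specific token pattern, so the chat template (which inserts the special
# role tokens) is applied when building the input.
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt")

with torch.no_grad():
    output = model(input_ids)

# Attribute names may differ for custom remote-code models.
print(output)
```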