Hi! I just want to say thank you for creating this SOTA model, good job!
I have a couple of questions:
Actually, it supports a wide range of languages, but not all of them are stable; you may test it yourself.
See part of our annealing-phase top language distribution here. There are more, but only the top ones are shown below.
For fine-tuning, we are working on usable Hugging Face fine-tuning code. It will probably include an example of enabling BPM control.
Learning new languages will require a large amount of data and compute, since you may need to continually pretrain the stage-1 7B LM.