Fix stablelm #3038
Conversation
@esmeetu Should we replace the old version with this? Do we have a reason to keep the old one?
AFAIK the old version is needed for an older huggingface version?
@simon-mo We already updated our
@WoosukKwon There are only small configuration changes to the model, and I think we can keep backwards compatibility for a while.
@esmeetu Could you explain more about the backward compatibility? Do you mean the situation where the user uses an older version of
No, this provides a friendlier user experience for those (including me) who still use a local, offline copy of the old repo from before the update. Otherwise, upgrading vLLM will throw an error and leave them confused, since they don't know about the repo update and will assume the problem is caused by the vLLM upgrade. 🤯
@esmeetu Got it. Thanks for the explanation!
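A minimal sketch of how such backward compatibility could look: reading each config value under its new transformers name and falling back to the legacy attribute. The attribute names used here (`partial_rotary_factor`/`rope_pct`, `layer_norm_eps`/`norm_eps`) are assumptions based on the two config formats, not necessarily the exact names this PR touches.

```python
def _get_attr(config, new_name, old_name, default=None):
    # Prefer the new transformers StableLmConfig attribute; fall back to the
    # legacy (trust_remote_code) config attribute if it is absent.
    if hasattr(config, new_name):
        return getattr(config, new_name)
    return getattr(config, old_name, default)


def resolve_stablelm_config(config):
    # Hypothetical helper: normalize old and new StableLM config fields
    # into one dict so the model code only sees a single set of names.
    return {
        "partial_rotary_factor": _get_attr(
            config, "partial_rotary_factor", "rope_pct", 0.25),
        "layer_norm_eps": _get_attr(
            config, "layer_norm_eps", "norm_eps", 1e-5),
        "num_key_value_heads": getattr(
            config, "num_key_value_heads", config.num_attention_heads),
    }
```

With a shim like this, checkpoints downloaded before the Hugging Face repo update keep loading without the user needing to know the config format changed.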
StabilityAI has uploaded a transformers implementation: https://huggingface.co/stabilityai/stablelm-3b-4e1t/discussions/10
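A quick way to sanity-check the updated model with vLLM might look like the sketch below; the exact script and prompt that produced the output shown after it are not included in the thread, so this is only an illustrative assumption.

```python
from vllm import LLM, SamplingParams

# Load the StableLM checkpoint with the updated transformers implementation.
# Older local copies of the repo may still require trust_remote_code=True.
llm = LLM(model="stabilityai/stablelm-3b-4e1t")

sampling_params = SamplingParams(temperature=0.8, max_tokens=64)
outputs = llm.generate(["The capital of France is"], sampling_params)
for out in outputs:
    print(out.outputs[0].text)
```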
Output: