feat: support Ernie-lite-pro-128k #12161
Conversation
So how can I get this update? I don't see this change at https://github.com/langgenius/dify/tree/main/api/core/model_runtime/model_providers/wenxin/llm
Just waiting for it to be merged~
Thanks, I found your branch.
max: 2048
I think max_output_tokens should be 8192, not 2048.
Thanks for your information. I've raised the max output tokens from 2048 to 4096.
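For readers looking for where this lands in the repo: each wenxin model under api/core/model_runtime/model_providers/wenxin/llm is declared in its own YAML file, so this PR adds an ernie-lite-pro-128k entry there. The sketch below is illustrative only, assuming the same layout as the sibling Ernie configs; the label, context size, and parameter defaults are assumptions, with the max output tokens set to the 4096 value settled on above.

```yaml
# Hypothetical sketch of ernie-lite-pro-128k.yaml, modelled on the other
# wenxin model configs; values are assumptions, not the merged file.
model: ernie-lite-pro-128k
label:
  en_US: Ernie-Lite-Pro-128K
model_type: llm
model_properties:
  mode: chat
  context_size: 131072        # assumed: 128k context window
parameter_rules:
  - name: temperature
    use_template: temperature
  - name: top_p
    use_template: top_p
  - name: max_tokens
    use_template: max_tokens
    default: 1024
    min: 2
    max: 4096                 # raised from 2048 per the review above
```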
Summary
support Ernie-lite-pro-128k
Resolves #12130
Screenshots
Checklist
Important
Please review the checklist below before submitting your pull request.
I ran dev/reformat (backend) and cd web && npx lint-staged (frontend) to appease the lint gods