The issue you're experiencing with the model list not being synchronized in the Docker version of Dify could be related to several factors. Here are some potential areas to investigate:
Model Configuration: Ensure that the models you expect to see are properly defined in the codebase. Each provider ships its own model definitions; for Azure OpenAI, for example, the model list is populated in api/core/model_runtime/model_providers/azure_openai/_constant.py, while Gemini models are defined under the google provider directory. Check whether "Gemini 2.0 Pro" and the new Flash version are included in the relevant definitions, such as LLM_BASE_MODELS [1].
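As a quick sanity check, a small script like the following can search a provider's definition file for the model identifiers. This is only a sketch: the file path comes from the point above, and the model ID strings (`gemini-2.0-pro`, `gemini-2.0-flash`) are assumptions that you should replace with the identifiers your Dify version actually uses.

```python
from pathlib import Path

def check_models(constants_path, models):
    """Return {model_id: bool} indicating whether each ID appears in the file.

    A plain substring search is enough here, since model IDs are
    unambiguous string literals in the provider definition files.
    """
    path = Path(constants_path)
    source = path.read_text(encoding="utf-8") if path.exists() else ""
    return {m: (m in source) for m in models}

# Path taken from the answer above; adjust if your checkout differs.
result = check_models(
    "api/core/model_runtime/model_providers/azure_openai/_constant.py",
    ("gemini-2.0-pro", "gemini-2.0-flash"),  # assumed IDs, verify against your version
)
for model, listed in result.items():
    print(f"{model}: {'listed' if listed else 'missing'}")
```

If a model comes back as `missing`, it was never bundled with the image you are running, and no amount of environment tweaking will make it appear; upgrading Dify (or adding a custom model definition) is the fix in that case.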
Environment Variables: Verify the environment variables related to model providers in your Docker setup. These variables can control which models are displayed in the UI. Specifically, check the POSITION_PROVIDER_PINS, POSITION_PROVIDER_INCLUDES, and POSITION_PROVIDER_EXCLUDES settings in your .env file to ensure they are not excluding the models you want to see [2].
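For reference, a minimal .env fragment might look like the sketch below. The variable names are the ones mentioned above; the values are purely illustrative, and the provider key `google` is an assumption about how the Gemini provider is named in your version.

```
# Provider position settings (illustrative values only)

# Pin providers so they sort to the top of the model-provider list:
POSITION_PROVIDER_PINS=google

# Leave INCLUDES empty to show all providers; a non-empty list hides everything else:
POSITION_PROVIDER_INCLUDES=

# Make sure the provider you need is NOT listed here, or its models will be hidden:
POSITION_PROVIDER_EXCLUDES=
```

After editing .env, recreate the containers (e.g. `docker compose up -d`) so the new values take effect; a plain restart may reuse the old environment.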
ML Frameworks: Make sure that the necessary ML frameworks (e.g., PyTorch, TensorFlow, Flax) are installed and properly configured in your Docker environment. Missing frameworks can lead to models not being available [3].
Docker Configuration: If you are using local models or integrating with external services, ensure that your Docker configuration allows access to these services. This might involve setting the correct network configurations or environment variables.
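As one hedged illustration of the networking point, a compose override like the following lets containers reach a model server running on the Docker host. The service name `api` assumes the stock Dify compose file, and `host.docker.internal` requires Docker 20.10+ (on Linux it needs the `host-gateway` mapping shown here); adjust both to your setup.

```yaml
# docker-compose.override.yml — illustrative sketch, not a drop-in config
services:
  api:
    extra_hosts:
      # Map host.docker.internal to the Docker host so the api container
      # can reach a locally hosted model endpoint:
      - "host.docker.internal:host-gateway"
```

With this in place, a local inference server listening on the host (for example on port 8000) can be configured in Dify's provider settings via a base URL such as http://host.docker.internal:8000, instead of an unreachable localhost address.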
By checking these areas, you should be able to identify and resolve the issue with the model list synchronization in your Docker setup.
Self Checks
Dify version
0.15.2
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
Gemini 2.0 Pro and the new Flash version are not listed in the model list.
✔️ Expected Behavior
No response
❌ Actual Behavior
No response