fix: ollama and lm studio url issue fix for docker and build #1008
Docker Host URL Resolution Fix for LMStudio and Ollama Providers
Overview
This PR addresses connectivity issues between the application running in Docker containers and local LLM providers (LMStudio and Ollama). It implements proper host URL resolution by replacing `localhost`/`127.0.0.1` with `host.docker.internal` when running in a Docker environment.
Key Changes
1. Docker Environment Detection
Added the `RUNNING_IN_DOCKER=true` environment variable to the Dockerfile configurations (a detection sketch follows this list).
2. Provider Base URL Resolution
When running in Docker, LMStudio and Ollama base URLs pointing at `localhost` or `127.0.0.1` are resolved to `host.docker.internal`.
3. Provider Settings Access Fix
Provider settings are now accessed safely via `providerSettings?.[this.name]`, with defaults applied when no settings are present.
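A minimal sketch of how such a flag can be read at runtime; the helper name `isRunningInDocker` is illustrative and not necessarily what this PR uses:

```ts
// Illustrative helper (name assumed, not confirmed by this PR):
// the Dockerfiles set RUNNING_IN_DOCKER=true, and the app checks it at runtime.
export function isRunningInDocker(): boolean {
  return process.env.RUNNING_IN_DOCKER === 'true';
}
```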
Technical Details
Provider Settings Fix
Fixed provider configuration initialization with proper settings access and defaults. The provider entry is now read with `providerSettings?.[this.name]`, so initialization no longer breaks when the settings object or this provider's entry is missing, and defaults are used instead.
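As a rough sketch of the pattern (the interface fields, class shape, and default values below are assumptions for illustration, not the project's exact code):

```ts
interface ProviderSettings {
  baseUrl?: string;
  enabled?: boolean;
}

class OllamaProviderExample {
  name = 'Ollama';

  getSettings(providerSettings?: Record<string, ProviderSettings>): ProviderSettings {
    // Optional chaining keeps this safe when providerSettings is undefined
    // or has no entry for this provider; defaults cover the missing fields.
    const settings = providerSettings?.[this.name] ?? {};

    return {
      baseUrl: settings.baseUrl ?? 'http://127.0.0.1:11434', // Ollama's default port
      enabled: settings.enabled ?? true,
    };
  }
}
```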
URL Resolution Implementation
Key changes in the provider classes:
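Below is a minimal sketch of that rewrite, assuming the same `RUNNING_IN_DOCKER` flag; the function name and the exact call sites in the LMStudio and Ollama providers are illustrative:

```ts
// Rewrite localhost-style hosts to host.docker.internal so a containerized
// app can reach an LLM server running on the host machine.
function resolveDockerHostUrl(baseUrl: string): string {
  if (process.env.RUNNING_IN_DOCKER !== 'true') {
    return baseUrl; // outside Docker, keep the configured URL as-is
  }

  return baseUrl
    .replace('localhost', 'host.docker.internal')
    .replace('127.0.0.1', 'host.docker.internal');
}

// e.g. inside Docker: 'http://127.0.0.1:11434' -> 'http://host.docker.internal:11434'
```

Note that Docker Desktop provides `host.docker.internal` out of the box, while Linux Docker engines typically need an explicit `--add-host=host.docker.internal:host-gateway` mapping for it to resolve.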
System Changes
Added `RUNNING_IN_DOCKER` in both development and production Dockerfile configurations.
Testing
Migration Impact
`RUNNING_IN_DOCKER` needs to be set in Docker environments.