
fix: ollama and lm studio url issue fix for docker and build #1008

Merged

2 commits merged into stackblitz-labs:main on Jan 6, 2025

Conversation

thecodacus (Collaborator)

Docker Host URL Resolution Fix for LMStudio and Ollama Providers

Overview

This PR addresses connectivity issues between the application running in Docker containers and local LLM providers (LMStudio and Ollama). It implements proper host URL resolution by replacing localhost/127.0.0.1 with host.docker.internal when running in a Docker environment.

Key Changes

1. Docker Environment Detection

  • Added RUNNING_IN_DOCKER=true environment variable to Dockerfile configurations
  • Implemented detection logic in provider classes to handle URL transformation
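
For reference, the detection flag is a one-line Dockerfile addition (the exact placement within the existing Dockerfile configurations is illustrative):

ENV RUNNING_IN_DOCKER=true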

2. Provider Base URL Resolution

  • Enhanced URL handling in LMStudio and Ollama providers
  • Added proper error handling for missing base URLs
  • Implemented consistent URL transformation for Docker environments

3. Provider Settings Access Fix

  • Fixed provider settings access so that provider-specific settings are read via providerSettings?.[this.name]
  • Ensures each provider instance loads its own configuration

Technical Details

Provider Settings Fix

Fixed provider configuration initialization with proper settings access and defaults:

// Resolve this provider's base URL from settings, server env, or the default key
let { baseUrl } = this.getProviderBaseUrlAndKey({
  apiKeys,
  providerSettings: providerSettings?.[this.name],
  serverEnv: serverEnv as any,
  defaultBaseUrlKey: 'OLLAMA_API_BASE_URL',
  defaultApiTokenKey: '',
});

This change correctly accesses provider-specific settings using providerSettings?.[this.name], so each provider instance loads its own configuration.
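
To illustrate why the keyed lookup matters, providerSettings is a map keyed by provider name, so each provider must index into its own entry. The values below are illustrative only (the ports shown are the common local defaults for each tool, not taken from this PR):

// Illustrative shape of the settings map (values are assumptions)
const providerSettings = {
  Ollama: { baseUrl: 'http://127.0.0.1:11434' },
  LMStudio: { baseUrl: 'http://127.0.0.1:1234' },
};

// With this.name === 'Ollama', providerSettings?.[this.name] yields only that entry:
const ollamaSettings = providerSettings?.['Ollama']; // { baseUrl: 'http://127.0.0.1:11434' }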

URL Resolution Implementation

Key changes in the provider classes:

// LMStudio and Ollama provider URL resolution
if (typeof window === 'undefined') {
  /*
   * Running on the server: check whether we are inside a Docker container,
   * and if so rewrite loopback hosts to the Docker host gateway.
   */
  const isDocker = process.env.RUNNING_IN_DOCKER === 'true';

  if (isDocker) {
    baseUrl = baseUrl.replace('localhost', 'host.docker.internal');
    baseUrl = baseUrl.replace('127.0.0.1', 'host.docker.internal');
  }
}
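
The enhanced error handling for missing base URLs is not shown in the excerpt above; a minimal sketch of what such a guard could look like (the error message and placement are assumptions):

// Sketch: fail fast if no base URL could be resolved for this provider
if (!baseUrl) {
  throw new Error(`Missing base URL for ${this.name} provider`);
}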

System Changes

  • Added environment variable RUNNING_IN_DOCKER in both development and production Dockerfile configurations
  • Enhanced error handling for missing base URLs in provider configurations
  • Added debug logging for base URL resolution in both providers
  • Updated provider configuration to properly handle server-side URL transformation
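
The debug logging for base URL resolution is likewise not shown; a hedged sketch using console.debug (the project may use its own scoped logger instead):

// Sketch: surface the resolved base URL so Docker-vs-local resolution is visible
console.debug(`${this.name} base URL resolved to ${baseUrl} (docker: ${isDocker})`);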

Testing

  • Verified connectivity between Docker container and local LLM providers
  • Tested URL resolution in both Docker and non-Docker environments
  • Validated error handling for missing base URLs
  • Confirmed proper functioning of both LMStudio and Ollama providers
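
The tests themselves are not included in the PR description; a minimal sketch of how the rewrite could be unit-tested, assuming the transformation were factored into a hypothetical resolveDockerUrl helper:

// Hypothetical helper mirroring the provider logic shown earlier
function resolveDockerUrl(baseUrl: string, isDocker: boolean): string {
  if (!isDocker) {
    return baseUrl;
  }

  return baseUrl.replace('localhost', 'host.docker.internal').replace('127.0.0.1', 'host.docker.internal');
}

// Quick assertions (a real suite would use the project's test runner)
console.assert(resolveDockerUrl('http://localhost:11434', true) === 'http://host.docker.internal:11434');
console.assert(resolveDockerUrl('http://localhost:11434', false) === 'http://localhost:11434');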

Migration Impact

  • No breaking changes to existing functionality
  • The new RUNNING_IN_DOCKER environment variable must be set in Docker environments; it is included in the updated Dockerfile configurations
  • Existing Docker deployments will need to rebuild their containers
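
For example, an existing docker-compose deployment picks up the new variable by rebuilding the image (command shown for illustration; service names depend on the deployment):

docker compose up --build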

@thecodacus thecodacus merged commit 49c7129 into stackblitz-labs:main Jan 6, 2025
3 checks passed
JJ-Dynamite pushed a commit to val-x/valenClient that referenced this pull request on Jan 9, 2025:

fix: ollama and lm studio url issue fix for docker and build (stackblitz-labs#1008)

* fix: ollama and lm studio url issue fix for docker and build
* vite config fix

timoa pushed a commit to timoa/bolt.diy that referenced this pull request on Jan 21, 2025:

fix: ollama and lm studio url issue fix for docker and build (stackblitz-labs#1008)

* fix: ollama and lm studio url issue fix for docker and build
* vite config fix

JJ-Dynamite pushed a commit to val-x/valenClient that referenced this pull request on Jan 29, 2025:

fix: ollama and lm studio url issue fix for docker and build (stackblitz-labs#1008)

* fix: ollama and lm studio url issue fix for docker and build
* vite config fix