
support for ollama api keys #399

Closed
theoden8 opened this issue May 3, 2024 · 1 comment
Labels
t:question Further information is requested

Comments

theoden8 commented May 3, 2024

Feature request

Please support API keys for Ollama.

Motivation

Ollama is not compatible with the OpenAI API (e.g. /v1/models), so the existing OpenAI-style key support does not apply. It is possible to hide an Ollama instance behind pre-determined API keys, for example via a reverse proxy.

Your contribution

I believe this is the part where this feature would be implemented: https://github.com/davidmigloz/langchain_dart/blob/main/packages/langchain_ollama/lib/src/chat_models/chat_ollama.dart#L148. The API key would be passed as a Bearer token in the request header.

@theoden8 theoden8 added the t:enhancement New feature or request label May 3, 2024
@github-project-automation github-project-automation bot moved this to 📋 Backlog in LangChain.dart May 3, 2024
@davidmigloz davidmigloz added t:question Further information is requested and removed t:enhancement New feature or request labels May 3, 2024
davidmigloz (Owner) commented

Hello @theoden8,

You can provide any custom header using the headers field:

final chat = ChatOllama(
  baseUrl: 'https://your-base-url.com/api',
  headers: { 'api-key': 'YOUR_API_KEY' },
  defaultOptions: ChatOllamaOptions(
    model: 'llama3',
    temperature: 0,
  ),
);
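If the proxy in front of Ollama expects a standard Bearer token rather than a custom header (as suggested in the feature request), the same headers field can carry it. This is a hypothetical sketch assuming the proxy checks the Authorization header; the header name your proxy expects may differ:

```dart
import 'package:langchain_ollama/langchain_ollama.dart';

final chat = ChatOllama(
  baseUrl: 'https://your-base-url.com/api',
  // Assumption: the reverse proxy validates `Authorization: Bearer <key>`.
  headers: {'Authorization': 'Bearer YOUR_API_KEY'},
  defaultOptions: const ChatOllamaOptions(
    model: 'llama3',
    temperature: 0,
  ),
);
```

Since headers are applied to every request the client makes, no change to ChatOllama itself is needed for this use case.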
