
Completion model provider should be traitlet-configurable #781

Open
krassowski opened this issue May 10, 2024 · 5 comments
Labels: enhancement, project:completion

Comments

@krassowski
Member

@krassowski Would it also be better if we could set something like default_completion_model and default_chat_model when running jupyter lab --config with a config.json?
Previously, we could only do something like:

{
    "AiExtension": {
        "default_language_model": "Self-Hosted:deepseek-coder-6.7b-instruct",
        "allowed_providers": "Self-Hosted",
        "model_parameters": {
            "Self-Hosted:deepseek-coder-6.7b-base": {},
            "Self-Hosted:deepseek-coder-6.7b-instruct": {}
        },
        "default_api_keys":{}
    }
}

It may be better to also modify extension.py. Thank you.

Originally posted by @yuhon0528 in #711 (comment)

@yuhon0528

Also, since we have separated into two models (chat and completion), the API keys will most likely be different.
However, if for example I use OpenAI-compatible APIs for both the chat and completion models, there will only be one OPENAI_API_KEY field for authentication.

How can I authenticate both models with different API keys when there is only one field? Thanks.

@krassowski
Member Author

For now, we would probably only want to add default_completion_model (rather than rename default_language_model to default_chat_model) to avoid breaking compatibility.
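
If such an option were added, the config above might simply gain one more entry. This is only a sketch of the proposed shape (default_completion_model does not exist yet; the model names are the ones from the example above):

{
    "AiExtension": {
        "default_language_model": "Self-Hosted:deepseek-coder-6.7b-instruct",
        "default_completion_model": "Self-Hosted:deepseek-coder-6.7b-base"
    }
}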

@krassowski
Member Author

The API keys are a key-value mapping, so maybe that is not a problem? If you use Cohere for chat you would set COHERE_API_KEY, and if you use OpenAI for completions you would set OPENAI_API_KEY, so default_api_keys would be:

"default_api_keys": {
  "COHERE_API_KEY": "MY_KEY_1",
  "OPENAI_API_KEY": "MY_KEY_1"
}

and the respective providers would take the key they need. Does that make sense?

@yuhon0528

yuhon0528 commented May 10, 2024

But the fact is that I am using two models that both expose OpenAI-compatible APIs, that is:

{
    "AiExtension": {
        "default_language_model": "Self-Hosted:deepseek-coder-6.7b-instruct",
        "allowed_providers": "Self-Hosted",
        "model_parameters": {
            "Self-Hosted:deepseek-coder-6.7b-base": {},
            "Self-Hosted:deepseek-coder-6.7b-instruct": {}
        },
        "default_api_keys":{
            "OPENAI_API_KEY": "deepseek-coder-6.7b-base-key",
            "OPENAI_API_KEY": "deepseek-coder-6.7b-instruct-key"
        }
    }
}

When we use those self-hosted models for chat and completions, we will most likely use something like vLLM to provide OpenAI-compatible API servers.

Hence, both the completion model and the chat model will speak the OpenAI API.
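
A possible workaround sketch until per-task defaults exist: since model_parameters entries are passed per model, different credentials could in principle be attached to each model there, assuming the custom Self-Hosted provider accepts them as constructor arguments. The parameter names openai_api_key and openai_api_base below are illustrative and depend on how that provider is implemented:

{
    "AiExtension": {
        "model_parameters": {
            "Self-Hosted:deepseek-coder-6.7b-base": {
                "openai_api_key": "deepseek-coder-6.7b-base-key",
                "openai_api_base": "http://localhost:8000/v1"
            },
            "Self-Hosted:deepseek-coder-6.7b-instruct": {
                "openai_api_key": "deepseek-coder-6.7b-instruct-key",
                "openai_api_base": "http://localhost:8001/v1"
            }
        }
    }
}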

@dcieslak19973

Hoping to help get some momentum on this issue. A link to my comments on a related/duplicate issue:

#953 (comment)

TL;DR: it does not currently seem possible to configure the completion model solely from the jupyter_jupyter_ai_config.json file.
