Add initial system prompt in ChatHandler and completion #28
Conversation
Compare: c1b8b79 to 1befdfa
src/provider.ts (Outdated)
@@ -8,6 +8,30 @@ import { CompletionProvider } from './completion-provider';
import { getChatModel, IBaseCompleter } from './llm-models';
import { IAIProvider } from './token';

export const chatSystemPrompt = (provider_name: string) => `
Should this function take a single options object, instead of raw parameters like provider_name? Wondering if this could make it easier to extend later.
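The suggestion above could look like the following sketch (the interface name, field name, and prompt text are assumptions for illustration, not the actual code):

```typescript
// Hypothetical refactor: the prompt factory takes a single options object,
// so new fields can be added later without changing the call signature.
interface IChatPromptOptions {
  providerName: string;
}

const chatSystemPrompt = (options: IChatPromptOptions): string =>
  `You are an assistant provided by ${options.providerName}, running in JupyterLab.`;

// Call sites pass an object literal, which stays readable as options grow.
const prompt = chatSystemPrompt({ providerName: 'MistralAI' });
```

Adding a new option later (e.g. a model name) would then only extend the interface instead of changing every call site.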
Wondering why we would need a getter / setter on the completer for the prompt?

Yes, and even with the settings it could only be a writable property, since there is nothing more in the getter / setter.
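A minimal sketch of that simplification (Completer here is a hypothetical stand-in, not the real class):

```typescript
// Since the getter/setter pair adds no extra logic, the completer can
// expose the prompt as a plain writable property instead.
class Completer {
  // plain public property instead of a private field with get/set accessors
  prompt = 'You are a code completion assistant.';
}

const completer = new Completer();
completer.prompt = 'Custom prompt'; // consumers can still reassign it freely
```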
Thanks!
Fixes #25
Use the same prompts as jupyter-ai (except for the model name, which is not part of the BaseChatModel and may not be provided by all chat models).

NOTES
Currently the completion prompt is not used by the codestral completer, because the codestral completer uses a text completion model, which doesn't seem to allow such prompts.
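The limitation above can be illustrated with a simplified sketch (the request types are assumptions, not the actual codestral API): a chat model has a distinct slot for a system message, while a plain text completion model only accepts a single input string, so the completion prompt has no separate place to go.

```typescript
// Simplified request shapes for the two kinds of models.
type ChatRequest = { system: string; messages: string[] };
type TextCompletionRequest = { prompt: string };

function buildChatRequest(systemPrompt: string, userInput: string): ChatRequest {
  // chat models carry the system prompt alongside the conversation
  return { system: systemPrompt, messages: [userInput] };
}

function buildTextRequest(userInput: string): TextCompletionRequest {
  // no system field here: the model sees only the raw text to complete
  return { prompt: userInput };
}
```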
For a follow-up PR, we should:
- add a default prompt to the BaseCompleter class that may fit with most of the completers

FOLLOW UP
We could add a setting to let users define their own prompt #25 (comment)
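A sketch of how such a setting could resolve against the built-in default (the settings interface and names are assumptions, not the actual settings schema):

```typescript
// Hypothetical user-facing setting for the system prompt.
interface IAISettings {
  systemPrompt?: string;
}

// Fall back to the built-in prompt when the user has not defined one.
function resolveSystemPrompt(settings: IAISettings, defaultPrompt: string): string {
  const userPrompt = settings.systemPrompt?.trim();
  return userPrompt !== undefined && userPrompt.length > 0 ? userPrompt : defaultPrompt;
}

// Empty or whitespace-only values also fall back to the default.
const effective = resolveSystemPrompt({ systemPrompt: '  ' }, 'built-in prompt');
```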