Gemini #30
Conversation
TODO: I am not sure how the whole system works, but these two tests work well now.
This is looking really good! The only major note is that I don't think you actually want to support FIM. If you look at the Mistral FIM backend, you'll see we supply both a prompt and a suffix; the Mistral FIM API specifically performs fill-in-the-middle. For the Gemini API, I think you want to support the generateText and generateContent endpoints. Instead of do_fim you will probably want do_generate. You can see the Ollama implementation for an example: lsp-ai/src/transformer_backends/ollama.rs Line 63 in cd46ecf
I feel like the Ollama implementation is probably a better example to go off of than the Mistral FIM one. Let me know if this doesn't make sense and I am happy to clarify anything!
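The do_fim / do_generate distinction could be sketched roughly like this. The trait and type names below are assumptions for illustration only, not the actual lsp-ai API: the point is that a FIM entry point needs both a prompt and a suffix, while a plain generate entry point (like Ollama's do_generate) only needs a prompt.

```rust
// Illustrative sketch; names are hypothetical, not the real lsp-ai types.
struct GenerateResult {
    generated_text: String,
}

trait TransformerBackend {
    // Mistral-style FIM: the model fills in between prompt and suffix.
    fn do_fim(&self, prompt: &str, suffix: &str) -> Option<GenerateResult>;
    // Plain generation: one prompt in, one completion out.
    fn do_generate(&self, prompt: &str) -> Option<GenerateResult>;
}

struct Gemini;

impl TransformerBackend for Gemini {
    // Gemini's generateContent API is not a FIM API, so FIM is unsupported.
    fn do_fim(&self, _prompt: &str, _suffix: &str) -> Option<GenerateResult> {
        None
    }

    fn do_generate(&self, prompt: &str) -> Option<GenerateResult> {
        // Real code would call the generateContent endpoint; echo for the sketch.
        Some(GenerateResult {
            generated_text: format!("completion for: {prompt}"),
        })
    }
}

fn main() {
    let backend = Gemini;
    assert!(backend.do_fim("left", "right").is_none());
    let out = backend.do_generate("hello").unwrap();
    assert!(out.generated_text.contains("hello"));
    println!("ok");
}
```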
OK, I found this problem: OpenAI uses this

```rust
#[derive(Debug, Clone, Deserialize, Serialize)]
#[serde(deny_unknown_fields)]
pub struct ChatMessage {
    pub role: String,
    pub content: String,
}
```

but Gemini uses

```json
"contents": [
    {
        "role": "user",
        "parts": [{ "text": "Pretend you're a snowman and stay in character for each response." }]
    },
    {
        "role": "model",
        "parts": [{ "text": "Hello! It's so cold! Isn't that great?" }]
    },
    {
        "role": "user",
        "parts": [{ "text": "What's your favorite season of the year?" }]
    }
]
```

The chat functionality seems to need changes to the system, so I'll support completion first.
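As a rough illustration of bridging the two shapes, a conversion from flat OpenAI-style messages into Gemini's nested contents might look like the sketch below. All names here are hypothetical, not the actual lsp-ai types; note that Gemini uses the role "model" where OpenAI uses "assistant".

```rust
// Hypothetical sketch: flat OpenAI-style messages -> Gemini-style contents.
#[derive(Debug, Clone)]
pub struct ChatMessage {
    pub role: String,
    pub content: String,
}

#[derive(Debug, Clone)]
pub struct Part {
    pub text: String,
}

#[derive(Debug, Clone)]
pub struct Content {
    pub role: String,
    pub parts: Vec<Part>,
}

// Gemini uses "model" where OpenAI uses "assistant"; other roles pass through.
fn to_gemini_contents(messages: &[ChatMessage]) -> Vec<Content> {
    messages
        .iter()
        .map(|m| Content {
            role: if m.role == "assistant" {
                "model".to_string()
            } else {
                m.role.clone()
            },
            parts: vec![Part {
                text: m.content.clone(),
            }],
        })
        .collect()
}

fn main() {
    let messages = vec![
        ChatMessage {
            role: "user".into(),
            content: "Pretend you're a snowman.".into(),
        },
        ChatMessage {
            role: "assistant".into(),
            content: "Hello! It's so cold!".into(),
        },
    ];
    let contents = to_gemini_contents(&messages);
    assert_eq!(contents[0].role, "user");
    assert_eq!(contents[1].role, "model");
    assert_eq!(contents[1].parts[0].text, "Hello! It's so cold!");
    println!("{} contents", contents.len());
}
```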
I would create a new struct. Let me know if this isn't helpful, and I can make a PR into your branch with some suggestions.
Now it works well.
I actually made a PR into your branch: asukaminato0721#1. If you can merge that, I should be able to merge this in.
Hey, sorry: I tested it with my editor and had to add a few more updates to get the prompt to format correctly: asukaminato0721#2. I'm not able to get good completion responses from it. When I test it using the default chat prompts I have outlined in the Configuration section, it loves to include quotes in its response. We can merge this once you merge my PR in, but before I update the wiki it would be nice to have some prompts we know work well.
cc @SilasMarvin, mind taking a look?
Fixes #28. It currently works, though the code needs improvement.