
Support for proxy URL for gemini #1878

Open
alianos- opened this issue Nov 13, 2024 · 3 comments

Comments

alianos- commented Nov 13, 2024

OpenAI and Anthropic each have env variables that you can set in the .env file, and their clients will use them instead of the default base URL when they are set.

The Gemini client, however, does not, so a custom host URL has to be passed explicitly when the Gemini model is instantiated.

The code changes needed for the chatbot to support this are negligible, but I am not sure how to contribute, and I think one of the files I changed should not be edited directly. In essence, all we need to do is add the env variable and then pass it to the Gemini client with the following change

from

    const googleModel = genAI.getGenerativeModel(
      { model: chatSettings.model }
    )

to

    const googleModel = genAI.getGenerativeModel(
      { model: chatSettings.model },
      // second argument is the SDK's RequestOptions; baseUrl points requests at the proxy
      { baseUrl: profile.google_gemini_base_url }
    )

I attached a diff.txt file so a dev can try git apply diff.txt or copy the changes by hand; it is only about 4 changed lines to support this.
diff.txt

@kaixinchen2024

Gemini has added the same way of calling as OpenAI. Now you can use Gemini's baseURL to use Gemini.

@a-lianos

I'm not sure exactly what you mean, but the whole point is not to use the base URL of Gemini, but to use a proxy base URL that leads to Gemini. For example, if I can reach OpenAI at openai.kong.svc.cluster.local, I can go to the chatbot's .env file today, add OPENAI_BASE_URL=openai.kong.svc.cluster.local, and the OpenAI client the chatbot uses will pick up that env variable and point requests at the proxy instead of OpenAI. Anthropic works the same way. Gemini, though, does not use an env variable; the value needs to be passed explicitly, hence the feature request.
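
To make the contrast concrete, a rough sketch assuming the official openai Node SDK (the chatbot's actual wiring may differ):

    import OpenAI from "openai"

    // The openai SDK falls back to process.env.OPENAI_BASE_URL when no baseURL is
    // passed, so OPENAI_BASE_URL=http://openai.kong.svc.cluster.local in .env is
    // enough to route every request through the Kong proxy with no code change.
    const openai = new OpenAI()

    // The @google/generative-ai SDK reads no such variable, so without the change
    // proposed above the Gemini client always targets Google's default endpoint.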

leda152 commented Jan 6, 2025

Sir, can you give me the main idea of how to build the chatbot model for a capstone project? Give me a clue about how this chatbot works.
