How to configure the custom endpoint for azure openai service? (APIM endpoint with key) #7316
Hello @ronakshirolawala! I'm here to help you with any bugs, questions, or contributions you have. Let me know how I can assist you!

I found a similar closed issue that might be relevant to your question: How to provide custom endpoint for Azure OpenAi service? (gateway URL). This issue was resolved by a pull request that added support for custom endpoints for Azure OpenAI [1].

To configure a custom endpoint for Azure OpenAI in LangChainJS, you can use the `AzureChatOpenAI` class with the appropriate configuration parameters. Here's an example of how you can set it up:

```typescript
import { AzureChatOpenAI } from "@langchain/openai";

const model = new AzureChatOpenAI({
  azureOpenAIBasePath: "your-APIM-endpoint",
  azureOpenAIApiKey: "your-api-key",
  azureOpenAIApiInstanceName: "your-instance-name",
  azureOpenAIApiDeploymentName: "your-deployment-name",
  azureOpenAIApiVersion: "your-api-version",
  headers: {
    "Api-Key": "your-api-key",
  },
});
```

This setup allows you to specify a custom base path for your Azure OpenAI service, which can be your APIM endpoint. Make sure to replace the placeholders with your actual configuration values. This way, you can call the endpoint directly without exposing sensitive information on the client side [2][3].