diff --git a/docs/core_docs/docs/integrations/llms/azure.ipynb b/docs/core_docs/docs/integrations/llms/azure.ipynb index 607eeb2b6263..c2c2740a98cd 100644 --- a/docs/core_docs/docs/integrations/llms/azure.ipynb +++ b/docs/core_docs/docs/integrations/llms/azure.ipynb @@ -44,11 +44,7 @@ "## Overview\n", "### Integration details\n", "\n", - "- TODO: Fill in table features.\n", - "- TODO: Remove JS support link if not relevant, otherwise ensure link is correct.\n", - "- TODO: Make sure API reference links are correct.\n", - "\n", - "| Class | Package | Local | Serializable | [PY support](https://python.langchain.com/docs/integrations/llms/openai) | Package downloads | Package latest |\n", + "| Class | Package | Local | Serializable | [PY support](https://python.langchain.com/docs/integrations/llms/azure_openai) | Package downloads | Package latest |\n", "| :--- | :--- | :---: | :---: | :---: | :---: | :---: |\n", "| [AzureOpenAI](https://api.js.langchain.com/classes/langchain_openai.AzureOpenAI.html) | [@langchain/openai](https://api.js.langchain.com/modules/langchain_openai.html) | ❌ | ✅ | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/openai?style=flat-square&label=%20&) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/openai?style=flat-square&label=%20&) |\n", "\n", diff --git a/docs/core_docs/docs/integrations/text_embedding/azure_openai.ipynb b/docs/core_docs/docs/integrations/text_embedding/azure_openai.ipynb new file mode 100644 index 000000000000..98e139900b25 --- /dev/null +++ b/docs/core_docs/docs/integrations/text_embedding/azure_openai.ipynb @@ -0,0 +1,448 @@ +{ + "cells": [ + { + "cell_type": "raw", + "id": "afaf8039", + "metadata": { + "vscode": { + "languageId": "raw" + } + }, + "source": [ + "---\n", + "sidebar_label: Azure OpenAI\n", + "---" + ] + }, + { + "cell_type": "markdown", + "id": "9a3d6f34", + "metadata": {}, + "source": [ + "# AzureOpenAIEmbeddings\n", + "\n", + "This will help you get started with AzureOpenAIEmbeddings [embedding models](/docs/concepts#embedding-models) using LangChain. For detailed documentation on `AzureOpenAIEmbeddings` features and configuration options, please refer to the [API reference](https://api.js.langchain.com/classes/langchain_openai.AzureOpenAIEmbeddings.html).\n", + "\n", + "## Overview\n", + "### Integration details\n", + "\n", + "[Azure OpenAI](https://azure.microsoft.com/products/ai-services/openai-service/) is a cloud service to help you quickly develop generative AI experiences with a diverse set of prebuilt and curated models from OpenAI, Meta and beyond.\n", + "\n", + "LangChain.js supports integration with [Azure OpenAI](https://azure.microsoft.com/products/ai-services/openai-service/) using the new Azure integration in the [OpenAI SDK](https://github.com/openai/openai-node).\n", + "\n", + "You can learn more about Azure OpenAI and its difference with the OpenAI API on [this page](https://learn.microsoft.com/azure/ai-services/openai/overview). If you don't have an Azure account, you can [create a free account](https://azure.microsoft.com/free/) to get started.\n", + "\n", + "```{=mdx}\n", + "\n", + ":::info\n", + "\n", + "Previously, LangChain.js supported integration with Azure OpenAI using the dedicated [Azure OpenAI SDK](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/openai/openai). 
This SDK is now deprecated in favor of the new Azure integration in the OpenAI SDK, which provides access to the latest OpenAI models and features the same day they are released, and allows a seamless transition between the OpenAI API and Azure OpenAI.\n",
+    "\n",
+    "If you are using Azure OpenAI with the deprecated SDK, see the [migration guide](#migration-from-azure-openai-sdk) to update to the new API.\n",
+    "\n",
+    ":::\n",
+    "\n",
+    "```\n",
+    "\n",
+    "\n",
+    "| Class | Package | Local | [Py support](https://python.langchain.com/docs/integrations/text_embedding/azureopenai/) | Package downloads | Package latest |\n",
+    "| :--- | :--- | :---: | :---: | :---: | :---: |\n",
+    "| [AzureOpenAIEmbeddings](https://api.js.langchain.com/classes/langchain_openai.AzureOpenAIEmbeddings.html) | [@langchain/openai](https://api.js.langchain.com/modules/langchain_openai.html) | ❌ | ✅ | ![NPM - Downloads](https://img.shields.io/npm/dm/@langchain/openai?style=flat-square&label=%20&) | ![NPM - Version](https://img.shields.io/npm/v/@langchain/openai?style=flat-square&label=%20&) |\n",
+    "\n",
+    "## Setup\n",
+    "\n",
+    "To access AzureOpenAIEmbeddings embedding models you'll need to create an Azure account, get an API key, and install the `@langchain/openai` integration package.\n",
+    "\n",
+    "### Credentials\n",
+    "\n",
+    "You'll need to have an Azure OpenAI instance deployed. You can deploy a version on Azure Portal following [this guide](https://learn.microsoft.com/azure/ai-services/openai/how-to/create-resource?pivots=web-portal).\n",
+    "\n",
+    "Once you have your instance running, make sure you have the name of your instance and key. You can find the key in the Azure Portal, under the \"Keys and Endpoint\" section of your instance.\n",
+    "\n",
+    "If you're using Node.js, you can define the following environment variables to use the service:\n",
+    "\n",
+    "```bash\n",
+    "AZURE_OPENAI_API_INSTANCE_NAME=\n",
+    "AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME=\n",
+    "AZURE_OPENAI_API_KEY=\n",
+    "AZURE_OPENAI_API_VERSION=\"2024-02-01\"\n",
+    "```\n",
+    "\n",
+    "If you want to get automated tracing of your model calls you can also set your [LangSmith](https://docs.smith.langchain.com/) API key by uncommenting below:\n",
+    "\n",
+    "```bash\n",
+    "# export LANGCHAIN_TRACING_V2=\"true\"\n",
+    "# export LANGCHAIN_API_KEY=\"your-api-key\"\n",
+    "```\n",
+    "\n",
+    "### Installation\n",
+    "\n",
+    "The LangChain AzureOpenAIEmbeddings integration lives in the `@langchain/openai` package:\n",
+    "\n",
+    "```{=mdx}\n",
+    "import IntegrationInstallTooltip from \"@mdx_components/integration_install_tooltip.mdx\";\n",
+    "import Npm2Yarn from \"@theme/Npm2Yarn\";\n",
+    "\n",
+    "<IntegrationInstallTooltip></IntegrationInstallTooltip>\n",
+    "\n",
+    "<Npm2Yarn>\n",
+    "  @langchain/openai\n",
+    "</Npm2Yarn>\n",
+    "\n",
+    ":::info\n",
+    "\n",
+    "You can find the list of supported API versions in the [Azure OpenAI documentation](https://learn.microsoft.com/azure/ai-services/openai/reference).\n",
+    "\n",
+    ":::\n",
+    "\n",
+    ":::tip\n",
+    "\n",
+    "If `AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME` is not defined, it will fall back to the value of `AZURE_OPENAI_API_DEPLOYMENT_NAME` for the deployment name. 
The same applies to the `azureOpenAIApiEmbeddingsDeploymentName` parameter in the `AzureOpenAIEmbeddings` constructor, which will fall back to the value of `azureOpenAIApiDeploymentName` if not defined.\n",
+    "\n",
+    ":::\n",
+    "\n",
+    "```"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "45dd1724",
+   "metadata": {},
+   "source": [
+    "## Instantiation\n",
+    "\n",
+    "Now we can instantiate our model object and generate embeddings:"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 1,
+   "id": "9ea7a09b",
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "import { AzureOpenAIEmbeddings } from \"@langchain/openai\";\n",
+    "\n",
+    "const embeddings = new AzureOpenAIEmbeddings({\n",
+    "  azureOpenAIApiKey: \"\", // In Node.js defaults to process.env.AZURE_OPENAI_API_KEY\n",
+    "  azureOpenAIApiInstanceName: \"\", // In Node.js defaults to process.env.AZURE_OPENAI_API_INSTANCE_NAME\n",
+    "  azureOpenAIApiEmbeddingsDeploymentName: \"\", // In Node.js defaults to process.env.AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME\n",
+    "  azureOpenAIApiVersion: \"\", // In Node.js defaults to process.env.AZURE_OPENAI_API_VERSION\n",
+    "  maxRetries: 1,\n",
+    "});"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "77d271b6",
+   "metadata": {},
+   "source": [
+    "## Indexing and Retrieval\n",
+    "\n",
+    "Embedding models are often used in retrieval-augmented generation (RAG) flows, both as part of indexing data and later retrieving it. For more detailed instructions, please see our RAG tutorials under the [working with external knowledge tutorials](/docs/tutorials/#working-with-external-knowledge).\n",
+    "\n",
+    "Below, see how to index and retrieve data using the `embeddings` object we initialized above. In this example, we will index and retrieve a sample document using the demo [`MemoryVectorStore`](/docs/integrations/vectorstores/memory)."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 2,
+   "id": "d817716b",
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "LangChain is the framework for building context-aware reasoning applications\n"
+     ]
+    }
+   ],
+   "source": [
+    "// Create a vector store with a sample text\n",
+    "import { MemoryVectorStore } from \"langchain/vectorstores/memory\";\n",
+    "\n",
+    "const text = \"LangChain is the framework for building context-aware reasoning applications\";\n",
+    "\n",
+    "const vectorstore = await MemoryVectorStore.fromDocuments(\n",
+    "  [{ pageContent: text, metadata: {} }],\n",
+    "  embeddings,\n",
+    ");\n",
+    "\n",
+    "// Use the vector store as a retriever that returns a single document\n",
+    "const retriever = vectorstore.asRetriever(1);\n",
+    "\n",
+    "// Retrieve the most similar text\n",
+    "const retrievedDocuments = await retriever.invoke(\"What is LangChain?\");\n",
+    "\n",
+    "retrievedDocuments[0].pageContent;"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "e02b9855",
+   "metadata": {},
+   "source": [
+    "## Direct Usage\n",
+    "\n",
+    "Under the hood, the vectorstore and retriever implementations are calling `embeddings.embedDocuments(...)` and `embeddings.embedQuery(...)` to create embeddings for the text(s) used in `fromDocuments` and the retriever's `invoke` operations, respectively.\n",
+    "\n",
+    "You can directly call these methods to get embeddings for your own use cases.\n",
+    "\n",
+    "### Embed single texts\n",
+    "\n",
+    "You can embed queries for search with `embedQuery`. 
This generates a vector representation specific to the query:" + ] + }, + { + "cell_type": "code", + "execution_count": 3, + "id": "0d2befcd", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[\n", + " -0.024253517, -0.0054218727, 0.048715446, 0.020580322, 0.03180832,\n", + " 0.0028770117, -0.012367731, 0.037383243, -0.054915592, 0.032225136,\n", + " 0.00825818, -0.023888804, -0.01184671, 0.012257014, 0.016294925,\n", + " 0.009254632, 0.0051353113, -0.008889917, 0.016855022, 0.04207243,\n", + " 0.00082589936, -0.011664353, 0.00818654, 0.029020859, -0.012335167,\n", + " -0.019603407, 0.0013945447, 0.05538451, -0.011625277, -0.008153976,\n", + " 0.038607642, -0.03811267, -0.0074440846, 0.047647353, -0.00927417,\n", + " 0.024201415, -0.0069230637, -0.008538228, 0.003910912, 0.052805457,\n", + " -0.023159374, 0.0014352495, -0.038659744, 0.017141584, 0.005587948,\n", + " 0.007971618, -0.016920151, 0.06658646, -0.0016916894, 0.045667473,\n", + " -0.042202685, -0.03983204, -0.04160351, -0.011729481, -0.055905532,\n", + " 0.012543576, 0.0038848612, 0.007919516, 0.010915386, 0.0033117384,\n", + " -0.007548289, -0.030427614, -0.041890074, 0.036002535, -0.023771575,\n", + " -0.008792226, -0.049444873, 0.016490309, -0.0060568666, 0.040196754,\n", + " 0.014106638, -0.014575557, -0.0017356506, -0.011234511, -0.012517525,\n", + " 0.008362384, 0.01253055, 0.036158845, 0.008297256, -0.0010908874,\n", + " -0.014888169, -0.020489143, 0.018965157, -0.057937514, -0.0037122732,\n", + " 0.004402626, -0.00840146, 0.042984217, -0.04936672, -0.03714878,\n", + " 0.004969236, 0.03707063, 0.015396165, -0.02055427, 0.01988997,\n", + " 0.030219207, -0.021257648, 0.01340326, 0.003692735, 0.012595678\n", + "]\n" + ] + } + ], + "source": [ + "const singleVector = await embeddings.embedQuery(text);\n", + "\n", + "console.log(singleVector.slice(0, 100));" + ] + }, + { + "cell_type": "markdown", + "id": "1b5a7d03", + "metadata": {}, + "source": [ + "### Embed multiple texts\n", + "\n", + "You can embed multiple texts for indexing with `embedDocuments`. 
The internals used for this method may (but do not have to) differ from embedding queries:" + ] + }, + { + "cell_type": "code", + "execution_count": 4, + "id": "2f4d6e97", + "metadata": {}, + "outputs": [ + { + "name": "stdout", + "output_type": "stream", + "text": [ + "[\n", + " -0.024253517, -0.0054218727, 0.048715446, 0.020580322, 0.03180832,\n", + " 0.0028770117, -0.012367731, 0.037383243, -0.054915592, 0.032225136,\n", + " 0.00825818, -0.023888804, -0.01184671, 0.012257014, 0.016294925,\n", + " 0.009254632, 0.0051353113, -0.008889917, 0.016855022, 0.04207243,\n", + " 0.00082589936, -0.011664353, 0.00818654, 0.029020859, -0.012335167,\n", + " -0.019603407, 0.0013945447, 0.05538451, -0.011625277, -0.008153976,\n", + " 0.038607642, -0.03811267, -0.0074440846, 0.047647353, -0.00927417,\n", + " 0.024201415, -0.0069230637, -0.008538228, 0.003910912, 0.052805457,\n", + " -0.023159374, 0.0014352495, -0.038659744, 0.017141584, 0.005587948,\n", + " 0.007971618, -0.016920151, 0.06658646, -0.0016916894, 0.045667473,\n", + " -0.042202685, -0.03983204, -0.04160351, -0.011729481, -0.055905532,\n", + " 0.012543576, 0.0038848612, 0.007919516, 0.010915386, 0.0033117384,\n", + " -0.007548289, -0.030427614, -0.041890074, 0.036002535, -0.023771575,\n", + " -0.008792226, -0.049444873, 0.016490309, -0.0060568666, 0.040196754,\n", + " 0.014106638, -0.014575557, -0.0017356506, -0.011234511, -0.012517525,\n", + " 0.008362384, 0.01253055, 0.036158845, 0.008297256, -0.0010908874,\n", + " -0.014888169, -0.020489143, 0.018965157, -0.057937514, -0.0037122732,\n", + " 0.004402626, -0.00840146, 0.042984217, -0.04936672, -0.03714878,\n", + " 0.004969236, 0.03707063, 0.015396165, -0.02055427, 0.01988997,\n", + " 0.030219207, -0.021257648, 0.01340326, 0.003692735, 0.012595678\n", + "]\n", + "[\n", + " -0.033366997, 0.010419146, 0.0118083665, -0.040441725, 0.0020355924,\n", + " -0.015808804, -0.023629595, -0.0066180876, -0.040004376, 0.020053642,\n", + " -0.0010797002, -0.03900105, -0.009956073, 0.0027896944, 0.003305828,\n", + " -0.034010153, 0.009833873, 0.0061164247, 0.022536227, 0.029147884,\n", + " 0.017789727, 0.03182342, 0.010869357, 0.031849146, -0.028093107,\n", + " 0.008283865, -0.0145610785, 0.01645196, -0.029430874, -0.02508313,\n", + " 0.046178687, -0.01722375, -0.010046115, 0.013101112, 0.0044538635,\n", + " 0.02197025, 0.03985002, 0.007955855, 0.0008819293, 0.012657333,\n", + " 0.014368132, -0.014007963, -0.03722594, 0.031617608, -0.011570398,\n", + " 0.039052505, 0.0020018267, 0.023706773, -0.0046950476, 0.056083307,\n", + " -0.08412496, -0.043425974, -0.015512952, 0.015950298, -0.03624834,\n", + " -0.0053317733, -0.037251666, 0.0046339477, 0.04193385, 0.023475237,\n", + " -0.021378545, 0.013699248, -0.026009277, 0.050757967, -0.0494202,\n", + " 0.0007874656, -0.07208506, 0.015885983, -0.003259199, 0.015127057,\n", + " 0.0068946453, -0.035373647, -0.005875241, -0.0032238255, -0.04185667,\n", + " -0.022047428, 0.0014326327, -0.0070940237, -0.0027864785, -0.016271876,\n", + " 0.005097021, 0.034473225, 0.012361481, -0.026498076, 0.0067274245,\n", + " -0.026330855, -0.006132504, 0.008180959, -0.049368747, -0.032337945,\n", + " 0.011049441, 0.00186194, -0.012097787, 0.01930758, 0.07059293,\n", + " 0.029713862, 0.04337452, -0.0048461896, -0.019976463, 0.011473924\n", + "]\n" + ] + } + ], + "source": [ + "const text2 = \"LangGraph is a library for building stateful, multi-actor applications with LLMs\";\n", + "\n", + "const vectors = await embeddings.embedDocuments([text, text2]);\n", + "\n", + 
"console.log(vectors[0].slice(0, 100));\n", + "console.log(vectors[1].slice(0, 100));" + ] + }, + { + "cell_type": "markdown", + "id": "2d918b72", + "metadata": {}, + "source": [ + "## Using Azure Managed Identity\n", + "\n", + "If you're using Azure Managed Identity, you can configure the credentials like this:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "29ccc1e1", + "metadata": {}, + "outputs": [], + "source": [ + "import {\n", + " DefaultAzureCredential,\n", + " getBearerTokenProvider,\n", + "} from \"@azure/identity\";\n", + "import { AzureOpenAIEmbeddings } from \"@langchain/openai\";\n", + "\n", + "const credentials = new DefaultAzureCredential();\n", + "const azureADTokenProvider = getBearerTokenProvider(\n", + " credentials,\n", + " \"https://cognitiveservices.azure.com/.default\"\n", + ");\n", + "\n", + "const modelWithManagedIdentity = new AzureOpenAIEmbeddings({\n", + " azureADTokenProvider,\n", + " azureOpenAIApiInstanceName: \"\",\n", + " azureOpenAIApiEmbeddingsDeploymentName: \"\",\n", + " azureOpenAIApiVersion: \"\",\n", + "});\n" + ] + }, + { + "cell_type": "markdown", + "id": "1909f724", + "metadata": {}, + "source": [ + "## Using a different domain\n", + "\n", + "If your instance is hosted under a domain other than the default `openai.azure.com`, you'll need to use the alternate `AZURE_OPENAI_BASE_PATH` environment variable.\n", + "For example, here's how you would connect to the domain `https://westeurope.api.microsoft.com/openai/deployments/{DEPLOYMENT_NAME}`:" + ] + }, + { + "cell_type": "code", + "execution_count": null, + "id": "b4b16f32", + "metadata": {}, + "outputs": [], + "source": [ + "import { AzureOpenAIEmbeddings } from \"@langchain/openai\";\n", + "\n", + "const modelDifferentDomain = new AzureOpenAIEmbeddings({\n", + " azureOpenAIApiKey: \"\", // In Node.js defaults to process.env.AZURE_OPENAI_API_KEY\n", + " azureOpenAIApiEmbeddingsDeploymentName: \"\", // In Node.js defaults to process.env.AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME\n", + " azureOpenAIApiVersion: \"\", // In Node.js defaults to process.env.AZURE_OPENAI_API_VERSION\n", + " azureOpenAIBasePath:\n", + " \"https://westeurope.api.microsoft.com/openai/deployments\", // In Node.js defaults to process.env.AZURE_OPENAI_BASE_PATH\n", + "});\n" + ] + }, + { + "cell_type": "markdown", + "id": "7b2e885a", + "metadata": {}, + "source": [ + "## Migration from Azure OpenAI SDK\n", + "\n", + "If you are using the deprecated Azure OpenAI SDK with the `@langchain/azure-openai` package, you can update your code to use the new Azure integration following these steps:\n", + "\n", + "1. Install the new `@langchain/openai` package and remove the previous `@langchain/azure-openai` package:\n", + " ```bash npm2yarn\n", + " npm install @langchain/openai\n", + " npm uninstall @langchain/azure-openai\n", + " ```\n", + "2. Update your imports to use the new `AzureOpenAIEmbeddings` classe from the `@langchain/openai` package:\n", + " ```typescript\n", + " import { AzureOpenAIEmbeddings } from \"@langchain/openai\";\n", + " ```\n", + "3. 
Update your code to use the new `AzureOpenAIEmbeddings` class and pass the required parameters:\n", + "\n", + " ```typescript\n", + " const model = new AzureOpenAIEmbeddings({\n", + " azureOpenAIApiKey: \"\",\n", + " azureOpenAIApiInstanceName: \"\",\n", + " azureOpenAIApiEmbeddingsDeploymentName:\n", + " \"\",\n", + " azureOpenAIApiVersion: \"\",\n", + " });\n", + " ```\n", + "\n", + " Notice that the constructor now requires the `azureOpenAIApiInstanceName` parameter instead of the `azureOpenAIEndpoint` parameter, and adds the `azureOpenAIApiVersion` parameter to specify the API version.\n", + "\n", + " - If you were using Azure Managed Identity, you now need to use the `azureADTokenProvider` parameter to the constructor instead of `credentials`, see the [Azure Managed Identity](#using-azure-managed-identity) section for more details.\n", + "\n", + " - If you were using environment variables, you now have to set the `AZURE_OPENAI_API_INSTANCE_NAME` environment variable instead of `AZURE_OPENAI_API_ENDPOINT`, and add the `AZURE_OPENAI_API_VERSION` environment variable to specify the API version.\n" + ] + }, + { + "cell_type": "markdown", + "id": "8938e581", + "metadata": {}, + "source": [ + "## API reference\n", + "\n", + "For detailed documentation of all AzureOpenAIEmbeddings features and configurations head to the API reference: https://api.js.langchain.com/classes/langchain_openai.AzureOpenAIEmbeddings.html" + ] + } + ], + "metadata": { + "kernelspec": { + "display_name": "TypeScript", + "language": "typescript", + "name": "tslab" + }, + "language_info": { + "codemirror_mode": { + "mode": "typescript", + "name": "javascript", + "typescript": true + }, + "file_extension": ".ts", + "mimetype": "text/typescript", + "name": "typescript", + "version": "3.7.2" + } + }, + "nbformat": 4, + "nbformat_minor": 5 +} diff --git a/docs/core_docs/docs/integrations/text_embedding/azure_openai.mdx b/docs/core_docs/docs/integrations/text_embedding/azure_openai.mdx deleted file mode 100644 index 4701d77bec9f..000000000000 --- a/docs/core_docs/docs/integrations/text_embedding/azure_openai.mdx +++ /dev/null @@ -1,122 +0,0 @@ ---- -keywords: [AzureOpenAIEmbeddings] ---- - -import CodeBlock from "@theme/CodeBlock"; - -# Azure OpenAI - -[Azure OpenAI](https://azure.microsoft.com/products/ai-services/openai-service/) is a cloud service to help you quickly develop generative AI experiences with a diverse set of prebuilt and curated models from OpenAI, Meta and beyond. - -LangChain.js supports integration with [Azure OpenAI](https://azure.microsoft.com/products/ai-services/openai-service/) using the new Azure integration in the [OpenAI SDK](https://github.com/openai/openai-node). - -You can learn more about Azure OpenAI and its difference with the OpenAI API on [this page](https://learn.microsoft.com/azure/ai-services/openai/overview). If you don't have an Azure account, you can [create a free account](https://azure.microsoft.com/free/) to get started. - -:::info - -Previously, LangChain.js supported integration with Azure OpenAI using the dedicated [Azure OpenAI SDK](https://github.com/Azure/azure-sdk-for-js/tree/main/sdk/openai/openai). This SDK is now deprecated in favor of the new Azure integration in the OpenAI SDK, which allows to access the latest OpenAI models and features the same day they are released, and allows seemless transition between the OpenAI API and Azure OpenAI. 
- -If you are using Azure OpenAI with the deprecated SDK, see the [migration guide](#migration-from-azure-openai-sdk) to update to the new API. - -::: - -## Setup - -You'll first need to install the [`@langchain/openai`](https://www.npmjs.com/package/@langchain/openai) package: - -import IntegrationInstallTooltip from "@mdx_components/integration_install_tooltip.mdx"; - - - -```bash npm2yarn -npm install -S @langchain/openai -``` - -You'll also need to have an Azure OpenAI instance deployed. You can deploy a version on Azure Portal following [this guide](https://learn.microsoft.com/azure/ai-services/openai/how-to/create-resource?pivots=web-portal). - -Once you have your instance running, make sure you have the name of your instance and key. You can find the key in the Azure Portal, under the "Keys and Endpoint" section of your instance. - -If you're using Node.js, you can define the following environment variables to use the service: - -```bash -AZURE_OPENAI_API_INSTANCE_NAME= -AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME= -AZURE_OPENAI_API_KEY= -AZURE_OPENAI_API_VERSION="2024-02-01" -``` - -Alternatively, you can pass the values directly to the `AzureOpenAI` constructor: - -import AzureOpenAI from "@examples/models/embeddings/azure_openai.ts"; - -import UnifiedModelParamsTooltip from "@mdx_components/unified_model_params_tooltip.mdx"; - - - -{AzureOpenAI} - -:::info - -You can find the list of supported API versions in the [Azure OpenAI documentation](https://learn.microsoft.com/azure/ai-services/openai/reference). - -::: - -::tip - -If `AZURE_OPENAI_API_EMBEDDINGS_DEPLOYMENT_NAME` is not defined, it will fall back to the value of `AZURE_OPENAI_API_DEPLOYMENT_NAME` for the deployment name. The same applies to the `azureOpenAIApiEmbeddingsDeploymentName` parameter in the `AzureOpenAIEmbeddings` constructor, which will fall back to the value of `azureOpenAIApiDeploymentName` if not defined. - -::: - -### Using Azure Managed Identity - -If you're using Azure Managed Identity, you can configure the credentials like this: - -import AzureOpenAIManagedIdentity from "@examples/models/embeddings/azure_openai-managed_identity.ts"; - -{AzureOpenAIManagedIdentity} - -### Using a different domain - -If your instance is hosted under a domain other than the default `openai.azure.com`, you'll need to use the alternate `AZURE_OPENAI_BASE_PATH` environment variable. -For example, here's how you would connect to the domain `https://westeurope.api.microsoft.com/openai/deployments/{DEPLOYMENT_NAME}`: - -import AzureOpenAIBasePath from "@examples/models/embeddings/azure_openai-base_path.ts"; - -{AzureOpenAIBasePath} - -### Usage example - -import Example from "@examples/embeddings/azure_openai.ts"; - -{Example} - -## Migration from Azure OpenAI SDK - -If you are using the deprecated Azure OpenAI SDK with the `@langchain/azure-openai` package, you can update your code to use the new Azure integration following these steps: - -1. Install the new `@langchain/openai` package and remove the previous `@langchain/azure-openai` package: - ```bash npm2yarn - npm install @langchain/openai - npm uninstall @langchain/azure-openai - ``` -2. Update your imports to use the new `AzureOpenAIEmbeddings` classe from the `@langchain/openai` package: - ```typescript - import { AzureOpenAIEmbeddings } from "@langchain/openai"; - ``` -3. 
Update your code to use the new `AzureOpenAIEmbeddings` class and pass the required parameters: - - ```typescript - const model = new AzureOpenAIEmbeddings({ - azureOpenAIApiKey: "", - azureOpenAIApiInstanceName: "", - azureOpenAIApiEmbeddingsDeploymentName: - "", - azureOpenAIApiVersion: "", - }); - ``` - - Notice that the constructor now requires the `azureOpenAIApiInstanceName` parameter instead of the `azureOpenAIEndpoint` parameter, and adds the `azureOpenAIApiVersion` parameter to specify the API version. - - - If you were using Azure Managed Identity, you now need to use the `azureADTokenProvider` parameter to the constructor instead of `credentials`, see the [Azure Managed Identity](#using-azure-managed-identity) section for more details. - - - If you were using environment variables, you now have to set the `AZURE_OPENAI_API_INSTANCE_NAME` environment variable instead of `AZURE_OPENAI_API_ENDPOINT`, and add the `AZURE_OPENAI_API_VERSION` environment variable to specify the API version.