feat(community): Add new LLM provider Novita AI (#7231)
Co-authored-by: jacoblee93 <jacoblee93@gmail.com>
jasonhp and jacoblee93 authored Dec 3, 2024
1 parent 310cac2 commit 54b692a
Showing 5 changed files with 447 additions and 0 deletions.
2 changes: 2 additions & 0 deletions docs/core_docs/.gitignore
@@ -340,6 +340,8 @@ docs/integrations/chat/openai.md
docs/integrations/chat/openai.mdx
docs/integrations/chat/ollama.md
docs/integrations/chat/ollama.mdx
docs/integrations/chat/novita.md
docs/integrations/chat/novita.mdx
docs/integrations/chat/mistral.md
docs/integrations/chat/mistral.mdx
docs/integrations/chat/ibm.md
206 changes: 206 additions & 0 deletions docs/core_docs/docs/integrations/chat/novita.ipynb
@@ -0,0 +1,206 @@
{
"cells": [
{
"cell_type": "raw",
"metadata": {
"vscode": {
"languageId": "raw"
}
},
"source": [
"---\n",
"sidebar_label: Novita AI\n",
"---"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# ChatNovita\n",
"\n",
"Novita AI delivers an affordable, reliable, and simple inference platform for running top LLM models.\n",
"\n",
"You can find all the models we support here: [Novita AI Featured Models](https://novita.ai/model-api/product/llm-api?utm_source=github_langchain&utm_medium=github_readme&utm_campaign=link) or request the [Models API](https://novita.ai/docs/model-api/reference/llm/models.html?utm_source=github_langchain&utm_medium=github_readme&utm_campaign=link) to get all available models.\n",
"\n",
"Try the [Novita AI Llama 3 API Demo](https://novita.ai/model-api/product/llm-api/playground#meta-llama-llama-3.1-8b-instruct?utm_source=github_langchain&utm_medium=github_readme&utm_campaign=link) today!"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Overview\n",
"\n",
"### Model features\n",
"| [Tool calling](/docs/how_to/tool_calling) | [Structured output](/docs/how_to/structured_output/) | JSON mode | [Image input](/docs/how_to/multimodal_inputs/) | Audio input | Video input | [Token-level streaming](/docs/how_to/chat_streaming/) | Native async | [Token usage](/docs/how_to/chat_token_usage_tracking/) | [Logprobs](/docs/how_to/logprobs/) |\n",
"| :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: | :---: |\n",
"| ❌ | ✅ | ✅ | ❌ | ❌ | ❌ | ❌ | ❌ | ✅ | ❌ |"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Setup\n",
"\n",
"To access Novita AI models you'll need to create a Novita account and get an API key.\n",
"\n",
"### Credentials\n",
"\n",
"Head to [this page](https://novita.ai/settings#key-management?utm_source=github_langchain&utm_medium=github_readme&utm_campaign=link) to sign up for Novita AI and generate an API key. Once you've done this, set the \`NOVITA_API_KEY\` environment variable:\n",
"\n",
"```bash\n",
"export NOVITA_API_KEY=\"your-api-key\"\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Installation\n",
"\n",
"The LangChain Novita integration lives in the \`@langchain/community\` package:\n",
"\n",
"```{=mdx}\n",
"import IntegrationInstallTooltip from \"@mdx_components/integration_install_tooltip.mdx\";\n",
"import Npm2Yarn from \"@theme/Npm2Yarn\";\n",
"\n",
"<IntegrationInstallTooltip></IntegrationInstallTooltip>\n",
"\n",
"<Npm2Yarn>\n",
" @langchain/community @langchain/core\n",
"</Npm2Yarn>\n",
"```"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Instantiation\n",
"\n",
"Now we can instantiate our model object and generate chat completions:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"vscode": {
"languageId": "javascript"
}
},
"outputs": [],
"source": [
"import { ChatNovitaAI } from \"@langchain/community/chat_models/novita\";\n",
"\n",
"const llm = new ChatNovitaAI({\n",
" model: \"meta-llama/llama-3.1-8b-instruct\",\n",
" temperature: 0,\n",
" // other params...\n",
"})"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Invocation"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"vscode": {
"languageId": "javascript"
}
},
"outputs": [],
"source": [
"const aiMsg = await llm.invoke([\n",
" {\n",
" role: \"system\",\n",
" content: \"You are a helpful assistant that translates English to French. Translate the user sentence.\",\n",
" },\n",
" {\n",
" role: \"human\",\n",
" content: \"I love programming.\"\n",
" },\n",
"]);"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"vscode": {
"languageId": "javascript"
}
},
"outputs": [],
"source": [
"console.log(aiMsg.content)"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## Chaining\n",
"\n",
"We can [chain](/docs/how_to/sequence) our model with a prompt template like so:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"vscode": {
"languageId": "javascript"
}
},
"outputs": [],
"source": [
"import { ChatPromptTemplate } from \"@langchain/core/prompts\"\n",
"\n",
"const prompt = ChatPromptTemplate.fromMessages(\n",
" [\n",
" [\n",
" \"system\",\n",
" \"You are a helpful assistant that translates {input_language} to {output_language}.\",\n",
" ],\n",
" [\"human\", \"{input}\"],\n",
" ]\n",
")\n",
"\n",
"const chain = prompt.pipe(llm);\n",
"await chain.invoke(\n",
" {\n",
" input_language: \"English\",\n",
" output_language: \"German\",\n",
" input: \"I love programming.\",\n",
" }\n",
")"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"## API reference\n",
"\n",
"For detailed documentation of Novita AI LLM APIs, head to the [Novita AI LLM API reference](https://novita.ai/docs/model-api/reference/llm/llm.html?utm_source=github_langchain&utm_medium=github_readme&utm_campaign=link).\n"
]
}
],
"metadata": {
"language_info": {
"name": "javascript"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
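The credential lookup described in the notebook's Setup section (and implemented in the `ChatNovitaAI` constructor in this commit) resolves the key from an explicit field before falling back to the environment. A minimal standalone sketch of that order, assuming plain JavaScript; `resolveNovitaApiKey` is a hypothetical helper, not part of the commit:

```javascript
// Sketch of the API key resolution order: an explicit `apiKey` wins,
// then `novitaApiKey`, then the NOVITA_API_KEY environment variable.
function resolveNovitaApiKey(fields = {}, env = process.env) {
  const key = fields.apiKey || fields.novitaApiKey || env.NOVITA_API_KEY;
  if (!key) {
    throw new Error("Novita API key not found.");
  }
  return key;
}

// The environment variable is only a fallback:
console.log(resolveNovitaApiKey({}, { NOVITA_API_KEY: "sk-env" })); // "sk-env"
console.log(resolveNovitaApiKey({ apiKey: "sk-field" }, { NOVITA_API_KEY: "sk-env" })); // "sk-field"
```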
1 change: 1 addition & 0 deletions libs/langchain-community/langchain.config.js
@@ -178,6 +178,7 @@ export const config = {
"chat_models/llama_cpp": "chat_models/llama_cpp",
"chat_models/minimax": "chat_models/minimax",
"chat_models/moonshot": "chat_models/moonshot",
"chat_models/novita": "chat_models/novita",
"chat_models/ollama": "chat_models/ollama",
"chat_models/portkey": "chat_models/portkey",
"chat_models/premai": "chat_models/premai",
147 changes: 147 additions & 0 deletions libs/langchain-community/src/chat_models/novita.ts
@@ -0,0 +1,147 @@
import type {
BaseChatModelParams,
LangSmithParams,
} from "@langchain/core/language_models/chat_models";
import {
type OpenAIClient,
type ChatOpenAICallOptions,
type OpenAIChatInput,
type OpenAICoreRequestOptions,
ChatOpenAI,
} from "@langchain/openai";
import { getEnvironmentVariable } from "@langchain/core/utils/env";

type NovitaUnsupportedArgs =
| "frequencyPenalty"
| "presencePenalty"
| "logitBias"
| "functions";

type NovitaUnsupportedCallOptions = "functions" | "function_call";

export interface ChatNovitaCallOptions
extends Omit<ChatOpenAICallOptions, NovitaUnsupportedCallOptions> {
response_format: {
type: "json_object";
schema: Record<string, unknown>;
};
}

export interface ChatNovitaInput
extends Omit<OpenAIChatInput, "openAIApiKey" | NovitaUnsupportedArgs>,
BaseChatModelParams {
/**
* Novita API key
* @default process.env.NOVITA_API_KEY
*/
novitaApiKey?: string;
/**
* Alias for `novitaApiKey`; takes precedence when both are set.
* @default process.env.NOVITA_API_KEY
*/
apiKey?: string;
}

/**
 * Novita chat model integration that calls Novita AI's
 * OpenAI-compatible chat completions API.
 */
export class ChatNovitaAI extends ChatOpenAI<ChatNovitaCallOptions> {
static lc_name() {
return "ChatNovita";
}

_llmType() {
return "novita";
}

get lc_secrets(): { [key: string]: string } | undefined {
return {
novitaApiKey: "NOVITA_API_KEY",
apiKey: "NOVITA_API_KEY",
};
}

lc_serializable = true;

constructor(
fields?: Partial<
Omit<OpenAIChatInput, "openAIApiKey" | NovitaUnsupportedArgs>
> &
BaseChatModelParams & {
novitaApiKey?: string;
apiKey?: string;
}
) {
const novitaApiKey =
fields?.apiKey ||
fields?.novitaApiKey ||
getEnvironmentVariable("NOVITA_API_KEY");

if (!novitaApiKey) {
throw new Error(
`Novita API key not found. Please set the NOVITA_API_KEY environment variable or pass the key via the "novitaApiKey" or "apiKey" field.`
);
}

super({
...fields,
model: fields?.model || "gryphe/mythomax-l2-13b",
apiKey: novitaApiKey,
configuration: {
baseURL: "https://api.novita.ai/v3/openai/",
},
});
}

getLsParams(options: this["ParsedCallOptions"]): LangSmithParams {
const params = super.getLsParams(options);
params.ls_provider = "novita";
return params;
}

toJSON() {
const result = super.toJSON();

if (
"kwargs" in result &&
typeof result.kwargs === "object" &&
result.kwargs != null
) {
delete result.kwargs.openai_api_key;
delete result.kwargs.configuration;
}

return result;
}

async completionWithRetry(
request: OpenAIClient.Chat.ChatCompletionCreateParamsStreaming,
options?: OpenAICoreRequestOptions
): Promise<AsyncIterable<OpenAIClient.Chat.Completions.ChatCompletionChunk>>;

async completionWithRetry(
request: OpenAIClient.Chat.ChatCompletionCreateParamsNonStreaming,
options?: OpenAICoreRequestOptions
): Promise<OpenAIClient.Chat.Completions.ChatCompletion>;

async completionWithRetry(
request:
| OpenAIClient.Chat.ChatCompletionCreateParamsStreaming
| OpenAIClient.Chat.ChatCompletionCreateParamsNonStreaming,
options?: OpenAICoreRequestOptions
): Promise<
| AsyncIterable<OpenAIClient.Chat.Completions.ChatCompletionChunk>
| OpenAIClient.Chat.Completions.ChatCompletion
> {
delete request.frequency_penalty;
delete request.presence_penalty;
delete request.logit_bias;
delete request.functions;

// This branch narrows the union type so the streaming overload of
// super.completionWithRetry is selected for streaming requests.
if (request.stream === true) {
return super.completionWithRetry(request, options);
}

return super.completionWithRetry(request, options);
}
}
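The `completionWithRetry` override above deletes parameters Novita's endpoint does not accept before delegating to the OpenAI-compatible client. That stripping step can be sketched on its own; `stripUnsupportedParams` is a hypothetical helper, not part of the commit (the actual override mutates the request in place rather than copying it):

```javascript
// Sketch: drop OpenAI parameters that Novita's endpoint rejects,
// mirroring the deletions in ChatNovitaAI.completionWithRetry.
const NOVITA_UNSUPPORTED_PARAMS = [
  "frequency_penalty",
  "presence_penalty",
  "logit_bias",
  "functions",
];

function stripUnsupportedParams(request) {
  const sanitized = { ...request };
  for (const key of NOVITA_UNSUPPORTED_PARAMS) {
    delete sanitized[key];
  }
  return sanitized;
}

const sanitized = stripUnsupportedParams({
  model: "meta-llama/llama-3.1-8b-instruct",
  messages: [{ role: "user", content: "Hello" }],
  frequency_penalty: 0.5, // dropped
  logit_bias: { 50256: -100 }, // dropped
});
console.log(Object.keys(sanitized)); // [ 'model', 'messages' ]
```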
