feat(beta): add streaming and function calling helpers #409

Merged 1 commit on Oct 30, 2023
115 changes: 114 additions & 1 deletion README.md
@@ -21,7 +21,7 @@ You can import in Deno via:
<!-- x-release-please-start-version -->

```ts
import OpenAI from 'https://raw.githubusercontent.com/openai/openai-node/v4.14.1-deno/mod.ts';
import OpenAI from 'https://raw.githubusercontent.com/openai/openai-node/v4.14.2-deno/mod.ts';
```

<!-- x-release-please-end -->
@@ -102,6 +102,119 @@ Documentation for each method, request param, and response field are available i
> [!IMPORTANT]
> Previous versions of this SDK used a `Configuration` class. See the [v3 to v4 migration guide](https://github.com/openai/openai-node/discussions/217).

### Streaming responses

This library provides several conveniences for streaming chat completions, for example:

```ts
import OpenAI from 'openai';

const openai = new OpenAI();

async function main() {
const stream = await openai.beta.chat.completions.stream({
model: 'gpt-4',
messages: [{ role: 'user', content: 'Say this is a test' }],
stream: true,
});

stream.on('content', (delta, snapshot) => {
process.stdout.write(delta);
});

// or, equivalently:
for await (const part of stream) {
process.stdout.write(part.choices[0]?.delta?.content || '');
}

const chatCompletion = await stream.finalChatCompletion();
console.log(chatCompletion); // {id: "…", choices: […], …}
}

main();
```

Streaming with `openai.beta.chat.completions.stream({…})` exposes
[various helpers for your convenience](helpers.md#events) including event handlers and promises.

Alternatively, you can use `openai.chat.completions.create({ stream: true, … })`
which only returns an async iterable of the chunks in the stream and thus uses less memory
(it does not build up a final chat completion object for you).

If you need to cancel a stream, you can `break` from a `for await` loop or call `stream.abort()`.
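The consumption and cancellation pattern above can be sketched without hitting the API. Here `fakeStream` is a stand-in (not part of the SDK) for the async iterable of chunks that `create({ stream: true, … })` returns; the chunk shape mirrors the `choices[0]?.delta?.content` access used in the example above:

```typescript
// Stand-in for the async iterable returned by
// `openai.chat.completions.create({ stream: true, … })`.
type Chunk = { choices: { delta: { content?: string } }[] };

async function* fakeStream(): AsyncGenerator<Chunk> {
  for (const piece of ['This ', 'is ', 'a ', 'test']) {
    yield { choices: [{ delta: { content: piece } }] };
  }
}

// Accumulate deltas from the stream; stopping early is just a `break`.
async function collect(stream: AsyncIterable<Chunk>, maxChunks = Infinity): Promise<string> {
  let text = '';
  let seen = 0;
  for await (const chunk of stream) {
    text += chunk.choices[0]?.delta?.content ?? '';
    // `break` is all it takes to cancel consumption of the stream.
    if (++seen >= maxChunks) break;
  }
  return text;
}

collect(fakeStream()).then((full) => console.log(full)); // "This is a test"
collect(fakeStream(), 2).then((partial) => console.log(partial)); // "This is "
```

Because nothing is accumulated beyond what your loop keeps, this raw-iterable style stays memory-light compared to the `.stream()` helper, which snapshots the full completion for you.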

### Automated function calls

We provide an `openai.beta.chat.completions.runFunctions({…})` convenience helper for using function calls
with the `/chat/completions` endpoint. It automatically calls the JavaScript functions you provide,
sends their results back to the `/chat/completions` endpoint,
and loops as long as the model requests function calls.

If you pass a `parse` function, it will automatically parse the `arguments` for you and return any parsing errors to the model to attempt auto-recovery. Otherwise, the arguments will be passed to the function you provide as a string.

If you pass `function_call: {name: …}` instead of `auto`, it returns immediately after calling that function (and only loops to auto-recover parsing errors).

```ts
import OpenAI from 'openai';

const client = new OpenAI();

async function main() {
const runner = client.beta.chat.completions
.runFunctions({
model: 'gpt-3.5-turbo',
messages: [{ role: 'user', content: 'How is the weather this week?' }],
functions: [
{
function: getCurrentLocation,
parameters: { type: 'object', properties: {} },
},
{
function: getWeather,
parse: JSON.parse, // or use a validation library like zod for typesafe parsing.
parameters: {
type: 'object',
properties: {
location: { type: 'string' },
},
},
},
],
})
.on('message', (message) => console.log(message));

const finalContent = await runner.finalContent();
console.log();
console.log('Final content:', finalContent);
}

async function getCurrentLocation() {
return 'Boston'; // Simulate lookup
}

async function getWeather(args: { location: string }) {
  const { location } = args;
  // … do lookup …
  const temperature = '50degF'; // placeholder values matching the sample output below
  const precipitation = 'high';
  return { temperature, precipitation };
}

main();

// {role: "user", content: "How is the weather this week?"}
// {role: "assistant", function_call: "getCurrentLocation", arguments: "{}"}
// {role: "function", name: "getCurrentLocation", content: "Boston"}
// {role: "assistant", function_call: "getWeather", arguments: '{"location": "Boston"}'}
// {role: "function", name: "getWeather", content: '{"temperature": "50degF", "precipitation": "high"}'}
// {role: "assistant", content: "It's looking cold and rainy - you might want to wear a jacket!"}
//
// Final content: "It's looking cold and rainy - you might want to wear a jacket!"
```
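As a standalone sketch of what a `parse` function can look like (independent of zod — `parseWeatherArgs` is a made-up helper, not part of the SDK): it returns the validated arguments, or throws an error whose message can be reported back to the model for auto-recovery.

```typescript
// A throwing `parse` function: validate the raw `arguments` string and
// either return typed args or throw a descriptive error.
function parseWeatherArgs(input: string): { location: string } {
  const args = JSON.parse(input); // throws on malformed JSON
  if (typeof args?.location !== 'string') {
    throw new Error('expected a JSON object with a string "location" property');
  }
  return { location: args.location };
}

console.log(parseWeatherArgs('{"location": "Boston"}')); // { location: "Boston" }
```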

Like with `.stream()`, we provide a variety of [helpers and events](helpers.md#events).

Read more about examples such as integrating with [zod](helpers.md#integrate-with-zod),
[next.js](helpers.md#integrate-with-next-js), and [proxying a stream to the browser](helpers.md#proxy-streaming-to-a-browser).

## File Uploads

Request parameters that correspond to file uploads can be passed in many different forms:
11 changes: 11 additions & 0 deletions api.md
@@ -156,3 +156,14 @@ Methods:
- <code title="get /fine-tunes">client.fineTunes.<a href="./src/resources/fine-tunes.ts">list</a>() -> FineTunesPage</code>
- <code title="post /fine-tunes/{fine_tune_id}/cancel">client.fineTunes.<a href="./src/resources/fine-tunes.ts">cancel</a>(fineTuneId) -> FineTune</code>
- <code title="get /fine-tunes/{fine_tune_id}/events">client.fineTunes.<a href="./src/resources/fine-tunes.ts">listEvents</a>(fineTuneId, { ...params }) -> FineTuneEventsListResponse</code>

# Beta

## Chat

### Completions

Methods:

- <code>client.beta.chat.completions.<a href="./src/resources/beta/chat/completions.ts">runFunctions</a>(body, options?) -> ChatCompletionRunner | ChatCompletionStreamingRunner</code>
- <code>client.beta.chat.completions.<a href="./src/resources/beta/chat/completions.ts">stream</a>(body, options?) -> ChatCompletionStream</code>
88 changes: 87 additions & 1 deletion ecosystem-tests/node-ts-cjs-auto/tests/test.ts
@@ -1,4 +1,4 @@
import OpenAI, { toFile } from 'openai';
import OpenAI, { APIUserAbortError, toFile } from 'openai';
import { TranscriptionCreateParams } from 'openai/resources/audio/transcriptions';
import fetch from 'node-fetch';
import { File as FormDataFile, Blob as FormDataBlob } from 'formdata-node';
@@ -68,6 +68,92 @@ it(`streaming works`, async function () {
expect(chunks.map((c) => c.choices[0]?.delta.content || '').join('')).toBeSimilarTo('This is a test', 10);
});

it(`ChatCompletionStream works`, async function () {
const chunks: OpenAI.Chat.ChatCompletionChunk[] = [];
const contents: [string, string][] = [];
const messages: OpenAI.Chat.ChatCompletionMessage[] = [];
const chatCompletions: OpenAI.Chat.ChatCompletion[] = [];
let finalContent: string | undefined;
let finalMessage: OpenAI.Chat.ChatCompletionMessage | undefined;
let finalChatCompletion: OpenAI.Chat.ChatCompletion | undefined;

const stream = client.beta.chat.completions
.stream({
model: 'gpt-4',
messages: [{ role: 'user', content: 'Say this is a test' }],
})
.on('chunk', (chunk) => chunks.push(chunk))
.on('content', (delta, snapshot) => contents.push([delta, snapshot]))
.on('message', (message) => messages.push(message))
.on('chatCompletion', (completion) => chatCompletions.push(completion))
.on('finalContent', (content) => (finalContent = content))
.on('finalMessage', (message) => (finalMessage = message))
.on('finalChatCompletion', (completion) => (finalChatCompletion = completion));
const content = await stream.finalContent();

expect(content).toBeSimilarTo('This is a test', 10);
expect(chunks.length).toBeGreaterThan(0);
expect(contents.length).toBeGreaterThan(0);
for (const chunk of chunks) {
expect(chunk.id).toEqual(finalChatCompletion?.id);
expect(chunk.created).toEqual(finalChatCompletion?.created);
expect(chunk.model).toEqual(finalChatCompletion?.model);
}
expect(finalContent).toEqual(content);
expect(contents.at(-1)?.[1]).toEqual(content);
expect(finalMessage?.content).toEqual(content);
expect(finalChatCompletion?.choices?.[0]?.message.content).toEqual(content);
expect(messages).toEqual([finalMessage]);
expect(chatCompletions).toEqual([finalChatCompletion]);
expect(await stream.finalContent()).toEqual(content);
expect(await stream.finalMessage()).toEqual(finalMessage);
expect(await stream.finalChatCompletion()).toEqual(finalChatCompletion);
});

it(`aborting ChatCompletionStream works`, async function () {
const chunks: OpenAI.Chat.ChatCompletionChunk[] = [];
const contents: [string, string][] = [];
const messages: OpenAI.Chat.ChatCompletionMessage[] = [];
const chatCompletions: OpenAI.Chat.ChatCompletion[] = [];
let finalContent: string | undefined;
let finalMessage: OpenAI.Chat.ChatCompletionMessage | undefined;
let finalChatCompletion: OpenAI.Chat.ChatCompletion | undefined;
let emittedError: any;
let caughtError: any;
const controller = new AbortController();
const stream = client.beta.chat.completions
.stream(
{
model: 'gpt-4',
messages: [{ role: 'user', content: 'Say this is a test' }],
},
{ signal: controller.signal },
)
.on('error', (e) => (emittedError = e))
.on('chunk', (chunk) => chunks.push(chunk))
.on('content', (delta, snapshot) => {
contents.push([delta, snapshot]);
controller.abort();
})
.on('message', (message) => messages.push(message))
.on('chatCompletion', (completion) => chatCompletions.push(completion))
.on('finalContent', (content) => (finalContent = content))
.on('finalMessage', (message) => (finalMessage = message))
.on('finalChatCompletion', (completion) => (finalChatCompletion = completion));
try {
await stream.finalContent();
} catch (error) {
caughtError = error;
}
expect(caughtError).toBeInstanceOf(APIUserAbortError);
expect(finalContent).toBeUndefined();
expect(finalMessage).toBeUndefined();
expect(finalChatCompletion).toBeUndefined();
expect(chatCompletions).toEqual([]);
expect(chunks.length).toBeGreaterThan(0);
expect(contents.length).toBeGreaterThan(0);
});

it('handles formdata-node File', async function () {
const file = await fetch(url)
.then((x) => x.arrayBuffer())
2 changes: 2 additions & 0 deletions examples/.gitignore
@@ -0,0 +1,2 @@
yarn.lock
node_modules
142 changes: 142 additions & 0 deletions examples/function-call-diy.ts
@@ -0,0 +1,142 @@
#!/usr/bin/env -S npm run tsn -T

import OpenAI from 'openai';
import { ChatCompletionMessage, ChatCompletionMessageParam } from 'openai/resources/chat';

// gets API Key from environment variable OPENAI_API_KEY
const openai = new OpenAI();

const functions: OpenAI.Chat.ChatCompletionCreateParams.Function[] = [
{
name: 'list',
description: 'list queries books by genre, and returns a list of names of books',
parameters: {
type: 'object',
properties: {
genre: { type: 'string', enum: ['mystery', 'nonfiction', 'memoir', 'romance', 'historical'] },
},
},
},
{
name: 'search',
description: 'search queries books by their name and returns a list of book names and their ids',
parameters: {
type: 'object',
properties: {
name: { type: 'string' },
},
},
},
{
name: 'get',
description:
"get returns a book's detailed information based on the id of the book. Note that this does not accept names, and only IDs, which you can get by using search.",
parameters: {
type: 'object',
properties: {
id: { type: 'string' },
},
},
},
];

async function callFunction(function_call: ChatCompletionMessage.FunctionCall): Promise<any> {
const args = JSON.parse(function_call.arguments!);
switch (function_call.name) {
case 'list':
return await list(args['genre']);

case 'search':
return await search(args['name']);

case 'get':
return await get(args['id']);

default:
throw new Error('No function found');
}
}

async function main() {
const messages: ChatCompletionMessageParam[] = [
{
role: 'system',
content:
'Please use our book database, which you can access using functions to answer the following questions.',
},
{
role: 'user',
content:
'I really enjoyed reading To Kill a Mockingbird, could you recommend me a book that is similar and tell me why?',
},
];
console.log(messages[0]);
console.log(messages[1]);
console.log();

while (true) {
const completion = await openai.chat.completions.create({
model: 'gpt-3.5-turbo',
messages,
functions: functions,
});

const message = completion.choices[0]!.message;
messages.push(message);
console.log(message);

// If there is no function call, we're done and can exit this loop
if (!message.function_call) {
return;
}

// If there is a function call, we generate a new message with the role 'function'.
const result = await callFunction(message.function_call);
const newMessage = {
role: 'function' as const,
name: message.function_call.name!,
content: JSON.stringify(result),
};
messages.push(newMessage);

console.log(newMessage);
console.log();
}
}

const db = [
{
id: 'a1',
name: 'To Kill a Mockingbird',
genre: 'historical',
description: `Compassionate, dramatic, and deeply moving, "To Kill A Mockingbird" takes readers to the roots of human behavior - to innocence and experience, kindness and cruelty, love and hatred, humor and pathos. Now with over 18 million copies in print and translated into forty languages, this regional story by a young Alabama woman claims universal appeal. Harper Lee always considered her book to be a simple love story. Today it is regarded as a masterpiece of American literature.`,
},
{
id: 'a2',
name: 'All the Light We Cannot See',
genre: 'historical',
description: `In a mining town in Germany, Werner Pfennig, an orphan, grows up with his younger sister, enchanted by a crude radio they find that brings them news and stories from places they have never seen or imagined. Werner becomes an expert at building and fixing these crucial new instruments and is enlisted to use his talent to track down the resistance. Deftly interweaving the lives of Marie-Laure and Werner, Doerr illuminates the ways, against all odds, people try to be good to one another.`,
},
{
id: 'a3',
name: 'Where the Crawdads Sing',
genre: 'historical',
description: `For years, rumors of the “Marsh Girl” haunted Barkley Cove, a quiet fishing village. Kya Clark is barefoot and wild; unfit for polite society. So in late 1969, when the popular Chase Andrews is found dead, locals immediately suspect her.

But Kya is not what they say. A born naturalist with just one day of school, she takes life's lessons from the land, learning the real ways of the world from the dishonest signals of fireflies. But while she has the skills to live in solitude forever, the time comes when she yearns to be touched and loved. Drawn to two young men from town, who are each intrigued by her wild beauty, Kya opens herself to a new and startling world—until the unthinkable happens.`,
},
];

async function list(genre: string) {
return db.filter((item) => item.genre === genre).map((item) => ({ name: item.name, id: item.id }));
}

async function search(name: string) {
return db.filter((item) => item.name.includes(name)).map((item) => ({ name: item.name, id: item.id }));
}

async function get(id: string) {
return db.find((item) => item.id === id)!;
}

main();