
feat(beta): add streaming and function calling helpers #409

Merged
merged 1 commit into next from feat-beta-add-streaming-and-func on Oct 30, 2023

Conversation

stainless-bot (Collaborator)

No description provided.

@stainless-bot stainless-bot merged commit 510c1f3 into next Oct 30, 2023
@stainless-bot stainless-bot deleted the feat-beta-add-streaming-and-func branch October 30, 2023 19:20
@stainless-bot stainless-bot mentioned this pull request Oct 30, 2023
@athyuttamre (Collaborator) left a comment

Post-merge review with some non-blocking comments cc @rattrayalex @RobertCraigie

```
});

// or, equivalently:
for await (const part of stream) {
```

Collaborator

`const chunk of stream`?
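
For reference, a minimal sketch of consuming the stream through its async iterator. The README text in this diff writes the helper as `openai.chat.completions.stream()`; published versions of the SDK expose it under `openai.beta.chat.completions`, which is what the sketch below assumes (model and prompt are placeholders):

```ts
import OpenAI from 'openai';

const openai = new OpenAI();

async function main() {
  const stream = openai.beta.chat.completions.stream({
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: 'Say hello' }],
  });

  // Each iteration yields a ChatCompletionChunk, which is why `chunk` reads
  // more naturally than `part` as the loop variable.
  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content ?? '');
  }
}

main().catch(console.error);
```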

`openai.chat.completions.stream()` returns a `ChatCompletionStreamingRunner`, which emits events, has an async
iterator, and exposes a helper methods to accumulate chunks into a convenient shape and make it easy to reason

Collaborator

typo: "a helper methods" -> "helper methods"
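
To make the runner surface described above concrete, a small sketch of the event-emitter and accumulation helpers (same assumptions as the previous sketch: the `openai.beta.chat.completions` surface of published SDK versions, placeholder model and prompt):

```ts
import OpenAI from 'openai';

const openai = new OpenAI();

async function main() {
  const stream = openai.beta.chat.completions.stream({
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: 'Write a one-line haiku' }],
  });

  // Event style: fires once per content delta as chunks arrive.
  stream.on('content', (delta) => process.stdout.write(delta));

  // Helper method: waits for the stream to finish and returns the chunks
  // accumulated into a single ChatCompletion.
  const completion = await stream.finalChatCompletion();
  console.log('\n', completion.choices[0].message.content);
}

main().catch(console.error);
```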

```
openai.chat.completions.runFunctions({ stream: true, … }, options?): ChatCompletionStreamingRunner
```

`openai.chat.completions.runFunctions()` returns either a Runner for automating function calls with chat

Collaborator

typo: "either" feels out of place
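
For context on what this runner automates, a sketch of `runFunctions` driving the function-call loop (same `openai.beta.chat.completions` assumption as above; `getWeather` and its JSON-schema parameters are made up for illustration):

```ts
import OpenAI from 'openai';

const openai = new OpenAI();

// Made-up function the model is allowed to call.
async function getWeather(args: { city: string }) {
  return { city: args.city, forecast: 'sunny', temperatureC: 21 };
}

async function main() {
  const runner = openai.beta.chat.completions.runFunctions({
    model: 'gpt-3.5-turbo',
    messages: [{ role: 'user', content: 'How is the weather in Amsterdam?' }],
    functions: [
      {
        function: getWeather, // invoked automatically when the model calls it
        parse: JSON.parse, // parses the model's JSON arguments before the call
        parameters: {
          type: 'object',
          properties: { city: { type: 'string' } },
        },
      },
    ],
  });

  // Resolves once the model is done calling functions and has produced
  // its final answer.
  console.log(await runner.finalContent());
}

main().catch(console.error);
```

Passing `stream: true` in the same call yields the `ChatCompletionStreamingRunner` variant shown in the signature above.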

```
model: string;
}

export namespace ChatCompletionSnapshot {
```

Collaborator

Are these auto generated or manually kept in sync?

Collaborator

These are manually defined, everything in the src/lib folder is not touched by codegen and is manually written on our end.

```
functions: RunnableFunctions<FunctionsArgs>;
};

export class ChatCompletionRunner extends AbstractChatCompletionRunner<ChatCompletionRunnerEvents> {
```

Collaborator

Seems fine but just curious why we have this abstract class separation?

Collaborator

There are some subtle differences between `ChatCompletionStream`, which does streaming but not function running, and `ChatCompletionRunner`, which does function running but not streaming… but tbh I think they've shrunk over time and we could perhaps get rid of this separation. Let me give it a try.

Collaborator

ah, yeah, the types are different which is intentional, best to keep separate.
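
To illustrate the separation being discussed, here is a purely illustrative sketch (not the SDK's actual code): a shared abstract base holds the common event and accumulation plumbing, while the two concrete classes deliberately expose different public types, chunks for the stream and whole messages for the function runner.

```ts
// Sketch only: names and shapes are illustrative, not the library's real types.
type ChunkSketch = { delta: string };
type MessageSketch = { role: 'assistant'; content: string };

abstract class AbstractRunnerSketch {
  private listeners: Record<string, Array<(...args: any[]) => void>> = {};

  // Shared plumbing (event registration/emission) lives here so both
  // subclasses inherit it.
  on(event: string, listener: (...args: any[]) => void): this {
    (this.listeners[event] ??= []).push(listener);
    return this;
  }

  protected emit(event: string, ...args: any[]): void {
    for (const listener of this.listeners[event] ?? []) listener(...args);
  }
}

// Streaming, but no automatic function running: yields raw chunks.
class StreamSketch extends AbstractRunnerSketch {
  async *[Symbol.asyncIterator](): AsyncGenerator<ChunkSketch> {
    yield { delta: 'hello ' };
    yield { delta: 'world' };
  }
}

// Function running, but no streaming: deals in whole messages.
class RunnerSketch extends AbstractRunnerSketch {
  async finalMessage(): Promise<MessageSketch> {
    return { role: 'assistant', content: 'done' };
  }
}
```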

4 participants