feat: add moderations endpoint
transitive-bullshit committed Aug 20, 2024
1 parent 3258768 commit d730f72
Showing 3 changed files with 21 additions and 2 deletions.
7 changes: 5 additions & 2 deletions readme.md
@@ -11,13 +11,13 @@ Unfortunately, the official [openai-node](https://github.com/openai/openai-node)
- You want a fast and small client that doesn't patch fetch
- Supports all envs with native fetch: Node 18+, browsers, Deno, Cloudflare Workers, etc
- Package size: `openai-fetch` is [~14kb](https://bundlephobia.com/package/openai-fetch) and `openai` is [~142kb](https://bundlephobia.com/package/openai)
- You only need the chat, completions, and embeddings
- You only need the chat, completions, embeddings, and moderations

### Use `openai-node` if you need:

- Your runtime doesn't have native fetch support
- Your app can't handle native ESM code
- Endpoints other than chat, completions, and embeddings
- Endpoints other than chat, completions, embeddings, and moderations
- Aren't concerned with lib size or fetch patching

## Install
@@ -59,6 +59,9 @@ client.streamCompletion(params: CompletionStreamParams): Promise<CompletionStrea

// Generate one or more embeddings
client.createEmbeddings(params: EmbeddingParams): Promise<EmbeddingResponse>

// Checks for potentially harmful content
client.createModeration(params: ModerationParams): Promise<ModerationResponse>
```
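For reference, a minimal usage sketch of the new method (illustrative, not part of the commit; it assumes a client constructed with an API key per the package's `ConfigOpts`, and the input/result shapes follow OpenAI's moderation API):

```ts
import { OpenAIClient } from 'openai-fetch';

const client = new OpenAIClient({ apiKey: process.env.OPENAI_API_KEY });

// Classify a piece of text; `input` may also be an array of strings.
const response = await client.createModeration({
  input: 'Some text to check for potentially harmful content',
});

// Each result carries a boolean `flagged` plus per-category flags and scores.
console.log(response.results[0].flagged);
```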

### Type Definitions
13 changes: 13 additions & 0 deletions src/openai-client.ts
@@ -13,6 +13,8 @@ import {
type CompletionStreamResponse,
type EmbeddingParams,
type EmbeddingResponse,
type ModerationParams,
type ModerationResponse,
} from './types.js';

export type ConfigOpts = {
@@ -132,4 +134,15 @@ export class OpenAIClient {
.json();
return response;
}

/** Given some input text, outputs if the model classifies it as potentially harmful across several categories. */
async createModeration(
params: ModerationParams,
opts?: RequestOpts
): Promise<ModerationResponse> {
const response: OpenAI.ModerationCreateResponse = await this.getApi(opts)
.post('moderations', { json: params })
.json();
return response;
}
}
3 changes: 3 additions & 0 deletions src/types.ts
@@ -83,3 +83,6 @@ export type CompletionStreamResponse = ReadableStream<OpenAI.Completion>;

export type EmbeddingParams = OpenAI.EmbeddingCreateParams;
export type EmbeddingResponse = OpenAI.CreateEmbeddingResponse;

export type ModerationParams = OpenAI.ModerationCreateParams;
export type ModerationResponse = OpenAI.ModerationCreateResponse;
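The new exports are thin aliases for the upstream `openai` definitions, so downstream code can type against them directly. A hypothetical sketch (assuming the package re-exports these types from its entry point, as it does for the other param/response types; the field names come from the `openai` package, not from this commit):

```ts
import type { ModerationParams, ModerationResponse } from 'openai-fetch';

// Hypothetical helper: true if any moderation result was flagged.
function anyFlagged(response: ModerationResponse): boolean {
  return response.results.some((result) => result.flagged);
}

// `input` may be a single string or an array of strings.
const params: ModerationParams = {
  input: ['first string to classify', 'second string to classify'],
};
```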
