
Show async as one of the features on llm logs --options #806

Closed
simonw opened this issue Feb 28, 2025 · 1 comment
Labels
enhancement New feature or request

Comments


simonw commented Feb 28, 2025

Following:

It should look like this:

OpenAI Chat: gpt-4.5-preview (aliases: gpt-4.5)
  Options:
    temperature: float
    max_tokens: int
    top_p: float
    frequency_penalty: float
    presence_penalty: float
    stop: str
    logit_bias: dict, str
    seed: int
    json_object: boolean
  Attachment types:
    image/gif, image/jpeg, image/png, image/webp
  Features:
  - streaming
  - schemas
  - async
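The requested change is to include async among the listed features. As a hypothetical sketch (not LLM's actual internals; the function and flag names here are invented for illustration), rendering that Features block from per-model capability flags could look like:

```python
# Hypothetical sketch: build the indented "Features" lines shown in the
# desired output above from boolean capability flags. The flag names
# (can_stream, supports_schemas, has_async) are assumptions, not LLM's API.

def render_features(can_stream: bool, supports_schemas: bool, has_async: bool) -> str:
    """Return the Features section as shown in the expected output."""
    features = []
    if can_stream:
        features.append("streaming")
    if supports_schemas:
        features.append("schemas")
    if has_async:
        features.append("async")
    lines = ["  Features:"]
    lines.extend(f"  - {name}" for name in features)
    return "\n".join(lines)

print(render_features(True, True, True))
```

With all three flags set, this prints the same three-item Features list the issue asks for, with async last.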
@simonw simonw added the enhancement New feature or request label Feb 28, 2025
@simonw simonw closed this as completed in b829cd9 Feb 28, 2025
simonw commented Feb 28, 2025
