
o1 streaming error #31

Closed
simonw opened this issue Dec 20, 2024 · 1 comment · Fixed by #32
Comments

simonw (Contributor) commented Dec 20, 2024

I tried this:

% gh models run o1
>>> hi

And got this:

Error: bad request
{
  "error": {
    "message": "Unsupported value: 'stream' does not support true with this model. Supported values are: false.",
    "type": "invalid_request_error",
    "param": "stream",
    "code": "unsupported_value"
  }
}

It looks like o1 doesn't support streaming, but `gh models run` requests streaming for it anyway.

simonw (Contributor, Author) commented Dec 20, 2024

I think this code needs updating to also check for o1:

func (c *AzureClient) GetChatCompletionStream(ctx context.Context, req ChatCompletionOptions) (*ChatCompletionResponse, error) {
	// Check if the model name is `o1-mini` or `o1-preview`
	if req.Model == "o1-mini" || req.Model == "o1-preview" {
		req.Stream = false
	} else {
		req.Stream = true
	}
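One way to avoid enumerating every model name with `==` comparisons is a prefix check, so `o1` itself is covered along with `o1-mini` and `o1-preview`. A minimal sketch (the `supportsStreaming` helper and the list of non-streaming models are assumptions for illustration, not the actual gh-models code):

```go
package main

import (
	"fmt"
	"strings"
)

// supportsStreaming reports whether a model accepts stream=true.
// Assumption for illustration: every model whose name starts with "o1"
// (o1, o1-mini, o1-preview) rejects streaming; all others allow it.
func supportsStreaming(model string) bool {
	return !strings.HasPrefix(model, "o1")
}

func main() {
	for _, m := range []string{"o1", "o1-mini", "o1-preview", "gpt-4o"} {
		fmt.Printf("%s: stream=%v\n", m, supportsStreaming(m))
	}
}
```

A prefix check is broader than the original equality checks, so it would also catch future `o1-*` variants, at the cost of wrongly matching any hypothetical non-o1 model whose name happens to start with "o1".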

simonw added a commit to simonw/gh-models that referenced this issue Dec 20, 2024