
Different request methods lead to different results. #142

Closed
djshowtime opened this issue Dec 2, 2022 · 1 comment

djshowtime commented Dec 2, 2022

Hi,

I sent requests through the Python client library "openai" and through a raw REST request, all with default parameters.

With identical input text, the two returned different results. Besides the difference in the output text, "finish_reason" also differed: one was "stop" and the other was "length".

Could you please let me know the reason why the results were different?

hallacy (Collaborator) commented Dec 2, 2022

Hi @djshowtime! Thanks for writing in.

If I had to take a guess, it sounds like you're running into our model's non-determinism. If you set temperature to 0, you should see the same response from both openai-python and raw REST requests. See https://beta.openai.com/docs/quickstart/adjust-your-settings for more information.
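A minimal sketch of that suggestion, assuming the legacy Completions API of that era (the model name and `YOUR_API_KEY` are placeholders, not taken from the thread). The point is that both the Python client and a raw REST call ultimately send the same JSON body, so pinning `temperature` to 0 in that body makes decoding greedy and the two paths should agree:

```python
import json

# The request body shared by both access paths. temperature=0 selects the
# most likely token at each step, so repeated calls return the same text.
payload = {
    "model": "text-davinci-003",  # placeholder model name
    "prompt": "Say hello.",
    "temperature": 0,
    "max_tokens": 16,
}

# Via the Python client (not executed here; requires an API key):
#   import openai
#   openai.api_key = "YOUR_API_KEY"
#   resp = openai.Completion.create(**payload)

# Via a raw REST request (equivalent body, same endpoint):
#   import requests
#   resp = requests.post(
#       "https://api.openai.com/v1/completions",
#       headers={
#           "Authorization": "Bearer YOUR_API_KEY",
#           "Content-Type": "application/json",
#       },
#       data=json.dumps(payload),
#   )

print(json.dumps(payload))
```

Note that a `finish_reason` of "length" means the completion hit the `max_tokens` limit, while "stop" means the model finished on its own, so differing token limits between the two request paths could also explain that part of the difference.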

Does that help?

borisdayma pushed a commit to borisdayma/openai-python that referenced this issue Dec 7, 2022
* overload output type depending on stream literal (openai#142)

* Bump to v22

* [numpy] change version (openai#143)

* [numpy] change version

* update comments

* no version for numpy

* Fix timeouts (openai#137)

* Fix timeouts

* Rename to request_timeout and add to readme

* Dev/hallacy/request timeout takes tuples (openai#144)

* Add tuple typing for request_timeout

* imports

* [api_requestor] Log request_id with response (openai#145)

* Only import wandb as needed (openai#146)

Co-authored-by: Felipe Petroski Such <felipe@openai.com>
Co-authored-by: Henrique Oliveira Pinto <hponde@openai.com>
Co-authored-by: Rachel Lim <rachel@openai.com>
cgayapr pushed a commit to cgayapr/openai-python that referenced this issue Dec 14, 2024