
adding max_tokens to cli #8

Closed
wants to merge 1 commit into from

Conversation


@ctr26 ctr26 commented Jul 2, 2023

Adding max_tokens to the CLI, as using gpt-3.5 crashes when the context length reaches 10k tokens.

@joshpxyne (Owner)

@ctr26 This will be solved with #2 - we'll have a mapping of model -> context window (max tokens), and we'll break down files and prompts accordingly.
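The mapping described above could look something like this. This is a hypothetical sketch, not the project's actual code: the dictionary name, helper function, and fallback value are all assumptions, though the per-model token counts match OpenAI's published context windows from mid-2023.

```python
# Hypothetical model -> context window (max tokens) mapping; illustrative only.
MODEL_CONTEXT_WINDOWS = {
    "gpt-3.5-turbo": 4096,
    "gpt-3.5-turbo-16k": 16384,
    "gpt-4": 8192,
    "gpt-4-32k": 32768,
}

def max_completion_tokens(model: str, prompt_tokens: int) -> int:
    """Return how many completion tokens still fit in the model's context window."""
    window = MODEL_CONTEXT_WINDOWS.get(model, 4096)  # assumed conservative fallback
    return max(window - prompt_tokens, 0)
```

Breaking down files and prompts would then amount to keeping each chunk under `max_completion_tokens` for the selected model.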



gianpaj commented Jul 4, 2023

I don't have access to gpt-4-32k. How can I use gpt-4? I get this error even with these changes:

openai.error.InvalidRequestError: This model's maximum context length is 8192 tokens. However, you requested 10601 tokens (601 in the messages, 10000 in the completion). Please reduce the length of the messages or completion.
        max_tokens: int = typer.Option(8192),
    ):
        ai = AI(
            model=model,
            temperature=temperature,
            max_tokens=int(max_tokens),
        )

@joshpxyne (Owner)

@gianpaj The output also contributes to the number of tokens. If your model has a max context window of 8k, you're probably better off making max_tokens 4k or so.
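The arithmetic behind the error above can be made explicit. The numbers come straight from the error message; the variable names are mine:

```python
# Prompt tokens plus requested completion tokens must fit in the context window.
CONTEXT_WINDOW = 8192   # gpt-4 context window, per the error message
PROMPT_TOKENS = 601     # "601 in the messages", per the error message
REQUESTED = 10000       # max_tokens requested, per the error message

total = PROMPT_TOKENS + REQUESTED   # 10601 > 8192, hence InvalidRequestError

# A safe setting leaves room for the prompt:
safe_max_tokens = CONTEXT_WINDOW - PROMPT_TOKENS
print(safe_max_tokens)  # 7591
```

In practice the prompt size varies per request, which is why a fixed budget well below the window (e.g. 4k, as suggested above) is the simpler fix.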


@Ran-Mewo

Is it also possible to add something that can change the OpenAI base URL? That way this could work with the Microsoft Azure OpenAI endpoint or with proxies.

@joshpxyne (Owner)

@Ran-Mewo Yes, definitely. I'll try to get to this later - feel free to also submit a PR for this if you'd like.
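A configurable base URL could be resolved like this. This is a hypothetical sketch: the `--base-url` CLI option and the use of the `OPENAI_API_BASE` environment variable are assumptions, not anything the project ships.

```python
import os
from typing import Optional

# Hypothetical resolution order for a configurable base URL:
# explicit CLI value, then OPENAI_API_BASE env var, then the public default.
def resolve_api_base(cli_value: Optional[str] = None) -> str:
    """Pick the OpenAI-compatible endpoint to send requests to."""
    return (
        cli_value
        or os.environ.get("OPENAI_API_BASE")
        or "https://api.openai.com/v1"
    )
```

With the pre-1.0 `openai` Python SDK (current when this thread was written), the result would be assigned to `openai.api_base`, letting the same CLI target Azure OpenAI or a local proxy.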

@joshpxyne joshpxyne closed this Oct 25, 2023
6 participants