
server : enable cache_prompt by default #10501

Merged: 1 commit from gg/server-enable-cache-prompt into master on Nov 25, 2024

Conversation

ggerganov (Owner) commented:

It's almost always better to have `cache_prompt: true`, so this change enables it by default.
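For context, `cache_prompt` tells the server to reuse the KV cache from the previous request when the new prompt shares a common prefix, so that prefix does not have to be re-evaluated. Below is a minimal client-side sketch (not part of the PR; it assumes a server listening at http://localhost:8080 and the standard `/completion` JSON API) showing how the flag was passed explicitly before this change, and how it can still be overridden per request afterwards:

```python
# Sketch only: calling the llama.cpp server's /completion endpoint.
# Assumes a server running locally on port 8080.
import json
import urllib.request


def complete(prompt: str, cache_prompt: bool | None = None,
             url: str = "http://localhost:8080/completion") -> str:
    payload = {"prompt": prompt, "n_predict": 64}
    if cache_prompt is not None:
        # Before this PR, callers passed "cache_prompt": true explicitly
        # to reuse the KV cache for the shared prompt prefix; after it,
        # omitting the field means caching is on, and passing false opts out.
        payload["cache_prompt"] = cache_prompt
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["content"]


# With the new default, repeated calls sharing a prefix benefit from the
# cache without any extra flag; pass cache_prompt=False to disable it.
print(complete("You are a helpful assistant.\nQ: What is llama.cpp?\nA:"))
```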

ggerganov merged commit 47f931c into master on Nov 25, 2024
62 checks passed
ggerganov deleted the gg/server-enable-cache-prompt branch on November 25, 2024 at 19:50
arthw pushed a commit to arthw/llama.cpp that referenced this pull request on Dec 20, 2024