
backend: fix a crash on inputs greater than n_ctx #2498

Merged
merged 1 commit into from
Jul 1, 2024

Conversation

cebtenzzre
Member

@cebtenzzre cebtenzzre commented Jul 1, 2024

This fixes a regression introduced by commit 4fc4d94 ("fix chat-style prompt templates (#1970)"), which moved some return statements into a new function (LLModel::decodePrompt) without propagating those returns to the parent caller as well.


🚀 This description was created by Ellipsis for commit 3409b98

Summary:

Fixes crash on inputs greater than n_ctx by updating LLModel::decodePrompt to return a boolean and handle errors appropriately.

Key points:

  • Modified gpt4all-backend/llmodel.h:
    • Changed LLModel::decodePrompt return type from void to bool.
  • Updated gpt4all-backend/llmodel_shared.cpp:
    • Updated LLModel::prompt to handle decodePrompt return value and exit on error.
    • Modified LLModel::decodePrompt to return false on errors and true on success.



Signed-off-by: Jared Van Bortel <jared@nomic.ai>
@cebtenzzre cebtenzzre requested a review from manyoso July 1, 2024 15:32
@manyoso manyoso merged commit bd307ab into main Jul 1, 2024
6 of 18 checks passed