-
Have a look at 'user_suffix'?
-
Sorry. It is possible: I have implemented this feature in chatllm.cpp. I have tried to make a PR to
-
I've got a question about generation using a prefix, i.e. providing the first words of the reply to the LLM. Is that possible?
For example, with the prompt written out in plain text, we would append the assistant prefix "Explanation: " so that the model's first generated tokens continue from it.
I'm using llama-server via curl / the OpenAI Python library.
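Concretely, the workaround I've been sketching is to bypass the chat endpoint, apply the chat template myself, and end the prompt with the desired prefix so generation continues from it. A minimal sketch, assuming llama-server is running at http://localhost:8080 with the OpenAI-compatible /v1/completions route, and assuming a ChatML-style template (substitute whatever template your model actually uses):

```python
# Sketch of assistant-prefix generation against llama-server.
# Assumptions: server at http://localhost:8080 exposing /v1/completions,
# and a ChatML-style chat template; both may differ for your setup.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8080/v1", api_key="unused")

prefix = "Explanation: "
# Format the conversation manually and end with the assistant prefix,
# so the model continues right after it instead of starting fresh.
prompt = (
    "<|im_start|>user\n"
    "Why is the sky blue?<|im_end|>\n"
    "<|im_start|>assistant\n"
    + prefix
)

resp = client.completions.create(
    model="local",       # llama-server generally ignores the model name
    prompt=prompt,
    max_tokens=256,
    stop=["<|im_end|>"],  # stop at the end-of-turn tag of the template
)
print(prefix + resp.choices[0].text)
```

The same idea should work with plain curl against the server's native /completion endpoint by putting the formatted string in the "prompt" field.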