
models2.json: use ChatML for Mistral OpenOrca #1935

Merged · 2 commits into main from mistralorca-prompt · Feb 6, 2024
Conversation

cebtenzzre
Member

This PR fixes the output of Mistral OpenOrca for a user prompt such as:

I have a question (but in Dutch)

Before, the model was likely to ramble endlessly in Dutch and get stuck in a repetitive loop. Using ChatML makes the model behave much more like a personal assistant.
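For context, ChatML wraps every turn in `<|im_start|>`/`<|im_end|>` markers tagged with a role (system, user, assistant). Below is a rough sketch of how such a template could be expressed as a models2.json entry; the field names (`promptTemplate`, `systemPrompt`), the `%1`/`%2` placeholders for the user message and model reply, and the prompt wording are illustrative assumptions, not the exact contents of this diff.

```json
{
  "name": "Mistral OpenOrca",
  "promptTemplate": "<|im_start|>user\n%1<|im_end|>\n<|im_start|>assistant\n%2<|im_end|>\n",
  "systemPrompt": "<|im_start|>system\nYou are MistralOrca, a helpful assistant.<|im_end|>\n"
}
```

Mistral OpenOrca was fine-tuned on ChatML-formatted conversations, so matching that format keeps the model anchored to the assistant role instead of drifting into free-running text continuation.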

Before:

(screenshot: 2024-02-06 at 12:31 PM)

After:

(screenshot: 2024-02-06 at 12:36 PM)

This system prompt encourages chain-of-thought style responses, which likely improves the model's problem-solving skills, but it is perhaps not the best default for a friendly model that the user just wants to have a conversation with.

Signed-off-by: Jared Van Bortel <jared@nomic.ai>
@cebtenzzre
Member Author

I tweaked the system prompt for more friendly output:
(screenshot: 2024-02-06 at 12:41 PM)
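Purely for illustration (the actual wording is only visible in the screenshot above, not reproduced here), a friendlier system prompt in the same ChatML form might look like:

```json
{
  "systemPrompt": "<|im_start|>system\nYou are MistralOrca, a friendly and helpful assistant. Answer conversationally and keep responses concise.<|im_end|>\n"
}
```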

cebtenzzre merged commit 78a26cc into main on Feb 6, 2024
6 of 10 checks passed
cebtenzzre deleted the mistralorca-prompt branch on February 6, 2024 at 17:43
dpsalvatierra pushed a commit to dpsalvatierra/gpt4all that referenced this pull request Feb 16, 2024