GBNF Function Calling Grammar Generator for llama.cpp, to enable function calling with every model that supports grammar-based sampling (most models; I only had problems with Deepseek Code Instruct) #4273
Maximilian-Winter started this conversation in Show and tell
Hello! 👋
I'd like to introduce a tool I've been developing: a GBNF (GGML BNF) grammar generator for function calling, tailored for llama.cpp.
🔍 Features:
At the moment it supports the following types as function parameters (a sketch of the kind of grammar it generates follows this list):
string
boolean
number
float
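To make the idea concrete, here is a minimal, self-contained sketch of the kind of grammar such a generator can produce. The `generate_gbnf` helper below is my own simplified stand-in for illustration, not the project's actual code or API; see the repository linked below for the real interface.

```python
# Simplified, illustrative stand-in for the generator: turn a function name and
# its typed parameters into a GBNF grammar that constrains the model to emit a
# single JSON function call. Not the project's actual code or API.

GBNF_PRIMITIVES = {
    "string":  r'string ::= "\"" [^"]* "\""',   # no escaped quotes, kept short on purpose
    "boolean": 'boolean ::= "true" | "false"',
    "number":  'number ::= "-"? [0-9]+',
    "float":   'float ::= "-"? [0-9]+ "." [0-9]+',
}

def generate_gbnf(function_name: str, parameters: dict) -> str:
    """Build a GBNF grammar forcing output like {"function": ..., "params": {...}}."""
    fields = ' "," ws '.join(
        f'"\\"{name}\\":" ws {type_name}' for name, type_name in parameters.items()
    )
    root = (
        f'root ::= "{{" ws "\\"function\\":" ws "\\"{function_name}\\"" "," ws '
        f'"\\"params\\":" ws "{{" ws {fields} ws "}}" ws "}}"'
    )
    used = sorted({GBNF_PRIMITIVES[t] for t in parameters.values()})
    return "\n".join([root, *used, r'ws ::= [ \t\n]*'])

print(generate_gbnf("get_weather", {"city": "string", "celsius": "boolean"}))
```

Running this prints a grammar whose root rule only accepts a JSON object of the shape `{"function": "get_weather", "params": {"city": ..., "celsius": ...}}`, with one grammar rule per supported primitive type.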
👨‍💻 Benefits for llama.cpp Users:
🔗 https://github.com/Maximilian-Winter/llama_cpp_function_calling
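Once a grammar has been generated, one way to plug it into llama.cpp for constrained sampling is through the llama-cpp-python bindings. This is my own hedged sketch, not taken from the repository: the model path, prompt, and the inlined grammar are placeholders.

```python
# Hedged sketch: constrain sampling with a GBNF grammar via llama-cpp-python.
# Model path and prompt are placeholders; the project may wire this up differently.
from llama_cpp import Llama, LlamaGrammar

grammar_text = "\n".join([
    r'root ::= "{" ws "\"function\":" ws "\"get_weather\"" "," ws '
    r'"\"params\":" ws "{" ws "\"city\":" ws string "," ws '
    r'"\"celsius\":" ws boolean ws "}" ws "}"',
    r'string ::= "\"" [^"]* "\""',
    'boolean ::= "true" | "false"',
    r'ws ::= [ \t\n]*',
])

llm = Llama(model_path="./openhermes-2.5-mistral-7b.Q4_K_M.gguf")   # placeholder path
grammar = LlamaGrammar.from_string(grammar_text)

out = llm(
    "Call the weather function for Berlin, in celsius.\n",
    grammar=grammar,    # sampling is restricted to strings the grammar accepts
    max_tokens=128,
)
print(out["choices"][0]["text"])
```

The same grammar text can also be written to a .gbnf file and passed to the llama.cpp main example with --grammar-file.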
🤝 Looking for Input:
I'd be happy to hear from you, whether it's feedback, potential contributions, or just a discussion of future improvements. Let's collaborate to enhance the llama.cpp coding experience!
Example output using OpenHermes and the example functions in gpt_functions.py:
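Because the grammar forces the completion into a single JSON object, dispatching it back to a Python function takes only a few lines. The completion string and the get_weather stub below are invented purely for illustration; they are not the actual OpenHermes output nor the functions from gpt_functions.py.

```python
import json

# Invented, illustration-only completion shaped the way the sketch grammar above
# would force it; NOT the actual OpenHermes output from this post.
completion = '{"function": "get_weather", "params": {"city": "Berlin", "celsius": true}}'

def get_weather(city: str, celsius: bool) -> str:
    # Stand-in implementation; the real example functions live in gpt_functions.py.
    unit = "°C" if celsius else "°F"
    return f"(pretend weather report for {city} in {unit})"

DISPATCH = {"get_weather": get_weather}

call = json.loads(completion)                  # the grammar guarantees parseable JSON
print(DISPATCH[call["function"]](**call["params"]))
```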