Releases: OoriData/Toolio
0.5.2 - Schema output and tool-calling go brrr for December
Added
- `toolio.common.load_or_connect` convenience function
- `reddit_newsletter` multi-agent demo
Changed
- Make the `{json_schema}` template "cutout" configurable, and change the default (to `#!JSON_SCHEMA!#`)
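As a rough illustration of how a template cutout works, the marker in the prompt template is swapped for the serialized schema at request time. The `fill_schema` helper below is hypothetical, not Toolio's actual API:

```python
import json

DEFAULT_CUTOUT = '#!JSON_SCHEMA!#'  # the new default cutout marker

def fill_schema(prompt_template, schema, cutout=DEFAULT_CUTOUT):
    # Replace the cutout marker in the prompt with the JSON-serialized
    # schema (illustrative sketch of the template-cutout mechanism)
    return prompt_template.replace(cutout, json.dumps(schema))
```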
Fixed
- Clean up how optional dependencies are handled
- Tool-calling prompting enhancements
- Clean up HTTP client & server interpretation of tool-calling & schemata
0.5.1 - Upstream for Halloween
Added
- Demo `demo/re_act.py`
- `common.response_text()` function to simplify usage
Fixed
- Usage pattern of KVCache
Changed
- Decode `json_schema` if given as a string
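The decode-if-string behavior amounts to accepting a schema as either a dict or a JSON string. A minimal sketch (the `ensure_schema` helper name is an assumption, not Toolio's code):

```python
import json

def ensure_schema(json_schema):
    # Accept a schema as either a dict or a JSON string; decode strings
    # so downstream code always sees a dict (hypothetical helper)
    if isinstance(json_schema, str):
        return json.loads(json_schema)
    return json_schema
```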
Removed
- `json_response` arg to `llm_helper.complete()`; just go by whether `json_schema` is None
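With the flag removed, the mode is inferred from the schema argument alone. A toy sketch of that dispatch (the function body is illustrative, not Toolio's implementation):

```python
def complete(prompt, json_schema=None):
    # Constrained JSON output only when a schema is supplied;
    # otherwise plain free-text generation (illustrative only)
    if json_schema is None:
        return f'free-text completion of: {prompt}'
    return f'schema-constrained completion of: {prompt}'
```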
0.5.0 - Triumph of text (docs, better prompting, etc.)
Added
- `llm_helper.debug_model_manager`, a way to extract raw prompt & schema/tool-call info for debugging of underlying LLM behavior
- docs beyond the README (`doc` folder)
- test cases
- `demo/algebra_tutor.py`
- `demo/blind_obedience.py`
Changed
- use of logger rather than trace boolean, throughout
- further code modularization and reorg
- improvements to default prompting
- more elegant handling of install from an unsupported OS
Fixed
- handling of multi-trip scenarios
0.4.2 - Server startup middleware fix
Added
- notes on how to override prompting
Changed
- processing for function-calling system prompts
Fixed
- server startup 😬
0.4.1 - Async tool fixes
Added
- Demo `demo/zipcode.py`
- Support for multiple workers & CORS headers (`--workers` & `--cors_origin` cmdline options)
Fixed
- async tool definitions
0.4.0 - Local LLM loading and tool-calling interface
Added
- `toolio.responder` module, with coherent factoring from `server.py`
- `llm_helper.model_manager` convenience API for direct Python loading & inferencing over models
- `llm_helper.extract_content` helper to simplify the OpenAI-style streaming completion responses
- `test/quick_check.py` for quick assessment of LLMs in Toolio
- Mistral model type support
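For context, OpenAI-style streaming responses carry their text under `choices[0].delta.content`, one fragment per chunk, which such a helper pulls out. The body below is an assumption sketched from that format, not the library's code:

```python
def extract_content(chunk):
    # Pull the text delta out of one OpenAI-style streaming chunk;
    # chunks without a content delta (e.g. role-only or final chunks)
    # yield an empty string. Illustrative sketch only.
    return chunk.get('choices', [{}])[0].get('delta', {}).get('content') or ''
```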
Changed
- Turn off prompt caching until we figure out #12
- Have responders return actual dicts, rather than label + JSON dump
- Factor out HTTP protocol schematics to a new module
- Handle more nuances of tool-calling tokenizer setup
- Harmonize tool definition patterns across invocation styles
Fixed
- More vector shape management
Removed
- Legacy OpenAI-style function-calling support
0.3.1 - Post-PyCon Nigeria 2024 fixes
Added
- `trip_timeout` command line option for `toolio_request`
- Support for mixtral model type
- Model loading timing
0.3.0 - The PyCon Nigeria 2024 edition
Added
- `tool/param.rename`, e.g. for tool params which are Python keywords or reserved words
- API example in README
- Type coercion for tool parameters
- Ability to rename params for tools
- Three test cases, including currency conversion
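The rename feature is needed because a tool parameter like `from` can't be a Python argument name. One way such a mapping can work, sketched with a hypothetical helper (not Toolio's actual code):

```python
def apply_param_renames(args, renames):
    # Map Python-safe argument names back to the tool's declared
    # parameter names, e.g. {'from_': 'from'} when 'from' is a
    # Python keyword (illustrative of the param-rename feature)
    return {renames.get(name, name): value for name, value in args.items()}
```

For a currency-conversion tool, `apply_param_renames({'from_': 'NGN', 'to': 'USD'}, {'from_': 'from'})` would produce the wire-level `{'from': 'NGN', 'to': 'USD'}`.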
Fixed
- Excessive restrictions in OpenAI API
0.2.0 - Quick fixes
Added
- A couple of test cases
Fixed
- Error when tool is not used
0.1.0 - Hello world!
Initial release.