LLM plugin to access models available via the Venice AI API. Venice API access is currently in beta.
Install the LLM command-line utility, and install this plugin in the same environment as `llm`:

```bash
llm install llm-venice
```
Set an environment variable `LLM_VENICE_KEY`, or save a Venice API key to the key store managed by `llm`:

```bash
llm keys set venice
```
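Both options feed the same lookup when the plugin needs a key. A minimal sketch of how such a resolution might work — the helper name and the precedence shown here are illustrative assumptions, not `llm`'s actual internals:

```python
import os


def resolve_venice_key(stored_keys):
    """Return a Venice API key from either source.

    In this sketch the LLM_VENICE_KEY environment variable wins and the
    key store (modeled as a plain dict) is the fallback; llm's real
    precedence may differ.
    """
    env_key = os.environ.get("LLM_VENICE_KEY")
    if env_key:
        return env_key
    return stored_keys.get("venice")
```

In practice `llm` keeps stored keys in a JSON file; `llm keys path` prints its location.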
Run a prompt:

```bash
llm --model venice/nous-theta-8b "Why is the earth round?"
```
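Behind a command like the one above, the plugin talks to Venice's OpenAI-style API. A sketch of the kind of request that prompt corresponds to — the endpoint URL and payload field names are assumptions based on the OpenAI-compatible convention, not taken from the plugin's source, and nothing is sent over the network here:

```python
import json


def build_chat_request(model, prompt, api_key):
    """Assemble an OpenAI-style chat-completions request for the Venice API.

    The URL is an assumed placeholder. The `venice/` prefix seen on the
    command line is the plugin's namespace, so the bare model id is sent.
    """
    return {
        "url": "https://api.venice.ai/api/v1/chat/completions",  # assumption
        "headers": {
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        "body": json.dumps(
            {
                "model": model,
                "messages": [{"role": "user", "content": prompt}],
            }
        ),
    }
```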
Start an interactive chat session:

```bash
llm chat --model venice/llama-3.1-405b
```
Update the list of available models from the Venice API:

```bash
llm venice refresh
```
Read the `llm` docs for more usage options.
To set up this plugin locally, first check out the code, then create a new virtual environment:

```bash
cd llm-venice
python3 -m venv venv
source venv/bin/activate
```
Install the dependencies and test dependencies:

```bash
llm install -e '.[test]'
```
To run the tests:

```bash
pytest
```