Language Server Protocol for accessing Large Language Models
Rust · Updated Jan 2, 2025
An AI toolbox for simplified access to AWS Bedrock and Ollama from Rust
Rusty Ollama is a Rust client library for interacting with the Ollama API, providing both synchronous and streaming interfaces for working with large language models.
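As a rough sketch of what such a client does under the hood (the function name here is an assumption, not part of the Rusty Ollama API), a non-streaming call to Ollama boils down to POSTing a JSON payload to the local `/api/generate` endpoint. A minimal payload builder using only the standard library:

```rust
/// Build the JSON body for Ollama's `POST /api/generate` endpoint.
/// `stream: false` requests a single complete response instead of
/// a stream of partial chunks.
/// NOTE: illustrative only; a real client would serialize with a
/// JSON library such as serde_json rather than hand-escaping.
fn generate_payload(model: &str, prompt: &str, stream: bool) -> String {
    // Naive escaping of backslashes and quotes, in that order.
    let esc = |s: &str| s.replace('\\', "\\\\").replace('"', "\\\"");
    format!(
        "{{\"model\":\"{}\",\"prompt\":\"{}\",\"stream\":{}}}",
        esc(model),
        esc(prompt),
        stream
    )
}

fn main() {
    // In a real client this body would be sent to
    // http://localhost:11434/api/generate with an HTTP client
    // (e.g. reqwest), blocking or async depending on the interface.
    let body = generate_payload("llama3", "Why is the sky blue?", false);
    println!("{}", body);
    // → {"model":"llama3","prompt":"Why is the sky blue?","stream":false}
}
```

The streaming interface differs only in setting `"stream": true` and reading the response line by line, each line being one JSON chunk of the generated text.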
A terminal-based chat application that interfaces with Ollama