(Work in Progress) A cross-platform desktop client for offline LLaMA-CPU
Updated Nov 1, 2023 - C#
C# client for interacting with KoboldCpp through its native and OpenAI-compatible endpoints.
A unified .NET client library for running LLMs (Large Language Models) locally. LocalAI.NET provides a single, consistent API for interacting with popular local LLM providers like KoboldCpp, Ollama, LM Studio, and Text Generation WebUI.
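The "single, consistent API over several local backends" idea described above can be illustrated with a small abstraction. The following is a hypothetical sketch, not LocalAI.NET's actual public API: the interface and class names are illustrative, and the KoboldCpp request/response fields are assumptions based on its documented native generate endpoint, so verify them against the server you run.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Text.Json;
using System.Threading;
using System.Threading.Tasks;

// Hypothetical unified interface: one contract that every local backend implements.
public interface ILocalLlmClient
{
    Task<string> CompleteAsync(string prompt, CancellationToken cancellationToken = default);
}

// Example backend targeting KoboldCpp's native generate endpoint.
// Field names are assumptions; check them against your KoboldCpp instance.
public sealed class KoboldCppClient : ILocalLlmClient
{
    private readonly HttpClient _http = new();
    private readonly Uri _baseUri;

    public KoboldCppClient(string baseUrl) => _baseUri = new Uri(baseUrl);

    public async Task<string> CompleteAsync(string prompt, CancellationToken cancellationToken = default)
    {
        var payload = new { prompt, max_length = 200 };
        using var response = await _http.PostAsJsonAsync(
            new Uri(_baseUri, "/api/v1/generate"), payload, cancellationToken);
        response.EnsureSuccessStatusCode();

        using var doc = JsonDocument.Parse(
            await response.Content.ReadAsStringAsync(cancellationToken));
        return doc.RootElement.GetProperty("results")[0]
                  .GetProperty("text").GetString() ?? string.Empty;
    }
}
```

With this shape, application code depends only on the interface, so an Ollama, LM Studio, or Text Generation WebUI backend could be swapped in without changing call sites.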
C# client for interacting with Oobabooga's text-generation-webui through its OpenAI-compatible API endpoints.
C# client for interacting with LM Studio through its native and OpenAI-compatible endpoints.
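The OpenAI-compatible endpoints mentioned in these descriptions all speak the same wire protocol: a POST to /v1/chat/completions with a JSON body containing a messages array. Below is a minimal sketch of that call using a raw HttpClient, which is roughly what such client libraries wrap; the base URL and model name are assumptions (LM Studio commonly serves on localhost:1234) and depend on which local server you run.

```csharp
using System;
using System.Net.Http;
using System.Net.Http.Json;
using System.Text.Json;
using System.Threading.Tasks;

class ChatCompletionDemo
{
    static async Task Main()
    {
        // Assumed base URL; point this at whichever local server you are running.
        const string baseUrl = "http://localhost:1234";

        using var http = new HttpClient();
        var request = new
        {
            model = "local-model",   // many local servers accept a placeholder model name
            messages = new[]
            {
                new { role = "user", content = "Explain what an OpenAI-compatible endpoint is." }
            },
            temperature = 0.7
        };

        // Same request shape works for LM Studio, KoboldCpp, Ollama, or
        // text-generation-webui when their OpenAI-compatible servers are enabled.
        using var response = await http.PostAsJsonAsync($"{baseUrl}/v1/chat/completions", request);
        response.EnsureSuccessStatusCode();

        using var doc = JsonDocument.Parse(await response.Content.ReadAsStringAsync());
        string reply = doc.RootElement
            .GetProperty("choices")[0]
            .GetProperty("message")
            .GetProperty("content")
            .GetString() ?? string.Empty;

        Console.WriteLine(reply);
    }
}
```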