ollama-client
Here are 20 public repositories matching this topic...
LLMX: the easiest third-party local LLM UI for the web!
Updated Feb 4, 2025 - TypeScript
A cross-platform AI chat application built with React Native and powered by Amazon Bedrock.
Updated Jan 26, 2025 - TypeScript
Chat with your PDF using your local LLM via an Ollama client (incomplete).
Updated Oct 17, 2024 - TypeScript
Dive is an open-source AI agent desktop application that seamlessly connects any tool-calling-capable LLM to a frontend MCP server, as part of the Open Agent Platform initiative.
Updated Feb 7, 2025 - TypeScript
A frontend for Ollama, built with React.js and the Flux architecture.
Updated Jun 15, 2024 - TypeScript
Predictive Prompt is a simple Large Language Model (LLM) chat window with retro styling. It dynamically populates a dropdown with the models available from a local Ollama instance and uses the streaming API to generate and display results in real time. Output is rendered as markdown with syntax-highlighting support.
Updated Oct 14, 2024 - TypeScript
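A client like Predictive Prompt can discover the installed models by querying a local Ollama instance. The sketch below is illustrative rather than code from the project: it assumes the default endpoint at http://localhost:11434 and uses Ollama's /api/tags route to fetch the names that would populate the dropdown.

```typescript
// Minimal sketch: list the models installed in a local Ollama instance.
// Assumes the default Ollama endpoint at http://localhost:11434.

interface OllamaTagsResponse {
  models: { name: string; size: number; modified_at: string }[];
}

async function listLocalModels(baseUrl = "http://localhost:11434"): Promise<string[]> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) throw new Error(`Ollama returned HTTP ${res.status}`);
  const data = (await res.json()) as OllamaTagsResponse;
  return data.models.map((m) => m.name);
}

// Usage: log each model name; a UI would feed these into a <select> dropdown.
listLocalModels().then((names) => names.forEach((name) => console.log(name)));
```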
A Next.js chatbot interface integrated with the Ollama API. Features real-time AI responses, markdown formatting with syntax highlighting, and error handling for seamless user interaction.
Updated Oct 17, 2024 - TypeScript
An efficient, high-speed, and elegantly designed client for LLM hosting platforms such as Ollama.
Updated Feb 1, 2025 - TypeScript
Ollama chat client powered by Tauri & Next.js
Updated Feb 9, 2025 - TypeScript
A modern web interface for [Ollama](https://ollama.ai/), featuring a clean design and essential chat functionalities.
Updated Dec 16, 2024 - TypeScript
Showcases streaming Ollama chat responses to the web.
Updated Oct 6, 2024 - TypeScript
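The streaming pattern the entry above showcases can be sketched as follows, assuming the default local Ollama endpoint, its documented /api/chat route, and a placeholder model name; it is not the repository's own code. With stream: true, Ollama returns newline-delimited JSON chunks that a client appends to the page as they arrive.

```typescript
// Minimal sketch: stream a chat completion from a local Ollama instance.
// Each NDJSON line carries an incremental message.content chunk until done: true.

async function streamChat(prompt: string, model = "llama3.2"): Promise<void> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      messages: [{ role: "user", content: prompt }],
      stream: true,
    }),
  });
  if (!res.body) throw new Error("No response body to stream");

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffered = "";

  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    buffered += decoder.decode(value, { stream: true });

    // Complete lines are full JSON chunks; keep any trailing partial line buffered.
    const lines = buffered.split("\n");
    buffered = lines.pop() ?? "";
    for (const line of lines) {
      if (!line.trim()) continue;
      const chunk = JSON.parse(line) as { message?: { content: string }; done: boolean };
      if (chunk.message?.content) console.log(chunk.message.content); // append to the UI here
      if (chunk.done) return;
    }
  }
}

streamChat("Explain what an Ollama client does in one sentence.");
```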
Ollama chat web UI: an AI chatbot built with React, Vite, Nest.js, Tailwind, shadcn, and more.
Updated Oct 19, 2024 - TypeScript
Automatically translate comments in code to your preferred language using a local LLM.
Updated Jan 5, 2025 - TypeScript
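A comment-translation flow like the one above can be approximated with a single non-streaming request to a local model. The sketch uses Ollama's /api/generate endpoint with stream: false; the model name and prompt wording are assumptions for illustration, not taken from the project.

```typescript
// Minimal sketch: translate a source-code comment with a local Ollama model.
// stream: false makes Ollama return the whole completion as one JSON object.

async function translateComment(
  comment: string,
  targetLang: string,
  model = "llama3.2" // hypothetical default; any locally installed model works
): Promise<string> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model,
      prompt: `Translate this code comment into ${targetLang}. Reply with only the translated comment:\n${comment}`,
      stream: false,
    }),
  });
  const data = (await res.json()) as { response: string };
  return data.response.trim();
}

// Usage: translate a German comment into English.
translateComment("// Berechnet die Summe aller Elemente", "English").then(console.log);
```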