A snappy, keyboard-centric terminal user interface for interacting with large language models. Chat with ChatGPT, Claude, Llama 3, Phi 3, Mistral, Gemma and more.
✨ AI interface for tinkerers (Ollama, Haystack RAG, Python)
A fun project using Ollama, Streamlit & PyShark to chat with PCAP/PCAPNG files locally and privately!
A command line utility that queries websites for answers using a local LLM
A simple CLI tool that streamlines managing AI models from the CivitAI platform: list available models, view their details, download selected variants, remove models from local storage, and get a summary of each model's description via Ollama or OpenAI.
Elia+ 👉 An experimental, snappy, keyboard-centric UI for interacting with AI agents and augmenting humans with AI! Chat about anything with any agent. ⚡
Desktop UI for Ollama made with PyQt
OllamaChat: A user-friendly GUI for interacting with the llama2 and llama2-uncensored models, hosted locally with Python and KivyMD. Requires Ollama for Windows. For more, visit Ollama on GitHub.
Run Ollama models anywhere easily
Automatically install and run a public API service for Ollama with any model from the Ollama library
llamachan is a project that realises the "dead internet" idea for an imageboard
A simple Ollama client built with Python and Streamlit.
A client library that makes it easy to connect microcontrollers running MicroPython to the Ollama server
A DBMS project with Streamlit Frontend for stock management simulation with Backtesting.
CLI tool for chatting with LLMs running via Ollama, written using modern Python tooling. The tool focuses on enhancing an nvim-based coding workflow for free, because the author is a damn cheapskate.
This project is a simple yet powerful chatbot that leverages Gradio for the user interface, NLTK for natural language processing, SentenceTransformers for retrieval-based information extraction, and Ollama for advanced language model integration. It interacts with a locally hosted API to generate responses, making it suitable for various use cases.
Chat with local LLM about your PDF and text documents, privacy ensured [llamaindex and llama3]
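Many of the clients listed above are thin wrappers around Ollama's local HTTP API. As a rough sketch of what such a client does, here is a minimal chat call against Ollama's documented /api/chat endpoint, assuming a server running at the default http://localhost:11434; `build_chat_payload` and `chat` are illustrative names, not part of any listed project:

```python
import json
import urllib.request

# Default address of a locally running Ollama server.
OLLAMA_URL = "http://localhost:11434/api/chat"

def build_chat_payload(model, prompt, stream=False):
    """Build the JSON body expected by Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": stream,
    }

def chat(model, prompt):
    """Send one chat turn to the local Ollama server and return the reply text."""
    body = json.dumps(build_chat_payload(model, prompt)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # With stream=False, Ollama returns a single JSON object whose
        # "message" field holds the assistant's reply.
        return json.loads(resp.read())["message"]["content"]
```

A call like `chat("llama3", "Hello")` returns one completed reply; streaming clients instead set `"stream": True` and parse the newline-delimited JSON chunks as they arrive.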