
Ollama Chat App 🐐

Welcome to Ollama Chat, an interface for the official Ollama CLI that makes chatting easier. It includes features such as:

  • Multiple conversations 💬
  • Detect which models are available to use 📋
  • Auto-check whether Ollama is running ⏰
  • Change the host where Ollama is running 🖥️
  • Persistence 📀
  • Import & Export Chats 🚛
  • Light & Dark Theme 🌗
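To illustrate the model-detection feature above, here is a minimal sketch (not the app's actual code) of querying Ollama's public HTTP API. The `/api/tags` endpoint lists installed models; the host below matches the serve command later in this README, and the helper names are hypothetical.

```typescript
// Shape of Ollama's GET /api/tags response (trimmed to what we use).
type TagsResponse = { models: { name: string }[] };

// Pure helper: extract model names from an /api/tags payload.
function modelNames(payload: TagsResponse): string[] {
  return payload.models.map((m) => m.name);
}

// Hypothetical usage against a local server (fetch is global in Node 18+).
async function listModels(host = "http://127.0.0.1:11435"): Promise<string[]> {
  const res = await fetch(`${host}/api/tags`);
  if (!res.ok) throw new Error(`Ollama not reachable at ${host}`);
  return modelNames((await res.json()) as TagsResponse);
}
```

Splitting the pure `modelNames` helper from the network call keeps the parsing logic easy to test without a running server.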


How to build on your machine

Requirements


  1. Clone the repo: `git clone git@github.com:ollama-interface/Ollama-Gui.git`
  2. Install dependencies: `pnpm i`
  3. Build the app: `pnpm build:app:silicon` (use `:silicon`, `:intel`, or `:universal`, depending on your machine)
  4. Open `/src-tauri/target/release/bundle/dmg/` and install the program from the generated `.dmg` file.

You also need to install Ollama. Once it is installed, you can run your local server with this command: `OLLAMA_ORIGINS=* OLLAMA_HOST=127.0.0.1:11435 ollama serve`.
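A sketch of how a client like this GUI could verify that the server started by that command is reachable. Ollama's HTTP root responds with the plain text "Ollama is running"; the helper names below are hypothetical, not taken from the app's source.

```typescript
// Pure helper (hypothetical): normalize a host:port into a base URL.
function ollamaUrl(host: string): string {
  return host.startsWith("http") ? host : `http://${host}`;
}

// Check whether an Ollama server answers at the given host
// (fetch is global in Node 18+; default matches the serve command above).
async function checkOllama(host = "127.0.0.1:11435"): Promise<boolean> {
  try {
    const res = await fetch(ollamaUrl(host));
    return (await res.text()).includes("Ollama is running");
  } catch {
    return false; // server not reachable
  }
}
```

Wrapping the fetch in try/catch lets the check return `false` instead of throwing when the server is down, which suits a periodic UI status poll.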



For any questions, please contact Twan Luttik (Twitter/X).
