macai (macOS AI) is a simple yet powerful native macOS client built to interact with modern AI services (ChatGPT, Claude, xAI (Grok), Google Gemini, Perplexity, Ollama, and almost any OpenAI-compatible API).
Download the latest binary, notarized by Apple.
You can also support the project on Gumroad.
Check out the main branch and open the project in Xcode 14.3 or later.
- Native macOS application built with SwiftUI for optimal performance and system integration
- Lightning fast search across all chats, messages, and personas
- Multi-LLM support including:
- OpenAI ChatGPT models (gpt-4o, o1-mini, o1-preview, and others)
- Anthropic Claude
- Google Gemini
- xAI Grok
- Perplexity
- Local LLMs via Ollama
- Any OpenAI-compatible API
- AI Personas with customizable:
- System instructions
- Temperature settings
- Intelligent message handling:
- Streamed responses for real-time interaction
- Adjustable chat context size
- Automatic chat naming
- Rich content support:
- Syntax-highlighted code blocks
- Interactive HTML/CSS/JavaScript preview
- Formatted tables with CSV/JSON export
- LaTeX equation rendering
- 100% local data storage
- No telemetry or usage tracking
- Built-in backup/restore functionality with JSON export
- Complete control over API configurations and keys
- System-native light/dark theme support
- Per-chat customizable system instructions
- Clean, native macOS interface
- Minimal resource usage compared to Electron-based alternatives
To run macai with ChatGPT or Claude, you need an API token. An API token is like a password: you must obtain one before you can use any commercial LLM API. Most API services offer free credits when you register a new account, so you can try most of them for free. Here is how to get an API token for each supported service:
- OpenAI: https://help.openai.com/en/articles/4936850-where-do-i-find-my-secret-api-key
- Claude: https://docs.anthropic.com/en/api/getting-started
- Google Gemini: https://ai.google.dev/gemini-api/docs/api-key
- xAI Grok: https://docs.x.ai/docs#models
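Before adding a key to macai, you can sanity-check it from the terminal. As one example, OpenAI's documented model-listing endpoint (`GET /v1/models`) returns the models your key can access; the key value below is a placeholder you must replace with your own:

```shell
# Replace the placeholder with your real OpenAI API key.
export OPENAI_API_KEY="sk-your-key-here"

# A valid key returns a JSON list of models; an invalid key returns a 401 error.
curl -s https://api.openai.com/v1/models \
  -H "Authorization: Bearer $OPENAI_API_KEY"
```

Other providers expose similar authenticated endpoints; consult the linked docs for each service's base URL and header format.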
If you are new to LLMs and don't want to pay for tokens, take a look at Ollama. It supports dozens of open-source LLM models that can run locally on Apple M1/M2/M3/M4 Macs.
Run with Ollama
Ollama is an open-source back-end for running various LLM models locally. Getting macai working with Ollama is straightforward:
- Install Ollama from the official website
- Follow the installation guide
- After installation, select a model (llama3.1 or llama3.2 is recommended) and pull it with the following terminal command:
ollama pull <model>
- In macai settings, open the API Service tab, add a new API service, and select the type "ollama":
- Select the model and the default AI Persona, then save
- Test and enjoy!
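The pull-and-verify steps above can be sketched as the following terminal session. The model name is the recommended llama3.2; the port and endpoint are Ollama's documented defaults (the server listens on 11434, and `/api/tags` lists locally available models):

```shell
# Download the recommended model to your machine.
ollama pull llama3.2

# Confirm the local Ollama server is running and the model is available.
# Ollama serves on port 11434 by default; /api/tags returns pulled models as JSON.
curl -s http://localhost:11434/api/tags
```

If the `curl` call fails, start the server with `ollama serve` (or launch the Ollama app) before configuring macai.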
macOS 13.0 or later (both Intel and Apple silicon are supported)
The project is in active development.
Contributions are welcome. Take a look at the macai project page and the Issues page to see planned features and bug fixes, or create a new issue.
The API Service, AI Persona, and system message can be customized in any chat at any time.