
HyperChat is a Chat client that strives for openness, utilizing APIs from various LLMs to achieve the best Chat experience, as well as implementing productivity tools through the MCP protocol.


中文 | English

Features

HyperChat is an open Chat client that can use various LLM APIs to provide the best chat experience and implements productivity tools through the MCP protocol.

  • Supports LLMs with OpenAI-style APIs: OpenAI, Claude (via OpenRouter), Qwen, Deepseek, GLM, Ollama.
  • Built-in MCP plugin marketplace with user-friendly, one-click installation and configuration of MCP servers; contributions to HyperChatMCP are welcome.
  • Also supports manual installation of third-party MCP servers; just fill in command, args, and env (see the sketch below).
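For reference, a manual entry is just those three fields: command is the executable to launch, args its arguments, and env any extra environment variables the server needs. A minimal sketch of what a typical entry looks like (the filesystem server and directory path are examples only, not something HyperChat bundles):

command: npx
args: ["-y", "@modelcontextprotocol/server-filesystem", "/path/to/allowed/dir"]
env: {}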

MCP:

Build

  • 🪟Windows + 🍏MacOS
  • Supports nvm, see below
  • Supports Resources
  • Partial support for Prompts
  • Supports Tools
  • Supports English and Chinese
  • Built-in MCP client hypertools (fetch + search)
  • Supports Bots: preset prompts and allowed MCP services can be configured per Bot
  • Supports Artifacts: SVG and HTML rendering, JavaScript error capture, and opening the Chrome console
  • Optimized Bot display with search and drag-to-sort
  • Supports KaTeX for mathematical formulas; enhanced code rendering with highlighting and quick copy
  • WebDAV synchronization
  • MCP extension marketplace + third-party MCP support
  • Added knowledge base

TODO:

  • Permission pop-up to allow or deny actions
  • Support scheduled tasks
  • Support Projects + RAG
  • Let the LLM write tools for itself
  • Runtime environments: local shell + Node.js + in-browser JS

LLM

LLM        Ease of Use    Notes
claude     ⭐⭐⭐⭐⭐     Excellent, no explanation needed
openai     ⭐⭐⭐⭐🌙     Also handles multi-step function calls well (even gpt-4o-mini)
qwen       ⭐⭐⭐⭐🌙     Very usable; feels better than OpenAI
doubao     ⭐⭐⭐          Feels okay to use
deepseek   ⭐⭐            Multi-step function calls may have issues

Usage

    1. Configure your API key and make sure your LLM service is compatible with the OpenAI style (see the quick check below).
    2. Make sure uv and Node.js are installed on your system.
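If you are unsure whether a provider is OpenAI-compatible, one quick check is to send a plain chat-completions request to its base URL. The endpoint, key, and model below are placeholders, not HyperChat defaults:

# Replace the base URL, key, and model with your provider's values
curl https://your-llm-endpoint.example.com/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $YOUR_API_KEY" \
  -d '{"model": "your-model-name", "messages": [{"role": "user", "content": "Hello"}]}'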

Install uv from the command line below, or follow the official uv GitHub tutorial.

# MacOS
brew install uv
# Windows
winget install --id=astral-sh.uv -e

Install Node.js from the command line below, or download it from the official Node.js website.

# MacOS
brew install node
# Windows
winget install OpenJS.NodeJS.LTS
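
To confirm both tools landed on your PATH, open a new terminal and check their versions:

# Each command should print a version number
uv --version
node -v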

Development

# Run from the repository root: install dependencies for the Electron shell,
# the web UI, and the root package, then start the dev build
cd electron && npm install && cd ..
cd web && npm install && cd ..
npm install
npm run dev

Notes

  • On macOS, if you get a "damaged" or permission error, run sudo xattr -d com.apple.quarantine /Applications/HyperChat.app
  • For macOS nvm users, enter your PATH manually (see the note below); the Windows version of nvm seems to work without this.
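
Background on the nvm note: GUI apps launched from Finder do not inherit the PATH that nvm adds in your shell profile, so HyperChat may not find node. Copy the value from a working terminal and paste it into HyperChat's PATH setting (the exact field name in the UI may differ):

# Run this in a terminal where `node -v` works, then copy the output
echo $PATH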


Telegram

HyperChat User Community

