[English | 中文]
This is a web UI for https://github.com/dzhng/deep-research, with several improvements and fixes.
Features:
- 🚀 Safe & Secure: Everything (config, API requests, ...) stays in your browser locally
- 🕙 Realtime feedback: Streams AI responses and reflects them in the UI in real time
- 🌳 Search visualization: Shows the research process as a tree structure; supports searching in different languages
- 📄 Export as PDF: Exports the final research report as Markdown or PDF
- 🤖 Supports more models: Uses plain prompts instead of newer, less widely supported features like Structured Outputs. This ensures compatibility with providers that haven't caught up with the latest OpenAI capabilities.
- 🐳 Docker support: Deploy in your environment with a one-line command
Currently available providers:
- AI: OpenAI compatible, DeepSeek, OpenRouter, Ollama
- Web Search: Tavily (1000 free credits / month), Firecrawl
Please give a 🌟 Star if you like this project!
Demo video: showcase-25.02.16-v1.1.0.mp4
Recent updates:
25/02/17
- Added: rate limit settings for web search
- Added: context length setting for AI models
25/02/16
- Refactored the search visualization using VueFlow
- Style & bug fixes
25/02/15
- Added AI providers DeepSeek, OpenRouter and Ollama; Added web search provider Firecrawl
- Supported checking project updates
- Supported regenerating reports
- General fixes
25/02/14
- Supported reasoning models like DeepSeek R1
- Improved compatibility with more models & error handling
25/02/13
- Significantly reduced bundle size
- Supported searching in different languages
- Added Docker support
- Fixed "export as PDF" issues
Live demo: https://deep-research.ataw.top
One-click deploy with EdgeOne Pages is also supported.
Use the pre-built Docker image:
docker run -p 3000:3000 --name deep-research-web -d anotia/deep-research-web:latest
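If port 3000 is already in use, or you want the container to restart automatically, the same command can be adjusted; the host port 8080 below is just an example:
# map host port 8080 to the app's port 3000 and restart the container automatically
docker run -p 8080:3000 --restart unless-stopped --name deep-research-web -d anotia/deep-research-web:latest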
Use a self-built Docker image:
git clone https://github.com/AnotiaWang/deep-research-web-ui
cd deep-research-web-ui
docker build -t deep-research-web .
docker run -p 3000:3000 --name deep-research-web -d deep-research-web
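To verify that the container started correctly (standard Docker commands, nothing project-specific):
docker ps --filter name=deep-research-web   # the container should show a status of "Up"
docker logs -f deep-research-web            # follow the app logs; press Ctrl+C to stop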
Make sure to install dependencies:
pnpm install
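If pnpm is not installed yet, one way to get it is via Corepack, which ships with recent Node.js versions (an assumption about your local setup, not a project requirement):
corepack enable                            # activate the pnpm shim bundled with Node.js
corepack prepare pnpm@latest --activate    # download and activate the latest pnpm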
Start the development server on http://localhost:3000:
pnpm dev
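pnpm forwards extra flags to the dev script, so, assuming this is a Nuxt app (the dev/build/generate/preview scripts suggest it), you can serve on another port:
pnpm dev --port 3001   # serve the dev build on http://localhost:3001 instead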
Build the application for production:
If you want to deploy an SSR application:
pnpm build
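The SSR build produces a server bundle that can be started directly; assuming a Nuxt 3 / Nitro setup (suggested by the scripts above), a minimal sketch:
# run the server bundle created by `pnpm build`; PORT is optional and defaults to 3000
PORT=3000 node .output/server/index.mjs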
If you want to deploy a static (SSG) application:
pnpm generate
Locally preview the production build:
pnpm preview
Check out the deployment documentation for more information.
License: MIT