Handle LLM API requests in one place. Designed for speed and ease of use.
- 🎯 Simple: Minimal configuration required
- deepseek-v3 and deepseek-r1 support
- 🚀 Fast: Built on H3, a high-performance server framework
- 🔌 Flexible: Support for multiple LLM providers
This package acts as a bridge (proxy) between DeepSeek and Cursor, letting you use Composer with DeepSeek models and making them behave like any other API model in Cursor.
- Create an API key on the DeepSeek Platform
- Sign up for a free ngrok account
Start a terminal and run the proxy directly:

```sh
npx aiconn
```

or install it globally:

```sh
npm install -g aiconn
```
Cursor doesn't allow localhost as a base URL, so we need to create a reverse proxy. You can use ngrok, for example.

Start another terminal and run:

```sh
ngrok http 6000
```

You will see your public server address in the terminal.
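Before wiring up Cursor, you can sanity-check the proxy locally. The request below is only a sketch: it assumes aiconn exposes an OpenAI-compatible `/v1/chat/completions` endpoint on the default port, which this README does not confirm, and `DEEPSEEK_API_KEY` is a placeholder for your own key.

```shell
# Hypothetical smoke test against a locally running aiconn instance.
# The endpoint path and payload shape are assumptions (OpenAI-compatible API).
curl http://localhost:6000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $DEEPSEEK_API_KEY" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Hello"}]
  }'
```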
Set up Cursor as follows.

Note: We can't use the real DeepSeek model names here, since Cursor will throw an error. So we "emulate" them with gpt-4 and gpt-3.5-turbo:
| Cursor Model Name | DeepSeek Model |
|---|---|
| gpt-3.5-turbo | deepseek-v3 |
| gpt-4 | deepseek-r1 |
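The alias table above boils down to a small lookup. The sketch below illustrates the kind of model-name rewrite the proxy performs on incoming requests; the function and variable names are hypothetical, not taken from aiconn's source.

```typescript
// Hypothetical mapping from the Cursor-facing model names to the
// actual DeepSeek model names (mirrors the table above).
const MODEL_MAP: Record<string, string> = {
  "gpt-3.5-turbo": "deepseek-v3",
  "gpt-4": "deepseek-r1",
};

// Rewrite the requested model name; unknown names pass through unchanged.
function rewriteModel(requested: string): string {
  return MODEL_MAP[requested] ?? requested;
}
```

So a Cursor request for `gpt-4` is forwarded to DeepSeek as `deepseek-r1`.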
| Option | Description | Default |
|---|---|---|
| `--hostname` | Hostname to bind to | 0.0.0.0 |
| `--port` | Port to run the server on | 6000 |
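For example, to bind to loopback only and use a different port (remember to point ngrok at the same port), you could run something like this; the port value here is just an illustration:

```shell
# Run the proxy on 127.0.0.1:8080 instead of the defaults
npx aiconn --hostname 127.0.0.1 --port 8080

# In another terminal, tunnel the matching port
ngrok http 8080
```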
If you instead add the DeepSeek API directly in Cursor with the "deepseek-reasoning" r1 model, the experience is limited:
- No active .cursorrules / "Rules for AI" allowed
- Chat support only (no Composer)
MIT
- X/Twitter: @kregenrek
- Bluesky: @kevinkern.dev
- Learn Cursor AI: Ultimate Cursor Course
- Learn to build software with AI: AI Builder Hub
- codefetch - Turn code into Markdown for LLMs with one simple terminal command
- aidex - A CLI tool that provides detailed information about AI language models, helping developers choose the right model for their needs
- codetie - Xcode CLI
- unjs - for bringing us the best JavaScript tooling system