Universal Cursor Proxy Gateway — Standalone OpenAI-compatible proxy for Cursor Pro subscription models.
Forked from Nomadcxx/opencode-cursor with a different focus: standalone proxy instead of OpenCode plugin.
A standalone HTTP proxy that provides OpenAI-compatible REST API access to Cursor Pro models. Works with any OpenAI-compatible client (curl, scripts, custom tools) without requiring OpenCode.
How this fork compares to Nomadcxx/opencode-cursor:
| | CliCursorProxyAPI | Nomadcxx/opencode-cursor |
|---|---|---|
| Purpose | Standalone proxy server | OpenCode plugin |
| Client support | Any OpenAI-compatible client | OpenCode only |
| Dependencies | None (just bun) | OpenCode + plugin system |
| Setup complexity | Low | Medium |
```
Client (curl/script/app)
           │
           ▼
┌───────────────────────┐
│ CliCursorProxyAPI     │ ← Standalone proxy on :32124
│ /v1/chat/completions  │
└───────────────────────┘
           │
           ▼
┌───────────────────────┐
│ cursor-agent CLI      │ ← Handles auth, API communication
└───────────────────────┘
           │
           ▼
┌───────────────────────┐
│ Cursor API            │
└───────────────────────┘
```
```bash
git clone https://github.com/ThewindMom/CliCursorProxyAPI.git
cd CliCursorProxyAPI
bun install
bun run build
```

Start the proxy:

```bash
bun run proxy
```

Log in to Cursor (one-time):

```bash
cursor-agent login
```

Smoke-test the endpoints:

```bash
curl http://localhost:32124/health
curl http://localhost:32124/v1/models
curl -X POST http://localhost:32124/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"auto","messages":[{"role":"user","content":"Hello"}],"stream":true}'
```

Health check:

```bash
curl http://localhost:32124/health
# {"status":"ok","version":"2.3.20","auth":"authenticated","mcp":{...}}
```

List models:

```bash
curl http://localhost:32124/v1/models
```

OpenAI-compatible streaming endpoint:
```bash
curl -X POST http://localhost:32124/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "auto",
    "messages": [{"role": "user", "content": "Hello"}],
    "stream": true
  }'
```

| Model | Description | Usage Pool |
|---|---|---|
| `auto` | Auto-select best model | Composer |
| `composer-2` | Composer 2 (standard) | Composer |
| `composer-2-fast` | Composer 2 Fast | Composer |
| `composer-1.5` | Composer 1.5 | Composer |
| `sonnet-4.6` | Claude Sonnet 4.6 | API |
| `opus-4.6` | Claude Opus 4.6 | API |
See /v1/models for full list.
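With `"stream": true` the endpoint emits OpenAI-style SSE chunks terminated by `data: [DONE]`. A minimal TypeScript sketch of collecting the delta text from such a stream; the sample chunks below follow the standard OpenAI chunk schema and are illustrative, not captured from a live proxy:

```typescript
// Extract assistant text from OpenAI-style SSE "data:" lines.
// Sample payloads are illustrative of the OpenAI chunk schema,
// not captured from a real CliCursorProxyAPI response.
function extractDeltas(sse: string): string {
  let text = "";
  for (const line of sse.split("\n")) {
    if (!line.startsWith("data: ")) continue;      // skip blanks/comments
    const payload = line.slice("data: ".length).trim();
    if (payload === "[DONE]") break;               // end-of-stream sentinel
    const chunk = JSON.parse(payload);
    text += chunk.choices?.[0]?.delta?.content ?? "";
  }
  return text;
}

const sample = [
  'data: {"choices":[{"delta":{"content":"Hel"}}]}',
  'data: {"choices":[{"delta":{"content":"lo"}}]}',
  "data: [DONE]",
].join("\n");

console.log(extractDeltas(sample)); // "Hello"
```

Any off-the-shelf OpenAI client library should handle this parsing for you; the sketch just shows what travels over the wire.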
```bash
# Health check
curl http://localhost:32124/health

# List models
curl http://localhost:32124/v1/models

# Chat completions
curl -X POST http://localhost:32124/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model":"auto","messages":[{"role":"user","content":"Hi"}],"stream":false}'
```

Add to `~/.config/opencode/opencode.json`:

```json
{
  "$schema": "https://opencode.ai/config.json",
  "provider": {
    "cursor-acp": {
      "npm": "@ai-sdk/openai-compatible",
      "name": "Cursor ACP",
      "options": {
        "baseURL": "http://127.0.0.1:32124/v1"
      },
      "models": {
        "cursor-acp/auto": { "name": "Auto" },
        "cursor-acp/sonnet-4.6": { "name": "Sonnet 4.6" }
      }
    }
  }
}
```

Then:

```bash
opencode run --model "cursor-acp/cursor-acp/auto" "Your prompt"
```
Claude Code is not supported: it speaks the Anthropic Messages API, not the OpenAI format.
| Variable | Default | Description |
|---|---|---|
| `PORT` | 32124 | Proxy port |
| `HOST` | 127.0.0.1 | Bind address |
| `TOOL_LOOP_MAX_REPEAT` | 2 | Max tool call repeats |
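Each variable falls back to its documented default when unset. The snippet below demonstrates that fallback with ordinary shell parameter expansion; it is an illustrative sketch, not the proxy's actual startup code (which reads these in its TypeScript entry point):

```shell
# Sketch: defaults applied when no environment variables are set.
# Illustrative only; not the proxy's real startup logic.
unset PORT HOST TOOL_LOOP_MAX_REPEAT
PORT="${PORT:-32124}"
HOST="${HOST:-127.0.0.1}"
TOOL_LOOP_MAX_REPEAT="${TOOL_LOOP_MAX_REPEAT:-2}"
echo "binding ${HOST}:${PORT} (tool-loop cap: ${TOOL_LOOP_MAX_REPEAT})"
# → binding 127.0.0.1:32124 (tool-loop cap: 2)
```

To override, export the variables before launch, e.g. `PORT=8080 bun run proxy`.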
```bash
# Check what is using the port
lsof -i :32124

# Or run the proxy on a different port
PORT=32125 bun run proxy
```

Re-authenticate if needed:

```bash
cursor-agent login
curl http://localhost:32124/health  # should show "authenticated"
```

Use bare model names when calling the proxy directly (no provider prefix):

```bash
curl -d '{"model":"sonnet-4.6",...}'  # NOT "cursor-acp/sonnet-4.6"
```

```
CliCursorProxyAPI/
├── src/
│   ├── proxy/
│   │   ├── server.ts             # HTTP server
│   │   ├── handler.ts            # Request routing
│   │   └── standalone-server.ts  # Standalone entry point
│   ├── streaming/
│   │   ├── parser.ts             # NDJSON → SSE conversion
│   │   ├── line-buffer.ts        # Line buffering
│   │   └── delta-tracker.ts      # Delta tracking
│   ├── auth.ts                   # Authentication
│   └── models/
│       └── sync.ts               # Model list sync
├── docs/
│   ├── OPENCODE.md               # OpenCode integration
│   └── FACTORY-DROID.md          # Factory Droid integration
├── tests/                        # Test suite
└── package.json
```
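The `src/streaming/parser.ts` layer converts cursor-agent's NDJSON output into OpenAI-style SSE. A minimal sketch of that direction of conversion; the NDJSON field names (`type`, `text`) are hypothetical stand-ins, since the real schema lives in the parser itself:

```typescript
// Sketch of the NDJSON → SSE conversion performed by src/streaming/parser.ts.
// The input field names ("type", "text") are hypothetical; see the parser
// for the actual cursor-agent event schema.
function ndjsonLineToSse(line: string, model: string): string | null {
  const event = JSON.parse(line);
  if (event.type !== "text") return null; // forward only text deltas here
  const chunk = {
    object: "chat.completion.chunk",
    model,
    choices: [{ index: 0, delta: { content: event.text }, finish_reason: null }],
  };
  // SSE frames are "data: <json>" followed by a blank line.
  return `data: ${JSON.stringify(chunk)}\n\n`;
}

const sse = ndjsonLineToSse('{"type":"text","text":"Hi"}', "auto");
console.log(sse);
```

Line buffering (`line-buffer.ts`) matters because a single read from the child process can contain a partial JSON line; only complete lines are safe to `JSON.parse`.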
- Standalone proxy — Runs independently, no OpenCode required
- Minimal API — Only essential endpoints (health, models, chat)
- Clean build — Fixed TypeScript errors, proper error handling
- No plugin dependency — Works with any OpenAI-compatible client
- Factory Droid support — Service manifest for agent orchestration
This is a standalone proxy fork. The upstream has these features that we removed:
| Feature | Upstream | This Fork |
|---|---|---|
| One-line installer | `curl ... install.sh \| bash` | Not applicable |
| OpenCode plugin entry | `dist/plugin-entry.js` | Not applicable |
| MCP tool bridge (mcptool) | Via mcptool CLI | Not implemented |
| Model sync script | `scripts/sync-models.sh` | Not applicable |
| Auto-install models | `cursor-agent models sync` | Manual config |
| OpenCode /auth integration | Built-in | Manual proxy setup |
If you need the OpenCode plugin experience with automatic MCP bridging, use the upstream instead.
| Feature | CliCursorProxyAPI | Nomadcxx/opencode-cursor |
|---|---|---|
| Architecture | Standalone proxy | OpenCode plugin |
| Platform | Any | OpenCode only |
| Streaming | SSE | SSE |
| Tool calling | Via cursor-agent | Via cursor-agent |
| MCP bridge | Not implemented | Via mcptool |
License: ISC