Fork Notice: This project is forked from BenedictKing/claude-proxy v2.0.44 under MIT License.
Disclaimer: This repository is developed for personal use. Features are added based on personal needs and may not be suitable for all use cases. Use at your own risk.
A multi-provider AI proxy server with Web UI, supporting OpenAI/Claude protocol conversion, load balancing, and unified API access.
docker run -d \
--name cc-bridge \
-p 3000:3000 \
-e PROXY_ACCESS_KEY=your-secret-key \
-v $(pwd)/.config:/app/.config \
ghcr.io/jillvernus/cc-bridge:latest
Then visit http://localhost:3000 and enter your key.
- All-in-One: Backend + Frontend in single container (replaces Nginx)
- Dual API: Claude Messages API (/v1/messages) + Codex Responses API (/v1/responses)
- Protocol Conversion: Auto-convert between Claude/OpenAI formats
- Multi-Provider: OpenAI, Claude, and compatible APIs
- Smart Scheduling: Priority routing, health checks, auto circuit-breaker
- Load Balancing: Round-robin, random, failover strategies
- Hot Reload: Config changes apply without restart
- Request Logs: SQLite storage, usage stats by model/provider, date filters
- Pricing System: Base prices, provider/model multipliers, token-type pricing
- UI Improvements: Redesigned header, better channel orchestration, Claude/Codex icons
User → Backend:3000 →
├─ / → Web UI (requires key)
├─ /api/* → Admin API (requires key)
├─ /v1/messages → Claude Messages API (requires key)
├─ /v1/responses → Codex Responses API (requires key)
└─ /health → Health check (public)
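For example, you can confirm the container is up without a key, since /health is the only public route:

```bash
# /health is public; every other route requires the access key
curl http://localhost:3000/health
```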
Pull from GHCR:
# Latest version
docker pull ghcr.io/jillvernus/cc-bridge:latest
# Specific version
docker pull ghcr.io/jillvernus/cc-bridge:v1.0.1
Run with docker-compose:
git clone https://github.com/JillVernus/cc-bridge
cd cc-bridge
# Edit PROXY_ACCESS_KEY in docker-compose.yml
docker-compose up -d
Supported architectures: linux/amd64, linux/arm64
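If you would rather not clone the repository, a minimal docker-compose.yml mirroring the docker run flags above should suffice (a sketch; adjust the key and paths, and see ENVIRONMENT.md for other variables):

```yaml
# Minimal compose sketch, mirroring the docker run example above
services:
  cc-bridge:
    image: ghcr.io/jillvernus/cc-bridge:latest
    ports:
      - "3000:3000"
    environment:
      - PROXY_ACCESS_KEY=your-secret-key   # required; choose your own secret
    volumes:
      - ./.config:/app/.config             # persists provider configuration
    restart: unless-stopped
```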
git clone https://github.com/JillVernus/cc-bridge
cd cc-bridge
cp backend-go/.env.example backend-go/.env
# Edit backend-go/.env
make run
Visit http://localhost:3000 → Enter key → Visual management
See ENVIRONMENT.md for all options.
Key variables:
| Variable | Description | Default |
|---|---|---|
| PROXY_ACCESS_KEY | Access key for all endpoints | (required) |
| ENABLE_WEB_UI | Enable Web UI | true |
| LOG_LEVEL | Log level (debug/info/warn/error) | info |
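A minimal backend-go/.env covering just the variables above might look like this (values are placeholders):

```bash
# Minimal .env sketch; values are placeholders
PROXY_ACCESS_KEY=your-secret-key
ENABLE_WEB_UI=true
LOG_LEVEL=info
```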
Claude Messages API:
curl -X POST http://localhost:3000/v1/messages \
-H "x-api-key: your-key" \
-H "Content-Type: application/json" \
-d '{
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 100,
"messages": [{"role": "user", "content": "Hello!"}]
}'
Codex Responses API:
curl -X POST http://localhost:3000/v1/responses \
-H "x-api-key: your-key" \
-H "Content-Type: application/json" \
-d '{
"model": "gpt-5",
"max_tokens": 100,
"input": "Hello!"
}'
Streaming: Add "stream": true to the request body.
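For example, the Messages request above becomes a streaming request with one extra field (curl -N just disables output buffering so chunks print as they arrive):

```bash
curl -N -X POST http://localhost:3000/v1/messages \
  -H "x-api-key: your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "claude-3-5-sonnet-20241022",
    "max_tokens": 100,
    "stream": true,
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```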
Use previous_response_id from the response to continue conversations.
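A sketch of a follow-up turn on the Responses endpoint; the previous_response_id value is a placeholder for the id returned by the earlier response:

```bash
curl -X POST http://localhost:3000/v1/responses \
  -H "x-api-key: your-key" \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gpt-5",
    "max_tokens": 100,
    "previous_response_id": "resp_abc123",
    "input": "Continue from there."
  }'
```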
Railway / Render / Fly.io / Zeabur
Railway:
# Connect GitHub repo, set environment variables:
PROXY_ACCESS_KEY=your-key
ENABLE_WEB_UI=true
ENV=production
Fly.io:
fly launch --dockerfile Dockerfile
fly secrets set PROXY_ACCESS_KEY=your-key
fly deploy
Render / Zeabur: Connect GitHub repo → Set environment variables → Auto deploy
| Issue | Solution |
|---|---|
| Auth failed | Check PROXY_ACCESS_KEY is set correctly |
| Container won't start | Run docker-compose logs cc-bridge |
| Frontend 404 | Ensure ENABLE_WEB_UI=true, rebuild if needed |
| Port conflict | Run lsof -i :3000 to see what is using the port |
Reset configuration:
docker-compose down
rm -rf .config/*
docker-compose up -d
| Document | Description |
|---|---|
| ARCHITECTURE.md | Technical design, patterns, data flow |
| ENVIRONMENT.md | Environment variables reference |
| DEVELOPMENT.md | Development workflow, debugging |
| CONTRIBUTING.md | Contribution guidelines |
| CHANGELOG.md | Version history |
| RELEASE.md | Release process |
MIT License - see LICENSE
- BenedictKing/claude-proxy - Upstream project
- Anthropic - Claude API
- OpenAI - GPT API