A lightweight, high-performance Cloudflare Worker that proxies the Google Gemini API using its native OpenAI-compatible layer.
- Minimal Latency: a direct pass-through on Cloudflare's global edge network adds negligible overhead.
- OpenAI Compatible: Use any OpenAI client library with Gemini models.
- Secure: Hide your API key from client applications by storing it as a Cloudflare Secret.
- Flexible: Supports Chat Completions, Embeddings, and Model Listing.
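At its core, the proxy rewrites incoming `/v1/*` requests onto Gemini's OpenAI-compatible endpoint and attaches the API key server-side. A minimal sketch of that logic (illustrative only — `rewriteUrl` and the exact handler shape are assumptions, not the actual source):

```typescript
// Gemini's OpenAI-compatible base endpoint.
const UPSTREAM = "https://generativelanguage.googleapis.com/v1beta/openai";

// Map an incoming worker URL to the upstream Gemini URL:
// /v1/chat/completions -> <UPSTREAM>/chat/completions
export function rewriteUrl(incoming: string): string {
  const url = new URL(incoming);
  const path = url.pathname.replace(/^\/v1/, "");
  return UPSTREAM + path + url.search;
}

export default {
  async fetch(request: Request, env: { GEMINI_API_KEY: string }): Promise<Response> {
    const headers = new Headers(request.headers);
    // Inject the key stored as a Cloudflare Secret; clients never see it.
    headers.set("Authorization", `Bearer ${env.GEMINI_API_KEY}`);
    return fetch(rewriteUrl(request.url), {
      method: request.method,
      headers,
      body: request.body,
    });
  },
};
```

Because the key is injected here, client requests to the worker need no credentials at all.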
Deploy to Cloudflare:
npm install
npx wrangler deploy
Add your API Key:
npx wrangler secret put GEMINI_API_KEY
Point your OpenAI client's base URL at your worker's address (e.g., https://gemini-gateway.yourname.workers.dev/v1):
curl https://gemini-gateway.yourname.workers.dev/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "gemini-2.0-flash-exp",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
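The same call from TypeScript using the built-in fetch (a sketch — the worker URL is a placeholder for your own deployment, and `buildChatRequest`/`chat` are illustrative names):

```typescript
// Hypothetical worker URL; replace with your own deployment.
const BASE_URL = "https://gemini-gateway.yourname.workers.dev/v1";

// Build a chat-completions payload mirroring the curl body above.
export function buildChatRequest(model: string, userMessage: string) {
  return {
    model,
    messages: [{ role: "user" as const, content: userMessage }],
  };
}

// Send a single user message and return the model's reply text.
export async function chat(message: string): Promise<string> {
  const res = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest("gemini-2.0-flash-exp", message)),
  });
  const data = (await res.json()) as any;
  return data.choices[0].message.content;
}
```

Note there is no Authorization header: the worker injects your GEMINI_API_KEY before forwarding upstream.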