# .env.example (forked from 1rgs/claude-code-proxy)
# --- Tiered Model Configuration (Hybrid/Multi-provider) ---
# Each tier (Big, Middle, Small) can have its own provider and model.
# Tiers map to Claude model keywords: Opus (Big), Sonnet (Middle), Haiku (Small).
# If a tier-specific provider is not set, it falls back to PREFERRED_PROVIDER.

# BIG (Opus)
# BIG_MODEL_PROVIDER="google" # or "openai", "anthropic", "ollama", "lm-studio"
# BIG_MODEL="gemini-2.5-pro"
# BIG_MODEL_BASE_URL="http://remote-server:11434" # Optional: custom base URL for BIG models

# MIDDLE (Sonnet) - Default: LM Studio Gemma 4 31b
# MIDDLE_MODEL_PROVIDER="lm-studio"
# MIDDLE_MODEL="gemma-4-31b-it"
# MIDDLE_MODEL_BASE_URL="http://localhost:1234/v1" # Optional: custom base URL for MIDDLE models

# SMALL (Haiku)
# SMALL_MODEL_PROVIDER="ollama"
# SMALL_MODEL="gemma-4-e4b"
# SMALL_MODEL_BASE_URL="http://localhost:11434" # Optional: custom base URL for SMALL models

# --- Provider Preference (Fallback) ---
# Controls which provider is used when a tier does not set one.
PREFERRED_PROVIDER="openai"
OPENAI_BASE_URL="https://api.openai.com/v1"

# --- API Keys ---
ANTHROPIC_API_KEY="your-anthropic-api-key"
OPENAI_API_KEY="sk-..."
GEMINI_API_KEY="your-google-ai-studio-key"

# --- Local Provider Settings ---
OLLAMA_BASE_URL="http://localhost:11434"
LM_STUDIO_BASE_URL="http://localhost:1234/v1"
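
# --- Usage ---
# A minimal setup sketch (assumes the proxy reads a .env file from its
# working directory, as is typical for dotenv-based projects):
#   cp .env.example .env
#   # then edit .env and fill in keys only for the providers you actually use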