# ──────────────────────────────────────────────
# BioClaw Environment Configuration
# Copy this file to .env and fill in your values
# ──────────────────────────────────────────────

# ─── Model Provider (choose one) ───────────────
# Option A: Anthropic (default) — Claude models
ANTHROPIC_API_KEY=sk-ant-api03-your-key-here

# Option B: OpenRouter — access Gemini, DeepSeek, Claude, GPT, etc. via one API
# Get key at https://openrouter.ai/keys
# MODEL_PROVIDER=openrouter
# OPENROUTER_API_KEY=sk-or-v1-your-key
# OPENROUTER_BASE_URL=https://openrouter.ai/api/v1
# OPENROUTER_MODEL=deepseek/deepseek-chat-v3.1
# Popular OpenRouter model IDs:
#   deepseek/deepseek-chat-v3.1     DeepSeek V3.1
#   deepseek/deepseek-v3.2          DeepSeek V3.2
#   google/gemini-2.5-flash         Gemini 2.5 Flash
#   google/gemini-3-flash-preview   Gemini 3 Flash
#   anthropic/claude-3.5-sonnet     Claude 3.5 Sonnet
# Full list: https://openrouter.ai/models

# Option C: Codex CLI login — reuse `codex login` / ChatGPT sign-in from this host
# No API key needed. BioClaw will reuse ~/.codex/auth.json and the installed Codex CLI.
# MODEL_PROVIDER=openai-codex
# OPENAI_CODEX_MODEL=gpt-5.4

# Option D: Gemini CLI — reuse `gemini auth` OAuth OR use a direct API key
# Two auth modes (tried in this order):
# 1. OAuth via ~/.gemini/oauth_creds.json (run `gemini auth` once on this host)
# 2. GEMINI_API_KEY from https://aistudio.google.com/apikey
# Both require the Gemini CLI (npm i -g @google/gemini-cli) on PATH.
# MODEL_PROVIDER=gemini
# GEMINI_MODEL=gemini-2.0-pro
# GEMINI_API_KEY=your-api-key-here # only needed if not using OAuth

# ─── Provider presets (ccswitch-style fast switching) ────────
# Presets live at ~/.config/bioclaw/provider-presets.json and can be
# managed from chat (/preset list|switch|save|delete|default) or the
# terminal (`npm run preset -- <subcommand>`). A default set is seeded
# automatically on first launch.
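# Example preset workflow from the terminal (the preset names below are
# illustrative, not necessarily part of the seeded defaults):
#   npm run preset -- list
#   npm run preset -- switch openrouter-deepseek
#   npm run preset -- default openrouter-deepseek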

# ─── WhatsApp — Optional ─────────────────────
# All channels are opt-in. Set ENABLE_WHATSAPP=true to connect.
# WhatsApp uses QR code authentication via Baileys.
# No credentials needed — run `npm start` and scan the QR code
# with your WhatsApp app. Auth state is persisted in store/auth/.
# ENABLE_WHATSAPP=true
# ALLOW_WHATSAPP_SELF_MESSAGES=true

# ─── WeCom (企业微信) — Optional ─────────────
# To enable WeCom bot, create a Smart Robot (智能机器人) in WeCom admin console,
# select API mode with long connection, then fill in BotID and Secret below.
# WECOM_BOT_ID=your-bot-id
# WECOM_SECRET=your-secret
# Optional: self-built app for sending images/files (requires IP whitelist on server)
# WECOM_CORP_ID=your-corp-id
# WECOM_AGENT_ID=your-agent-id
# WECOM_CORP_SECRET=your-corp-secret

# ─── QQ Official Bot — Optional ───────────
# Current QQ support covers official QQ Bot text receive/reply over WebSocket.
# Supported inbound events: private chat messages and group @bot messages.
# QQ_APP_ID=your-app-id
# QQ_CLIENT_SECRET=your-client-secret
# QQ_SANDBOX=false

# ─── Feishu (Lark) — Optional ───────────────
# Current Feishu support covers text receive/reply.
# WebSocket mode is simplest if your bot app allows long connections.
# FEISHU_APP_ID=cli_xxx
# FEISHU_APP_SECRET=your-app-secret
# FEISHU_CONNECTION_MODE=websocket
# Only needed for webhook mode or encrypted event delivery:
# FEISHU_VERIFICATION_TOKEN=
# FEISHU_ENCRYPT_KEY=
# FEISHU_HOST=0.0.0.0
# FEISHU_PORT=8080
# FEISHU_PATH=/feishu/events

# ─── WeChat (personal account) — Optional ──────────────
# Uses weixin-agent-sdk (based on Tencent OpenClaw WeChat channel).
# QR code login in terminal, long-polling for messages.
# ENABLE_WECHAT=true

# ─── Discord — Optional ──────────────────────
# To enable Discord bot:
# 1. Create an application at https://discord.com/developers/applications
# 2. Add a Bot, copy the token
# 3. Enable "Message Content Intent" under Privileged Gateway Intents
# 4. Invite the bot to your server with Send Messages + Attach Files permissions
# DISCORD_BOT_TOKEN=your-bot-token

# ─── Slack — Optional (Socket Mode) ──────────
# BioClaw uses Slack Socket Mode (no public Request URL required).
# 1. Create an app at https://api.slack.com/apps
# 2. Enable Socket Mode → create an App-Level Token with scope connections:write (xapp-...)
# 3. Install the app to your workspace; copy Bot User OAuth Token (xoxb-...)
# 4. Event Subscriptions → subscribe to bot events: message.channels, message.groups,
#    message.im, message.mpim (or legacy "message" where available)
# 5. OAuth & Permissions → Bot Token Scopes: channels:history, groups:history, im:history,
#    mpim:history, chat:write, files:write (for images), users:read, channels:read
# SLACK_BOT_TOKEN=xoxb-your-bot-token
# SLACK_APP_TOKEN=xapp-your-app-token

# ─── Container Runtime ──────────────────────────
# Default: docker. Set to "apptainer" for HPC clusters without Docker.
# See docs/APPTAINER.md for setup instructions.
# CONTAINER_RUNTIME=docker
# CONTAINER_IMAGE=bioclaw-agent:latest # Docker tag or path to .sif file
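# Example for an HPC cluster using Apptainer (the .sif path is illustrative):
#   CONTAINER_RUNTIME=apptainer
#   CONTAINER_IMAGE=/scratch/$USER/bioclaw-agent.sif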

# ─── Host-side SSH from chat — Optional ────────
# BioClaw can run SSH commands from the chat control plane on the host machine.
# By default it reads simple host aliases from ~/.ssh/config.
# To restrict BioClaw to a smaller allowlist, set:
# BIOCLAW_SSH_ALLOWED_HOSTS=lambda-cloud-54-140,hpc-login
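# Example ~/.ssh/config entry that BioClaw would pick up as the alias
# "hpc-login" (host details are illustrative):
#   Host hpc-login
#       HostName login.cluster.example.edu
#       User your-username
#       IdentityFile ~/.ssh/id_ed25519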

# ─── Trace API auth (optional) ────────────────
# If set, trace/workspace API routes require Bearer token or ?token= query param.
# DASHBOARD_TOKEN=
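# Example authenticated requests (port and route path are illustrative):
#   curl -H "Authorization: Bearer $DASHBOARD_TOKEN" http://localhost:3000/api/trace
#   curl "http://localhost:3000/api/trace?token=$DASHBOARD_TOKEN"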