# DEV-LOG

## Datadog log endpoint made configurable (2026-04-03)

Replaced the hard-coded Anthropic-internal Datadog endpoint with configuration via environment variables; disabled by default.

**Modified files:**

| File | Change |
| ------ | ------ |
| `src/services/analytics/datadog.ts` | `DATADOG_LOGS_ENDPOINT` and `DATADOG_CLIENT_TOKEN` now read `process.env.DATADOG_LOGS_ENDPOINT` / `process.env.DATADOG_API_KEY` instead of hard-coded constants, defaulting to empty strings; `initializeDatadog()` gains a guard that returns `false` when the endpoint or token is unset |
| `docs/telemetry-remote-config-audit.md` | Updated section 1 to reflect the new environment-variable configuration |

**Effect:** By default, no data is sent anywhere; setting the two environment variables connects your own Datadog instance. The existing safeguards (`DISABLE_TELEMETRY`, privacy level, sink killswitch) remain in place.

**Usage:** `DATADOG_LOGS_ENDPOINT=https://http-intake.logs.datadoghq.com/api/v2/logs DATADOG_API_KEY=xxx bun run dev`
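The guard described above can be sketched as a small standalone predicate. `isDatadogConfigured` is a hypothetical helper name for illustration; the actual module reads the same two variables into module-level constants and checks them inside `initializeDatadog()`:

```typescript
// Minimal sketch of the env-var guard described above (illustrative only).
// Takes the environment as a parameter so the behavior is easy to test.
function isDatadogConfigured(
  env: Record<string, string | undefined>,
): boolean {
  const endpoint = env.DATADOG_LOGS_ENDPOINT ?? ''
  const token = env.DATADOG_API_KEY ?? ''
  // Disabled unless BOTH the endpoint and the token are provided
  return endpoint !== '' && token !== ''
}
```

With no variables set this returns `false`, matching the default-disabled behavior.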
17+
18+ ---
19+
## Sentry error reporting integration (2026-04-03)

Restores the Sentry integration that was removed during decompilation. It is controlled by the `SENTRY_DSN` environment variable; when unset, every function is a no-op and normal operation is unaffected.
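The no-op gating can be illustrated with a hedged sketch. `makeCaptureException` is an assumed shape, not the actual export; the real Sentry wiring lives elsewhere in the codebase:

```typescript
// Hypothetical sketch of SENTRY_DSN gating: without a DSN, the returned
// reporter is a no-op, so call sites stay safe but nothing is sent.
function makeCaptureException(
  dsn: string | undefined,
): (err: unknown) => void {
  if (!dsn) {
    return () => {} // no-op: SENTRY_DSN unset
  }
  return err => {
    // A real implementation would forward err to Sentry here.
    console.error('would report to Sentry:', err)
  }
}
```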
---

**File**: `src/services/analytics/datadog.ts`

- **Endpoint**: configured via the `DATADOG_LOGS_ENDPOINT` environment variable (empty by default, i.e. disabled)
- **Client token**: configured via the `DATADOG_API_KEY` environment variable (empty by default, i.e. disabled)
- **Behavior**: sends logs in batches (15 s flush interval, 100-entry cap), 1P (direct Anthropic API) users only
- **Event allowlist**: the `tengu_*` family of events (startup, errors, OAuth, tool calls, etc., ~35 kinds)
- **Baseline data**: collects model, platform, arch, version, userBucket (users hashed into 30 buckets), and similar fields
- **Restricted to**: `NODE_ENV === 'production'`
- **Configuration example**: `DATADOG_LOGS_ENDPOINT=https://http-intake.logs.datadoghq.com/api/v2/logs DATADOG_API_KEY=xxx bun run dev`

## 2. 1P event logging (BigQuery)

---

Diff of `src/services/analytics/datadog.ts`:

```diff
@@ -9,9 +9,17 @@ import { MODEL_COSTS } from '../../utils/modelCost.js'
 import { isAnalyticsDisabled } from './config.js'
 import { getEventMetadata } from './metadata.js'
 
+/**
+ * Datadog endpoint and token are configurable via environment variables.
+ * If neither is set, Datadog logging is disabled entirely (no data sent).
+ *
+ * DATADOG_LOGS_ENDPOINT=https://http-intake.logs.datadoghq.com/api/v2/logs
+ * DATADOG_API_KEY=<your-key>
+ */
 const DATADOG_LOGS_ENDPOINT =
-  'https://http-intake.logs.us5.datadoghq.com/api/v2/logs'
-const DATADOG_CLIENT_TOKEN = 'pubbbf48e6d78dae54bceaa4acf463299bf'
+  process.env.DATADOG_LOGS_ENDPOINT ?? ''
+const DATADOG_CLIENT_TOKEN =
+  process.env.DATADOG_API_KEY ?? ''
 const DEFAULT_FLUSH_INTERVAL_MS = 15000
 const MAX_BATCH_SIZE = 100
 const NETWORK_TIMEOUT_MS = 5000
@@ -133,6 +141,12 @@ export const initializeDatadog = memoize(async (): Promise<boolean> => {
     return false
   }
 
+  // No custom endpoint configured - skip Datadog entirely
+  if (!DATADOG_LOGS_ENDPOINT || !DATADOG_CLIENT_TOKEN) {
+    datadogInitialized = false
+    return false
+  }
+
   try {
     datadogInitialized = true
     return true
```