
Commit 80af4fa

feat: complete OpenAI protocol compatibility

1 parent 130b417 commit 80af4fa

11 files changed · 433 additions & 134 deletions
.github/workflows/ci.yml

Lines changed: 1 addition & 1 deletion

```diff
@@ -8,7 +8,7 @@ on:
 
 jobs:
   ci:
-    runs-on: ubuntu-latest
+    runs-on: macos-latest
 
     steps:
       - uses: actions/checkout@v4
```

DEV-LOG.md

Lines changed: 49 additions & 0 deletions

New section inserted under the existing `# DEV-LOG` heading:

## OpenAI API Compatibility (2026-04-03)

**Branch**: `feature/openai`

The `/login` flow gains a new "OpenAI Compatible" option, supporting Ollama, DeepSeek, vLLM, One API, and other third-party services compatible with the OpenAI Chat Completions API. Once the user configures it via `/login`, all API requests automatically take the OpenAI path.

**Files changed (10, +384 / -134):**

| File | Change |
|------|--------|
| `.github/workflows/ci.yml` | CI runner switched from `ubuntu-latest` to `macos-latest` |
| `README.md` | New "OpenAI API compatibility" entry in the TODO list |
| `src/components/ConsoleOAuthFlow.tsx` | New `openai_chat_api` OAuth state (Base URL / API Key / 3 model-mapping fields); new "OpenAI Compatible" option in the idle selection list; full form UI (Tab to switch, Enter to save); on save, writes `modelType: 'openai'` plus env vars to settings.json; OAuth login resets `modelType` to `anthropic` |
| `src/services/api/openai/index.ts` | Replaced the direct `yield* adaptOpenAIStreamToAnthropic()` with a full stream-processing loop: accumulate content blocks (text/tool_use/thinking), yield an `AssistantMessage` on `content_block_stop`, and also yield `StreamEvent`s for real-time display; error handling moved to the new `createAssistantAPIErrorMessage({ content, apiError, error })` signature |
| `src/services/api/openai/convertMessages.ts` | Input type changed from the Anthropic SDK's `BetaMessageParam[]` to the internal `(UserMessage \| AssistantMessage)[]`; role determined via `msg.type` instead of `msg.role`; content read from `msg.message.content`; internal block types such as `cache_edits` / `server_tool_use` are skipped |
| `src/services/api/openai/modelMapping.ts` | Removed the `OPENAI_MODEL_MAP` JSON env var and its caching; added `getModelFamily()` to classify by haiku/sonnet/opus; resolution priority is now `OPENAI_MODEL` → `ANTHROPIC_DEFAULT_{FAMILY}_MODEL` → `DEFAULT_MODEL_MAP` → pass the original name through |
| `src/services/api/openai/__tests__/convertMessages.test.ts` | Test inputs changed from bare `{ role, content }` objects to the internal format wrapped by `makeUserMsg()` / `makeAssistantMsg()` |
| `src/services/api/openai/__tests__/modelMapping.test.ts` | Tests moved from `OPENAI_MODEL_MAP` to `ANTHROPIC_DEFAULT_{HAIKU,SONNET,OPUS}_MODEL`; 3 new env var override tests |
| `src/utils/model/providers.ts` | `getAPIProvider()` gains a new highest-priority check: the `modelType` field in settings.json; the `CLAUDE_CODE_USE_OPENAI` env var drops to second priority |
| `src/utils/settings/types.ts` | `SettingsSchema` gains a `modelType` field: `z.enum(['anthropic', 'openai']).optional()` |
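The `convertMessages.ts` change in the table can be sketched minimally as follows. The internal message types here are simplified stand-ins for illustration; the real code also handles content-block arrays and skips `cache_edits` / `server_tool_use` blocks.

```typescript
type UserMessage = { type: 'user'; message: { content: string } };
type AssistantMessage = { type: 'assistant'; message: { content: string } };
type OpenAIMessage = { role: 'user' | 'assistant'; content: string };

// Dispatch on the internal msg.type field (not an SDK role field) and
// read the payload from msg.message.content, per the change above.
function convertMessages(msgs: (UserMessage | AssistantMessage)[]): OpenAIMessage[] {
  return msgs.map((msg) => ({
    role: msg.type === 'user' ? 'user' : 'assistant',
    content: msg.message.content,
  }));
}
```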
**Key design decisions:**

1. **`modelType` stored in settings.json**: rather than in a pure environment variable, so the `/login` configuration persists and still takes effect after a restart
2. **Reuse the `ANTHROPIC_DEFAULT_*_MODEL` env vars**: instead of adding an `OPENAI_MODEL_MAP`, the same model-mapping configuration is shared with Custom Platform, reducing the user's cognitive load
3. **Dual yield in stream processing**: yield both `AssistantMessage` (for the consumer to handle tool calls) and `StreamEvent` (for real-time REPL rendering), aligned with the behavior of the Anthropic path
4. **OAuth login resets modelType**: when the user switches back to official Anthropic login, it is automatically reset to `anthropic`, so stale configuration cannot route requests down the wrong path
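Decision 3 can be illustrated with a minimal sketch of the dual-yield loop. The chunk and event shapes below are simplified placeholders, not the project's real types; the actual loop in `index.ts` also accumulates tool_use and thinking blocks.

```typescript
type StreamEvent = { type: 'stream_event'; delta: string };
type AssistantMessage = { type: 'assistant'; content: string };

// Minimal sketch: for each incoming text delta, yield a StreamEvent for
// live rendering, and yield one accumulated AssistantMessage at the end
// for the consumer that handles tool calls.
async function* adaptStream(
  chunks: AsyncIterable<string>,
): AsyncGenerator<StreamEvent | AssistantMessage> {
  let text = '';
  for await (const delta of chunks) {
    text += delta;
    yield { type: 'stream_event', delta }; // real-time display path
  }
  yield { type: 'assistant', content: text }; // final message path
}
```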
**Configuration:**

```
/login → select "OpenAI Compatible" → fill in Base URL / API Key / model names
```

Or manually edit `~/.claude/settings.json`:

```json
{
  "modelType": "openai",
  "env": {
    "OPENAI_BASE_URL": "http://localhost:11434/v1",
    "OPENAI_API_KEY": "ollama",
    "ANTHROPIC_DEFAULT_SONNET_MODEL": "qwen3:32b"
  }
}
```

---
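The model-resolution chain described above for `modelMapping.ts` can be sketched as follows. Only the priority order comes from the log; the `DEFAULT_MODEL_MAP` contents and the exact function signatures are invented placeholders for illustration.

```typescript
type ModelFamily = 'haiku' | 'sonnet' | 'opus';

// Classify a Claude model name into a family by substring match
// (the behavior the log describes for getModelFamily()).
function getModelFamily(model: string): ModelFamily | null {
  if (model.includes('haiku')) return 'haiku';
  if (model.includes('sonnet')) return 'sonnet';
  if (model.includes('opus')) return 'opus';
  return null;
}

// Hypothetical built-in fallback map; the real contents are not in the diff.
const DEFAULT_MODEL_MAP: Record<ModelFamily, string> = {
  haiku: 'gpt-4o-mini',
  sonnet: 'gpt-4o',
  opus: 'gpt-4o',
};

// Priority: OPENAI_MODEL → ANTHROPIC_DEFAULT_{FAMILY}_MODEL
//         → DEFAULT_MODEL_MAP → pass the original name through.
function resolveModel(anthropicModel: string, env: Record<string, string | undefined>): string {
  if (env.OPENAI_MODEL) return env.OPENAI_MODEL;
  const family = getModelFamily(anthropicModel);
  if (family) {
    const override = env[`ANTHROPIC_DEFAULT_${family.toUpperCase()}_MODEL`];
    if (override) return override;
    return DEFAULT_MODEL_MAP[family];
  }
  return anthropicModel; // unknown family: pass through unchanged
}
```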
(existing section below, unchanged context)

## Enable Remote Control / BRIDGE_MODE (2026-04-03)

**PR**: [claude-code-best/claude-code#60](https://github.com/claude-code-best/claude-code/pull/60)

README.md

Lines changed: 1 addition & 0 deletions

```diff
@@ -36,6 +36,7 @@
 - [x] Added custom GrowthBook support (GB is open source too; you can now configure a custom remote-config platform) [docs](https://ccb.agent-aura.top/docs/internals/growthbook-adapter)
 - [x] Custom login mode, which you can use to configure Claude's models!
 - [x] Fixed the missing rg binary in the search tool (requires re-running bun i)
+- [ ] OpenAI API compatibility! Just /login and configure an OpenAI platform!
 - [ ] V6: large-scale refactor of the legacy spaghetti code, full modular repackaging
 - [ ] V6 will be a brand-new branch; main will then be archived as the historical version
```
src/components/ConsoleOAuthFlow.tsx

Lines changed: 127 additions & 1 deletion

```diff
@@ -38,6 +38,15 @@ type OAuthStatus = {
     opusModel: string;
     activeField: 'base_url' | 'api_key' | 'haiku_model' | 'sonnet_model' | 'opus_model';
   } // Custom platform: configure API endpoint and model names
+  | {
+      state: 'openai_chat_api';
+      baseUrl: string;
+      apiKey: string;
+      haikuModel: string;
+      sonnetModel: string;
+      opusModel: string;
+      activeField: 'base_url' | 'api_key' | 'haiku_model' | 'sonnet_model' | 'opus_model';
+    } // OpenAI Chat Completions API platform
   | {
       state: 'ready_to_start';
     } // Flow started, waiting for browser to open
@@ -246,6 +255,8 @@
       if (!orgResult.valid) {
         throw new Error((orgResult as { valid: false; message: string }).message);
       }
+      // Reset modelType to anthropic when using OAuth login
+      updateSettingsForSource('userSettings', { modelType: 'anthropic' } as any);
       setOAuthStatus({
         state: 'success'
       });
@@ -416,6 +427,9 @@ function OAuthStatusMessage(t0) {
   t6 = [{
     label: <Text>Custom Platform ·{" "}<Text dimColor={true}>Configure your own API endpoint</Text>{"\n"}</Text>,
     value: "custom_platform"
+  }, {
+    label: <Text>OpenAI Compatible ·{" "}<Text dimColor={true}>Ollama, DeepSeek, vLLM, One API, etc.</Text>{"\n"}</Text>,
+    value: "openai_chat_api"
   }, t4, t5, {
     label: <Text>3rd-party platform ·{" "}<Text dimColor={true}>Amazon Bedrock, Microsoft Foundry, or Vertex AI</Text>{"\n"}</Text>,
     value: "platform"
@@ -438,6 +452,17 @@
       opusModel: process.env.ANTHROPIC_DEFAULT_OPUS_MODEL ?? "",
       activeField: "base_url"
     });
+  } else if (value_0 === "openai_chat_api") {
+    logEvent("tengu_openai_chat_api_selected", {});
+    setOAuthStatus({
+      state: "openai_chat_api",
+      baseUrl: process.env.OPENAI_BASE_URL ?? "",
+      apiKey: process.env.OPENAI_API_KEY ?? "",
+      haikuModel: process.env.ANTHROPIC_DEFAULT_HAIKU_MODEL ?? "",
+      sonnetModel: process.env.ANTHROPIC_DEFAULT_SONNET_MODEL ?? "",
+      opusModel: process.env.ANTHROPIC_DEFAULT_OPUS_MODEL ?? "",
+      activeField: "base_url"
+    });
   } else if (value_0 === "platform") {
     logEvent("tengu_oauth_platform_selected", {});
     setOAuthStatus({
@@ -568,7 +593,7 @@
     if (finalVals.haiku_model) env.ANTHROPIC_DEFAULT_HAIKU_MODEL = finalVals.haiku_model;
     if (finalVals.sonnet_model) env.ANTHROPIC_DEFAULT_SONNET_MODEL = finalVals.sonnet_model;
     if (finalVals.opus_model) env.ANTHROPIC_DEFAULT_OPUS_MODEL = finalVals.opus_model;
-    const { error } = updateSettingsForSource('userSettings', { env } as any);
+    const { error } = updateSettingsForSource('userSettings', { modelType: 'anthropic' as any, env } as any);
     if (error) {
       setOAuthStatus({ state: 'error', message: `Failed to save: ${error.message}`, toRetry: { state: 'custom_platform', baseUrl: '', apiKey: '', haikuModel: '', sonnetModel: '', opusModel: '', activeField: 'base_url' } });
     } else {
@@ -639,6 +664,107 @@
       <Text dimColor>Tab to switch · Enter on last field to save · Esc to go back</Text>
     </Box>;
   }
+case "openai_chat_api":
+  {
+    type OpenAIField = 'base_url' | 'api_key' | 'haiku_model' | 'sonnet_model' | 'opus_model';
+    const OPENAI_FIELDS: OpenAIField[] = ['base_url', 'api_key', 'haiku_model', 'sonnet_model', 'opus_model'];
+    const op = oauthStatus as { state: 'openai_chat_api'; activeField: OpenAIField; baseUrl: string; apiKey: string; haikuModel: string; sonnetModel: string; opusModel: string };
+    const { activeField, baseUrl, apiKey, haikuModel, sonnetModel, opusModel } = op;
+    const openaiDisplayValues: Record<OpenAIField, string> = { base_url: baseUrl, api_key: apiKey, haiku_model: haikuModel, sonnet_model: sonnetModel, opus_model: opusModel };
+
+    const [openaiInputValue, setOpenaiInputValue] = useState(() => openaiDisplayValues[activeField]);
+    const [openaiInputCursorOffset, setOpenaiInputCursorOffset] = useState(() => openaiDisplayValues[activeField].length);
+
+    const buildOpenAIState = useCallback((field: OpenAIField, value: string, newActive?: OpenAIField) => {
+      const s = { state: 'openai_chat_api' as const, activeField: newActive ?? activeField, baseUrl, apiKey, haikuModel, sonnetModel, opusModel };
+      switch (field) {
+        case 'base_url': return { ...s, baseUrl: value };
+        case 'api_key': return { ...s, apiKey: value };
+        case 'haiku_model': return { ...s, haikuModel: value };
+        case 'sonnet_model': return { ...s, sonnetModel: value };
+        case 'opus_model': return { ...s, opusModel: value };
+      }
+    }, [activeField, baseUrl, apiKey, haikuModel, sonnetModel, opusModel]);
+
+    const doOpenAISave = useCallback(() => {
+      const finalVals = { ...openaiDisplayValues, [activeField]: openaiInputValue };
+      const env: Record<string, string> = {};
+      if (finalVals.base_url) env.OPENAI_BASE_URL = finalVals.base_url;
+      if (finalVals.api_key) env.OPENAI_API_KEY = finalVals.api_key;
+      if (finalVals.haiku_model) env.ANTHROPIC_DEFAULT_HAIKU_MODEL = finalVals.haiku_model;
+      if (finalVals.sonnet_model) env.ANTHROPIC_DEFAULT_SONNET_MODEL = finalVals.sonnet_model;
+      if (finalVals.opus_model) env.ANTHROPIC_DEFAULT_OPUS_MODEL = finalVals.opus_model;
+      const { error } = updateSettingsForSource('userSettings', { modelType: 'openai' as any, env } as any);
+      if (error) {
+        setOAuthStatus({ state: 'error', message: `Failed to save: ${error.message}`, toRetry: { state: 'openai_chat_api', baseUrl: '', apiKey: '', haikuModel: '', sonnetModel: '', opusModel: '', activeField: 'base_url' } });
+      } else {
+        for (const [k, v] of Object.entries(env)) process.env[k] = v;
+        setOAuthStatus({ state: 'success' });
+        void onDone();
+      }
+    }, [activeField, openaiInputValue, openaiDisplayValues, setOAuthStatus, onDone]);
+
+    const handleOpenAIEnter = useCallback(() => {
+      const idx = OPENAI_FIELDS.indexOf(activeField);
+      setOAuthStatus(buildOpenAIState(activeField, openaiInputValue));
+      if (idx === OPENAI_FIELDS.length - 1) {
+        doOpenAISave();
+      } else {
+        const next = OPENAI_FIELDS[idx + 1]!;
+        setOpenaiInputValue(openaiDisplayValues[next] ?? '');
+        setOpenaiInputCursorOffset((openaiDisplayValues[next] ?? '').length);
+      }
+    }, [activeField, openaiInputValue, buildOpenAIState, doOpenAISave, openaiDisplayValues, setOAuthStatus]);
+
+    useKeybinding('tabs:next', () => {
+      const idx = OPENAI_FIELDS.indexOf(activeField);
+      if (idx < OPENAI_FIELDS.length - 1) {
+        setOAuthStatus(buildOpenAIState(activeField, openaiInputValue, OPENAI_FIELDS[idx + 1]));
+        setOpenaiInputValue(openaiDisplayValues[OPENAI_FIELDS[idx + 1]!] ?? '');
+        setOpenaiInputCursorOffset((openaiDisplayValues[OPENAI_FIELDS[idx + 1]!] ?? '').length);
+      }
+    }, { context: 'Tabs' });
+    useKeybinding('tabs:previous', () => {
+      const idx = OPENAI_FIELDS.indexOf(activeField);
+      if (idx > 0) {
+        setOAuthStatus(buildOpenAIState(activeField, openaiInputValue, OPENAI_FIELDS[idx - 1]));
+        setOpenaiInputValue(openaiDisplayValues[OPENAI_FIELDS[idx - 1]!] ?? '');
+        setOpenaiInputCursorOffset((openaiDisplayValues[OPENAI_FIELDS[idx - 1]!] ?? '').length);
+      }
+    }, { context: 'Tabs' });
+    useKeybinding('confirm:no', () => {
+      setOAuthStatus({ state: 'idle' });
+    }, { context: 'Confirmation' });
+
+    const openaiColumns = useTerminalSize().columns - 20;
+
+    const renderOpenAIRow = (field: OpenAIField, label: string, opts?: { mask?: boolean }) => {
+      const active = activeField === field;
+      const val = openaiDisplayValues[field];
+      return <Box>
+        <Text backgroundColor={active ? 'suggestion' : undefined} color={active ? 'inverseText' : undefined}>{` ${label} `}</Text>
+        <Text> </Text>
+        {active
+          ? <TextInput value={openaiInputValue} onChange={setOpenaiInputValue} onSubmit={handleOpenAIEnter} cursorOffset={openaiInputCursorOffset} onChangeCursorOffset={setOpenaiInputCursorOffset} columns={openaiColumns} mask={opts?.mask ? "*" : undefined} focus={true} />
+          : (val
+              ? <Text color="success">{opts?.mask ? val.slice(0, 8) + '·'.repeat(Math.max(0, val.length - 8)) : val}</Text>
+              : null)}
+      </Box>;
+    };
+
+    return <Box flexDirection="column" gap={1}>
+      <Text bold={true}>OpenAI Compatible API Setup</Text>
+      <Text dimColor>Configure an OpenAI Chat Completions compatible endpoint (e.g. Ollama, DeepSeek, vLLM).</Text>
+      <Box flexDirection="column" gap={1}>
+        {renderOpenAIRow('base_url', 'Base URL ')}
+        {renderOpenAIRow('api_key', 'API Key ', { mask: true })}
+        {renderOpenAIRow('haiku_model', 'Haiku ')}
+        {renderOpenAIRow('sonnet_model', 'Sonnet ')}
+        {renderOpenAIRow('opus_model', 'Opus ')}
+      </Box>
+      <Text dimColor>Tab to switch · Enter on last field to save · Esc to go back</Text>
+    </Box>;
+  }
 case "waiting_for_login":
   {
     let t1;
```
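The provider-selection change in `src/utils/model/providers.ts` boils down to a priority check, which can be sketched as below. The signature is simplified for illustration; the real function reads settings and `process.env` itself rather than taking them as parameters.

```typescript
type APIProvider = 'anthropic' | 'openai';

// Priority 1: the persisted settings.json modelType field (new in this commit).
// Priority 2: the CLAUDE_CODE_USE_OPENAI environment variable (demoted).
// Fallback: anthropic.
function getAPIProvider(
  settings: { modelType?: APIProvider },
  env: Record<string, string | undefined>,
): APIProvider {
  if (settings.modelType) return settings.modelType;
  if (env.CLAUDE_CODE_USE_OPENAI) return 'openai';
  return 'anthropic';
}
```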
