151 changes: 151 additions & 0 deletions README.md
@@ -66,6 +66,82 @@
bun run version
```

## Provider support

The restored CLI now has an initial multi-provider path for GitHub-backed inference.

Supported providers today:

- `github-models`: OpenAI-compatible GitHub Models endpoint
- `github-copilot`: GitHub Copilot account-backed endpoint for Copilot-hosted Claude models

Authentication lookup order for both providers is:

- provider-specific env var
- `GH_TOKEN`
- `GITHUB_TOKEN`
- `gh auth token`
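
As an illustration, the lookup order above could be implemented roughly like this (a sketch only; the function and variable names here, including the placeholder provider-specific variable, are hypothetical and not the actual source):

```typescript
// Hypothetical sketch of the documented token lookup order.
// `providerEnvVar` stands in for the provider-specific variable,
// and `ghCliToken` wraps a call like `gh auth token`.
type Env = Record<string, string | undefined>

function resolveGitHubToken(
  env: Env,
  providerEnvVar: string,
  ghCliToken: () => string | undefined,
): { token: string; source: string } | undefined {
  // Environment variables win, in the documented order.
  for (const name of [providerEnvVar, 'GH_TOKEN', 'GITHUB_TOKEN']) {
    const value = env[name]
    if (value) return { token: value, source: name }
  }
  // Fall back to the GitHub CLI's stored credential.
  const cliToken = ghCliToken()
  return cliToken ? { token: cliToken, source: 'gh auth token' } : undefined
}
```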

Log in with the GitHub CLI first if you want account-based auth instead of manually setting a token:

```bash
gh auth login
```

Use GitHub Models with the restored Claude Code runtime:

```bash
bun run dev --settings '{"provider":"github-models"}'
```

Or choose a specific GitHub Models model:

```bash
bun run dev --settings '{"provider":"github-models"}' --model "openai/gpt-4.1"
```

Use GitHub Copilot with the restored Claude Code runtime:

```bash
bun run dev --settings '{"provider":"github-copilot"}'
```

You can also switch providers interactively inside the CLI:

```text
/provider
```

The picker shows the available providers, saves the selection to user
settings, and then opens the existing model picker for the selected provider.

Useful shortcuts:

- `/provider` opens the visual provider picker
- `/provider info` shows the current provider and any active environment override
- `/provider github-copilot` switches directly to a specific provider
- `/model` still works after provider changes and only shows models supported by the active provider
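
A rough sketch of how these `/provider` arguments might be dispatched (names and shapes are hypothetical; the real implementation lives under `src/commands/provider/`):

```typescript
// Hypothetical dispatch for the `/provider` slash command's argument.
type ProviderId = 'github-models' | 'github-copilot'
const PROVIDERS: ProviderId[] = ['github-models', 'github-copilot']

function dispatchProviderCommand(
  arg: string | undefined,
  current: ProviderId,
):
  | { kind: 'open-picker' }
  | { kind: 'show-info'; provider: ProviderId }
  | { kind: 'switch'; provider: ProviderId }
  | { kind: 'error'; message: string } {
  if (!arg) return { kind: 'open-picker' } // bare `/provider`
  if (arg === 'info') return { kind: 'show-info', provider: current } // `/provider info`
  if ((PROVIDERS as string[]).includes(arg))
    return { kind: 'switch', provider: arg as ProviderId } // `/provider github-copilot`
  return { kind: 'error', message: `Unknown provider: ${arg}` }
}
```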

Switch to a validated Copilot-hosted Claude model:

```bash
bun run dev --settings '{"provider":"github-copilot"}' --model "claude-opus-4.6"
```

Copilot-backed models currently validated with the Claude Code runtime and tool loop are:

- `claude-sonnet-4.6`
- `claude-opus-4.6`
- `claude-haiku-4.5`
- `claude-sonnet-4.5`
- `claude-opus-4.5`
- `claude-sonnet-4`

Current limitations:

- Copilot-hosted Claude models already work through the existing Claude Code agent/runtime path.
- Copilot-hosted GPT/Grok-style models are not fully wired into the Claude Code runtime yet, because they primarily require Copilot's `/responses` API path rather than the current chat/messages shim.

## Chinese documentation

# Restored Claude Code source
@@ -131,3 +207,78 @@
```bash
bun run version
```

### Provider support

The restored CLI now includes an initial multi-provider path that can connect to GitHub-backed inference services.

Providers available today:

- `github-models`: the OpenAI-compatible GitHub Models endpoint
- `github-copilot`: a GitHub Copilot account-backed endpoint, currently focused on Copilot-hosted Claude models

The authentication lookup order for both providers is:

- provider-specific environment variable
- `GH_TOKEN`
- `GITHUB_TOKEN`
- `gh auth token`

If you want to use your GitHub account session instead of setting a token manually, run this first:

```bash
gh auth login
```

Run the restored Claude Code runtime with GitHub Models:

```bash
bun run dev --settings '{"provider":"github-models"}'
```

Pick a specific GitHub Models model:

```bash
bun run dev --settings '{"provider":"github-models"}' --model "openai/gpt-4.1"
```

Run the restored Claude Code runtime with GitHub Copilot:

```bash
bun run dev --settings '{"provider":"github-copilot"}'
```

You can also switch providers interactively from inside the CLI:

```text
/provider
```

The picker lists the available providers, writes your selection to user settings, and then opens the model picker for the chosen provider.

Common commands:

- `/provider`: open the visual provider picker
- `/provider info`: show the current provider and whether an environment variable override is active
- `/provider github-copilot`: switch directly to a specific provider
- `/model`: switch models after changing provider; only models supported by the current provider are shown

Switch to a Copilot-hosted Claude model that has been verified to work:

```bash
bun run dev --settings '{"provider":"github-copilot"}' --model "claude-opus-4.6"
```

Copilot models verified so far to run the Claude Code runtime and tool loop:

- `claude-sonnet-4.6`
- `claude-opus-4.6`
- `claude-haiku-4.5`
- `claude-sonnet-4.5`
- `claude-opus-4.5`
- `claude-sonnet-4`

Current limitations:

- Copilot-hosted Claude models already work through the existing Claude Code agent/runtime path.
- Copilot-hosted GPT/Grok-style models are not fully wired in yet, because they primarily require Copilot's `/responses` API while the current adapter layer mainly speaks the chat/messages path.
924 changes: 894 additions & 30 deletions bun.lock

Large diffs are not rendered by default.

28 changes: 28 additions & 0 deletions package.json
@@ -27,34 +27,58 @@
"@ant/computer-use-mcp": "file:./shims/ant-computer-use-mcp",
"@ant/computer-use-swift": "file:./shims/ant-computer-use-swift",
"@anthropic-ai/claude-agent-sdk": "*",
"@anthropic-ai/bedrock-sdk": "*",
"@anthropic-ai/foundry-sdk": "*",
"@anthropic-ai/mcpb": "*",
"@anthropic-ai/sandbox-runtime": "*",
"@anthropic-ai/sdk": "*",
"@anthropic-ai/vertex-sdk": "*",
"@aws-sdk/client-bedrock": "*",
"@aws-sdk/client-bedrock-runtime": "*",
"@aws-sdk/client-sts": "*",
"@aws-sdk/credential-provider-node": "*",
"@aws-sdk/credential-providers": "*",
"@azure/identity": "*",
"@commander-js/extra-typings": "*",
"@growthbook/growthbook": "*",
"@modelcontextprotocol/sdk": "*",
"@opentelemetry/api": "*",
"@opentelemetry/api-logs": "*",
"@opentelemetry/core": "*",
"@opentelemetry/exporter-logs-otlp-grpc": "*",
"@opentelemetry/exporter-logs-otlp-http": "*",
"@opentelemetry/exporter-logs-otlp-proto": "*",
"@opentelemetry/exporter-metrics-otlp-grpc": "*",
"@opentelemetry/exporter-metrics-otlp-http": "*",
"@opentelemetry/exporter-metrics-otlp-proto": "*",
"@opentelemetry/exporter-prometheus": "*",
"@opentelemetry/exporter-trace-otlp-grpc": "*",
"@opentelemetry/exporter-trace-otlp-http": "*",
"@opentelemetry/exporter-trace-otlp-proto": "*",
"@opentelemetry/resources": "*",
"@opentelemetry/sdk-logs": "*",
"@opentelemetry/sdk-metrics": "*",
"@opentelemetry/sdk-trace-base": "*",
"@opentelemetry/semantic-conventions": "*",
"@smithy/core": "*",
"@smithy/node-http-handler": "*",
"ajv": "*",
"asciichart": "*",
"audio-capture-napi": "file:./shims/audio-capture-napi",
"auto-bind": "*",
"axios": "*",
"bidi-js": "*",
"cacache": "*",
"chalk": "*",
"chokidar": "*",
"cli-highlight": "*",
"cli-boxes": "*",
"code-excerpt": "*",
"diff": "*",
"emoji-regex": "*",
"env-paths": "*",
"execa": "*",
"fflate": "*",
"figures": "*",
"fuse.js": "*",
"get-east-asian-width": "*",
@@ -64,23 +88,27 @@
"ignore": "*",
"indent-string": "*",
"ink": "*",
"image-processor-napi": "file:./shims/image-processor-napi",
"jsonc-parser": "*",
"lodash-es": "*",
"lru-cache": "*",
"marked": "*",
"p-map": "*",
"picomatch": "*",
"plist": "*",
"proper-lockfile": "*",
"qrcode": "*",
"react": "*",
"react-reconciler": "*",
"semver": "*",
"sharp": "*",
"shell-quote": "*",
"signal-exit": "*",
"stack-utils": "*",
"strip-ansi": "*",
"supports-hyperlinks": "*",
"tree-kill": "*",
"turndown": "*",
"type-fest": "*",
"undici": "*",
"usehooks-ts": "*",
11 changes: 11 additions & 0 deletions shims/audio-capture-napi/index.ts
@@ -0,0 +1,11 @@
export {
isNativeAudioAvailable,
isNativePlaying,
isNativeRecordingActive,
microphoneAuthorizationStatus,
startNativePlayback,
startNativeRecording,
stopNativePlayback,
stopNativeRecording,
writeNativePlaybackData,
} from '../../vendor/audio-capture-src/index.ts'
6 changes: 6 additions & 0 deletions shims/audio-capture-napi/package.json
@@ -0,0 +1,6 @@
{
"name": "audio-capture-napi",
"version": "0.0.0-restored",
"type": "module",
"main": "./index.ts"
}
2 changes: 2 additions & 0 deletions shims/image-processor-napi/index.ts
@@ -0,0 +1,2 @@
export { default, getNativeModule, sharp } from '../../vendor/image-processor-src/index.ts'
export type { ClipboardImageResult, NativeModule } from '../../vendor/image-processor-src/index.ts'
6 changes: 6 additions & 0 deletions shims/image-processor-napi/package.json
@@ -0,0 +1,6 @@
{
"name": "image-processor-napi",
"version": "0.0.0-restored",
"type": "module",
"main": "./index.ts"
}
12 changes: 12 additions & 0 deletions src/cli/handlers/auth.ts
@@ -36,6 +36,10 @@ import { isRunningOnHomespace } from '../../utils/envUtils.js'
import { errorMessage } from '../../utils/errors.js'
import { logError } from '../../utils/log.js'
import { getAPIProvider } from '../../utils/model/providers.js'
import {
getActiveProviderConfig,
getProviderDisplayName,
} from '../../utils/model/providerConfig.js'
import { getInitialSettings } from '../../utils/settings/settings.js'
import { jsonStringify } from '../../utils/slowOperations.js'
import {
@@ -249,6 +253,8 @@ export async function authStatus(opts: {
authMethod = 'third_party'
} else if (authTokenSource === 'claude.ai') {
authMethod = 'claude.ai'
} else if (authTokenSource === 'providerAuthToken') {
authMethod = 'provider_token'
} else if (authTokenSource === 'apiKeyHelper') {
authMethod = 'api_key_helper'
} else if (authTokenSource !== 'none') {
@@ -292,6 +298,8 @@
}
} else {
const apiProvider = getAPIProvider()
const activeProvider = getActiveProviderConfig()
const providerName = getProviderDisplayName()
const resolvedApiKeySource =
apiKeySource !== 'none'
? apiKeySource
@@ -303,6 +311,10 @@
authMethod,
apiProvider,
}
if (activeProvider.id !== apiProvider || providerName !== 'Anthropic') {
output.providerId = activeProvider.id
output.providerName = providerName
}
if (resolvedApiKeySource) {
output.apiKeySource = resolvedApiKeySource
}
7 changes: 6 additions & 1 deletion src/cli/print.ts
@@ -1217,7 +1217,12 @@ function runHeadlessStreaming(
...(hasAutoMode && { supportsAutoMode: true }),
}
})
let activeUserSpecifiedModel = options.userSpecifiedModel
let activeUserSpecifiedModel =
options.userSpecifiedModel ?? getDefaultMainLoopModel()

if (!options.userSpecifiedModel) {
setMainLoopModelOverride(activeUserSpecifiedModel)
}

function injectModelSwitchBreadcrumbs(
modelArg: string,
2 changes: 2 additions & 0 deletions src/commands.ts
@@ -175,6 +175,7 @@ import exportCommand from './commands/export/index.js'
import model from './commands/model/index.js'
import tag from './commands/tag/index.js'
import outputStyle from './commands/output-style/index.js'
import provider from './commands/provider/index.js'
import remoteEnv from './commands/remote-env/index.js'
import upgrade from './commands/upgrade/index.js'
import {
@@ -289,6 +290,7 @@ const COMMANDS = memoize((): Command[] => [
mobile,
model,
outputStyle,
provider,
remoteEnv,
plugin,
pr_comments,
16 changes: 16 additions & 0 deletions src/commands/provider/index.ts
@@ -0,0 +1,16 @@
import type { Command } from '../../commands.js'
import { shouldInferenceConfigCommandBeImmediate } from '../../utils/immediateCommand.js'
import { getProviderDisplayName } from '../../utils/model/providerConfig.js'

export default {
type: 'local-jsx',
name: 'provider',
get description() {
return `Set the model provider for Claude Code (currently ${getProviderDisplayName()})`
},
argumentHint: '[provider]',
get immediate() {
return shouldInferenceConfigCommandBeImmediate()
},
load: () => import('./provider.js'),
} satisfies Command