
feat(provider): add ChatGPT Codex provider flow #397

Open
EngineerProjects wants to merge 1 commit into Gitlawb:main from EngineerProjects:feat/codex-provider

Conversation

@EngineerProjects

Add Codex as a first-class provider in the provider manager, including reuse of detected local Codex login, built-in Codex model choices, and updated docs for the new setup path.

Summary

  • Added codex as a first-class provider type in the provider manager and provider profile config.

  • Exposed a dedicated ChatGPT / Codex preset in the modern provider manager UI.

  • Added detection of existing local Codex credentials so users can reuse a detected local Codex login instead of always entering a token manually.

  • Added a built-in Codex model selection step so users can choose known Codex model options without typing model IDs by hand.

  • Updated provider environment application logic so Codex profiles use CODEX_API_KEY instead of OPENAI_API_KEY.

  • Updated docs to explain the new Codex provider flow and local-login reuse path.

  • This change was made because Codex support already existed in the runtime/backend but was not exposed cleanly in the modern provider-manager flow.

  • This makes the Codex path discoverable, easier to configure, and more aligned with how users actually authenticate when they already use the Codex CLI locally.
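The env-application rule above (Codex profiles use CODEX_API_KEY instead of OPENAI_API_KEY) can be sketched roughly as follows. The `ProviderProfile` shape and the `applyProviderEnv` name are illustrative only, not the actual exports of this codebase:

```typescript
// Illustrative sketch: apply a provider profile's credentials to an env map.
// ProviderProfile and applyProviderEnv are hypothetical names.
type ProviderProfile = {
  provider: 'openai' | 'codex';
  apiKey: string;
};

function applyProviderEnv(
  profile: ProviderProfile,
  env: Record<string, string>
): Record<string, string> {
  const next = { ...env };
  if (profile.provider === 'codex') {
    // Codex profiles set CODEX_API_KEY and must not leave a stale
    // OPENAI_API_KEY behind, so generic OpenAI handling is not triggered.
    next.CODEX_API_KEY = profile.apiKey;
    delete next.OPENAI_API_KEY;
  } else {
    next.OPENAI_API_KEY = profile.apiKey;
  }
  return next;
}
```

Keeping this branch separate from the generic OpenAI-compatible path is what lets Codex-specific handling stay isolated, as noted under developer impact below.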

Impact

  • User-facing impact:

    • Users can now select ChatGPT / Codex directly in the provider manager.
    • Users can reuse an existing local Codex login when available.
    • Users can choose from built-in Codex model options instead of manually entering a model name.
    • Docs now describe the Codex setup path more clearly.
  • Developer/maintainer impact:

    • Introduces codex as an explicit provider profile type in the provider manager flow.
    • Keeps Codex-specific environment handling isolated from generic OpenAI-compatible provider handling.
    • Adds focused tests covering Codex preset defaults and env application behavior.
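The local-login reuse mentioned above might be detected along these lines, in the spirit of the `resolveCodexApiCredentials()` helper referenced later in review. The `~/.codex/auth.json` location and its JSON shape are assumptions about the Codex CLI, not confirmed against this codebase:

```typescript
// Hypothetical sketch of local Codex credential detection.
// Assumption: the Codex CLI stores its login in ~/.codex/auth.json.
import { existsSync, readFileSync } from 'node:fs';
import { homedir } from 'node:os';
import { join } from 'node:path';

type CodexCredentials = { apiKey: string; source: 'env' | 'local-login' };

function resolveCodexApiCredentials(): CodexCredentials | null {
  // Prefer an explicit CODEX_API_KEY from the environment.
  const envKey = process.env.CODEX_API_KEY;
  if (envKey) return { apiKey: envKey, source: 'env' };

  // Otherwise look for an existing local Codex CLI login on disk.
  const authPath = join(homedir(), '.codex', 'auth.json');
  if (existsSync(authPath)) {
    try {
      const auth = JSON.parse(readFileSync(authPath, 'utf8'));
      if (typeof auth.OPENAI_API_KEY === 'string') {
        return { apiKey: auth.OPENAI_API_KEY, source: 'local-login' };
      }
    } catch {
      // Corrupt or unreadable auth file: treat as no local login.
    }
  }
  return null;
}
```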

Testing

  • bun run build
  • bun run smoke
  • bun run dev
  • focused tests:
    • bun test src/utils/providerProfiles.test.ts
    • bun test src/commands/provider/provider.test.tsx
    • bun test src/utils/providerProfile.test.ts

Notes

  • provider/model path tested:

    • Codex via https://chatgpt.com/backend-api/codex
    • default model path: codexplan
  • screenshots attached (if UI changed):

  • follow-up work or known limitations:
    • This PR improves provider-manager UX and local credential reuse, but does not add a browser-based ChatGPT/Codex login flow inside OpenClaude itself.
    • Model choices are built from known Codex options in the codebase, not from live discovery against the backend.


Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@kevincodex1 kevincodex1 requested review from Vasanthdev2004 and gnanam1990 and removed request for gnanam1990 April 6, 2026 10:41
Collaborator


The current model-option injection is too broad for Codex. On this head, Codex models are added for both openai and codex providers, which means ordinary OpenAI-compatible profiles can now surface Codex-only models. Direct repro: with CLAUDE_CODE_USE_OPENAI=1, OPENAI_BASE_URL=https://api.openai.com/v1, and OPENAI_MODEL=gpt-5.4, getAPIProvider() flips to codex, but resolveProviderRequest() still keeps transport: 'chat_completions' and baseUrl: 'https://api.openai.com/v1'. So selecting a Codex model from a normal OpenAI profile silently pushes a Codex model string through a non-Codex backend. The Codex option list should only appear for actual Codex profiles, not every OpenAI-compatible provider.

Collaborator

@Vasanthdev2004 Vasanthdev2004 left a comment


Rechecked the latest head 4c6b9475848cdf9e75277142ec4f3bdae760c805 against current origin/main.

I still can't approve this because there is one real provider-routing bug in the current implementation.

Current blocker:

  1. Codex model options are injected into ordinary OpenAI-compatible providers, which breaks provider semantics.
    In src/utils/model/modelOptions.ts, the branch adds getCodexModelOptions() for both openai and codex providers. That means normal OpenAI-compatible profiles (OpenAI API, Groq, DeepSeek, LM Studio, etc.) can now surface Codex-only models.

    Direct repro on this head:

    • CLAUDE_CODE_USE_OPENAI=1
    • OPENAI_BASE_URL=https://api.openai.com/v1
    • OPENAI_MODEL=gpt-5.4

    Results:

    • getAPIProvider() -> codex
    • resolveProviderRequest({ model: 'gpt-5.4' }) -> transport: 'chat_completions', baseUrl: 'https://api.openai.com/v1'

    So selecting a Codex model from a normal OpenAI profile silently flips provider detection to codex while still sending the request through a non-Codex backend. The Codex option list should only appear for actual Codex profiles, not every OpenAI-compatible provider.
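    The gating being requested could look roughly like the sketch below. `getCodexModelOptions()` is named in this review; `getOpenAICompatibleModelOptions()`, `getModelOptions()`, and the option contents are hypothetical stand-ins for the real code in src/utils/model/modelOptions.ts:

    ```typescript
    // Illustrative fix: Codex-only model options appear only for provider
    // 'codex', never for generic OpenAI-compatible providers.
    type ModelOption = { id: string; label: string };

    function getCodexModelOptions(): ModelOption[] {
      // 'codexplan' is the default model path noted in this PR.
      return [{ id: 'codexplan', label: 'Codex (plan)' }];
    }

    function getOpenAICompatibleModelOptions(): ModelOption[] {
      // Placeholder list for OpenAI API, Groq, DeepSeek, LM Studio, etc.
      return [{ id: 'gpt-4o', label: 'GPT-4o' }];
    }

    function getModelOptions(provider: string): ModelOption[] {
      // Only genuine Codex profiles see the Codex list; every other
      // OpenAI-compatible provider keeps its own options.
      if (provider === 'codex') return getCodexModelOptions();
      return getOpenAICompatibleModelOptions();
    }
    ```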

What I verified on this head:

  • direct repro of the provider-flip behavior above on both OpenAI and another OpenAI-compatible base URL
  • Codex profile persistence itself is wired correctly (provider: 'codex' survives sanitization and applies CODEX_API_KEY instead of OPENAI_API_KEY)
  • local-login detection in ProviderManager is internally consistent with resolveCodexApiCredentials() on this machine
  • bun test src/utils/providerProfiles.test.ts src/commands/provider/provider.test.tsx src/utils/providerProfile.test.ts -> 53 pass
  • bun run build -> success
  • bun run smoke -> success

I didn't find a compile/runtime blocker outside that, but I wouldn't merge this until Codex-only model options stop appearing under generic OpenAI-compatible providers.

@gnanam1990
Collaborator

Thanks for the work on this. The Codex provider flow is a nice addition, but I do not think this one is ready to land just yet.

The main blocker is that Codex-only model options are still being added under the generic OpenAI-compatible provider path. In src/utils/model/modelOptions.ts, getCodexModelOptions() is pushed into the broader OpenAI-compatible option list, so ordinary OpenAI-compatible profiles can start surfacing Codex-only models. That blurs provider boundaries and can lead users into invalid provider/model combinations.

There is also a current CI failure on the branch: the failing log shows new provider profile tests calling helpers that are not available in that test scope. So at the moment the PR is shipping a new provider flow while its own CI is red on the new tests.

Please keep Codex model choices limited to actual Codex profiles, and fix the failing test imports/setup so the branch is green before merge.
