This guide covers the full installation and first-run flow for oc-codex-multi-auth.
**Caution**
This plugin is for personal development use with your own ChatGPT Plus/Pro subscription.
- It is not intended for commercial resale, shared multi-user access, or production services.
- It uses official OAuth authentication, but it is an independent open-source project and is not affiliated with OpenAI.
- For production applications, use the OpenAI Platform API.
| Requirement | Notes |
|---|---|
| OpenCode | Install from opencode.ai |
| ChatGPT Plus or Pro | Required for OAuth access and model entitlements |
| Node.js 20+ | Needed for local OpenCode runtime and plugin installation |
```bash
npx -y oc-codex-multi-auth@latest
opencode auth login
opencode run "Explain this repository" --model=openai/gpt-5.5 --variant=medium
```

The installer updates `~/.config/opencode/opencode.json`, backs up the previous config, normalizes the plugin entry to `oc-codex-multi-auth`, and clears the cached plugin copy so OpenCode reinstalls the latest package.
By default, the installer writes the compact UI config so the model picker shows base OAuth model families such as gpt-5.5 and gpt-5.5-fast, then the separate model variant picker handles none, low, medium, high, and xhigh. Rerunning the default installer also removes explicit preset entries and stale base models left by earlier plugin catalogs.
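If you prefer to snapshot your config yourself before rerunning the installer, the backup step is easy to reproduce by hand. This is an illustrative sketch, not the installer's actual code; only the config path comes from the text above.

```shell
# Illustrative sketch (not the installer's real code): snapshot a config
# file before changing it, the way the installer backs up
# ~/.config/opencode/opencode.json automatically.
backup_config() {
  config="$1"
  if [ -f "$config" ]; then
    cp "$config" "$config.bak.$(date +%Y%m%d%H%M%S)"
    echo "backed up $config"
  else
    echo "no existing config to back up"
  fi
}

# Example: backup_config "$HOME/.config/opencode/opencode.json"
```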
If you want direct explicit selector IDs such as openai/gpt-5.5-medium, use the full catalog:
```bash
npx -y oc-codex-multi-auth@latest --full
```

If you explicitly want the older explicit-only layout, use:

```bash
npx -y oc-codex-multi-auth@latest --legacy
```

Use the steps below only when you want to develop or test the plugin locally.
```bash
git clone https://github.com/ndycode/oc-codex-multi-auth.git
cd oc-codex-multi-auth
npm ci
npm run build
```

Point OpenCode at the built plugin:
```json
{
  "plugin": ["file:///absolute/path/to/oc-codex-multi-auth/dist"]
}
```

Use the built `dist/` directory, not the repository root.
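To avoid typing the absolute path by hand, you can generate the `file://` plugin entry from inside the checkout. A small sketch; the helper name and `printf` formatting are mine, not part of the plugin:

```shell
# Hypothetical helper: print a plugin entry pointing at the built dist/
# directory as an absolute file:// URL, given the checkout directory.
make_plugin_entry() {
  dist="$(cd "$1" && pwd)/dist"
  printf '{\n  "plugin": ["file://%s"]\n}\n' "$dist"
}

# Example: make_plugin_entry .   # run from the oc-codex-multi-auth checkout
```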
Run:

```bash
opencode auth login
```

Then choose:
OpenAI → Codex OAuth (ChatGPT Plus/Pro)
The browser-based OAuth flow uses the same local callback port as Codex CLI. The authorize redirect is http://localhost:1455/auth/callback, while the local callback server binds http://127.0.0.1:1455/auth/callback and [::1]:1455 for dual-stack localhost redirects.
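Login failures complaining that the address is already in use usually mean another process (often Codex CLI itself) is still holding port 1455. A quick probe before logging in; the `/dev/tcp` trick is a bash feature, not part of the plugin:

```shell
# Probe whether anything is already listening on the local OAuth
# callback port. Uses bash's /dev/tcp pseudo-device; in shells without
# it, the open simply fails and the port is reported as free.
port_in_use() {
  (exec 3<>"/dev/tcp/127.0.0.1/$1") 2>/dev/null
}

if port_in_use 1455; then
  echo "port 1455 is in use; stop the conflicting process before logging in"
else
  echo "port 1455 is free"
fi
```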
If you authenticated before the connector scopes were added, re-run opencode auth login. Current account records persist the granted OAuth scope and accounts missing api.connectors.read / api.connectors.invoke are marked for re-auth instead of being silently reused.
If you are on SSH, WSL, or another environment where the browser callback flow is inconvenient:
- rerun `opencode auth login`
- choose `Codex OAuth (Device Code)`
- open the verification link, enter the one-time code, and wait for login to finish
- if device code is unavailable on your auth server, fall back to `Codex OAuth (Manual URL Paste)`
If you are not using the installer, edit ~/.config/opencode/opencode.json manually:
```json
{
  "$schema": "https://opencode.ai/config.json",
  "plugin": ["oc-codex-multi-auth"]
}
```

The repository ships two supported templates:
| OpenCode version | Template |
|---|---|
| v1.0.210+ | `config/opencode-modern.json` |
| v1.0.209 and earlier | `config/opencode-legacy.json` |
The templates include the supported GPT-5/Codex families, the required `store: false` handling, and `reasoning.encrypted_content` for multi-turn sessions.
Current templates ship 9 base model families and 36 presets overall, exposed either as 36 modern variants or as 36 legacy explicit entries.
On OpenCode v1.0.210+, the modern template intentionally shows 9 base model entries because the additional presets are selected through --variant instead of separate model keys.
gpt-5.5-pro is not shipped in the Codex templates because it is ChatGPT-only, not Codex-routable. Add entitlement-gated Spark variants manually only when your workspace supports them.
Run one of these commands:
```bash
# Recommended current GPT-5.5 path
opencode run "Create a short TODO list for this repo" --model=openai/gpt-5.5 --variant=medium
opencode run "Create a short TODO list for this repo" --model=openai/gpt-5.5-fast --variant=medium
opencode run "Inspect the retry logic and summarize it" --model=openai/gpt-5-codex --variant=high

# Direct selector IDs, only after installing with --full
opencode run "Create a short TODO list for this repo" --model=openai/gpt-5.5-medium
```

If you want to verify request routing, run a request with logging enabled:
```bash
ENABLE_PLUGIN_REQUEST_LOGGING=1 opencode run "test" --model=openai/gpt-5.5 --variant=medium
```

The first request should create logs under `~/.opencode/logs/codex-plugin/`.
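To confirm that the log directory was created and see the newest files, a quick check is enough; this is plain `ls`, nothing plugin-specific:

```shell
# List the newest request logs written by the plugin, if any exist yet.
list_recent_logs() {
  dir="$1"
  if [ -d "$dir" ]; then
    ls -t "$dir" | head -n 3
  else
    echo "no logs yet at $dir"
  fi
}

list_recent_logs "$HOME/.opencode/logs/codex-plugin"
```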
Use opencode debug config when you want to verify that template-defined or custom models were merged into your effective config. The default install exposes compact OAuth model entries such as gpt-5.5 and gpt-5.5-fast; --full additionally exposes explicit entries such as gpt-5.5-medium / gpt-5.5-fast-medium / gpt-5.5-high.
The plugin can manage multiple ChatGPT accounts and choose the healthiest account or workspace for each request.
After your first successful login, you can add more accounts by running opencode auth login again or by using the guided commands below.
These commands are useful after installation:
```bash
codex-setup
codex-help topic="setup"
codex-doctor
codex-next
```
Notes:
- `codex-switch`, `codex-label`, and `codex-remove` can show interactive account pickers when `index` is omitted in a supported terminal.
- The plugin can show a startup preflight summary with the current account health state and suggested next step.
If you want conservative retry behavior while learning the workflow, enable beginner safe mode:
```json
{
  "beginnerSafeMode": true
}
```

Or via environment variable:

```bash
CODEX_AUTH_BEGINNER_SAFE_MODE=1 opencode
```

This mode forces a more conservative retry profile and reduces the chance of long retry loops while you are debugging setup issues.
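The plugin's exact retry profile is internal, but the idea behind the conservative mode is a bounded retry schedule. As a rough illustration only; the attempt count, delay, and wrapper name here are invented for this sketch:

```shell
# Illustrative only: a conservative retry wrapper in the spirit of
# beginner safe mode -- few attempts, short fixed backoff, loud failure.
run_with_retries() {
  attempts=3
  delay=1
  i=1
  while [ "$i" -le "$attempts" ]; do
    if "$@"; then
      return 0
    fi
    echo "attempt $i failed; retrying in ${delay}s" >&2
    sleep "$delay"
    i=$((i + 1))
  done
  echo "giving up after $attempts attempts" >&2
  return 1
}

# Example: run_with_retries opencode run "test" --model=openai/gpt-5.5
```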
From npm:
```bash
npx -y oc-codex-multi-auth@latest
```

From a local clone:
```bash
git pull
npm ci
npm run build
```