15 changes: 13 additions & 2 deletions README.md
@@ -1,7 +1,7 @@
<div align="center">
<img src="img/team.png" alt="Pantheon agents" width="420">
<p><i>Six divine beings emerged from the dawn of code, each an immortal master of their craft, awaiting your command to forge order from chaos and build what was once thought impossible.</i></p>
<p><b>Multi Agent Suite</b> · Mix any models · Auto delegate tasks · Now with native Antigravity support</p>
<p><b>Multi Agent Suite</b> · Mix any models · Auto delegate tasks · Antigravity + Chutes ready</p>
</div>

---
@@ -14,6 +14,12 @@
bunx oh-my-opencode-slim@latest install
```

The installer can refresh and use OpenCode free models directly:

```bash
bunx oh-my-opencode-slim@latest install --no-tui --kimi=yes --openai=yes --antigravity=yes --chutes=yes --opencode-free=yes --opencode-free-model=auto --tmux=no --skills=yes
```

Then authenticate:

```bash
@@ -22,7 +28,12 @@ opencode auth login

Run `ping all agents` to verify everything works.

> **💡 Models are fully customizable.** Edit `~/.config/opencode/oh-my-opencode-slim.json` to assign any model to any agent. Supports Kimi, OpenAI, and Antigravity (Google) providers.
OpenCode free-model mode uses `opencode models --refresh --verbose`, filters to free `opencode/*` models, and applies coding-first selection:
- OpenCode-only mode can use multiple OpenCode free models across agents.
- Hybrid mode can combine OpenCode free models with OpenAI, Kimi, and/or Antigravity.
- In hybrid mode, `designer` stays on the external provider mapping.
- Chutes mode auto-selects primary/support models with daily-cap awareness (300/2000/5000).
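
The free-model filtering step can be sketched in shell; the catalog lines below are hypothetical stand-ins for the real output of `opencode models --refresh --verbose`:

```shell
# Hypothetical catalog sample: one model id per line (real ids come from
# `opencode models --refresh --verbose`).
printf 'opencode/glm-4.7-free\nopencode/gpt-5-nano\nchutes/kimi-k2.5\n' \
  | grep '^opencode/'   # keep only the opencode/* candidates
```

Only the `opencode/*` lines survive the filter; coding-first ranking then happens inside the installer.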

> **💡 Models are fully customizable.** Edit `~/.config/opencode/oh-my-opencode-slim.json` (or `.jsonc` for comments support) to assign any model to any agent.
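
A minimal sketch of that file, reusing the preset shape shown elsewhere in these docs (the preset name `my-mix` and the model assignments are illustrative, not recommendations):

```json
{
  "preset": "my-mix",
  "presets": {
    "my-mix": {
      "orchestrator": { "model": "kimi-for-coding/k2p5", "skills": ["*"], "mcps": ["websearch"] },
      "explorer": { "model": "opencode/gpt-5-nano", "skills": [], "mcps": [] }
    }
  }
}
```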

### For LLM Agents
18 changes: 8 additions & 10 deletions docs/antigravity.md
@@ -23,7 +23,7 @@
The installer automatically:
- Adds `opencode-antigravity-auth@latest` plugin
- Configures Google provider with all Antigravity and Gemini CLI models
- Sets up agent mapping (Kimi/GPT for Orchestrator/Oracle, Antigravity for others)
- Sets up Antigravity-focused agent mapping presets

## Models Available

@@ -86,7 +86,7 @@ When you install with `--antigravity=yes`, the preset depends on other providers

### antigravity-mixed-both (Kimi + OpenAI + Antigravity)
- **Orchestrator**: Kimi k2p5
- **Oracle**: GPT-5.2-codex
- **Oracle**: OpenAI model
- **Explorer/Librarian/Designer/Fixer**: Gemini 3 Flash (Antigravity)

### antigravity-mixed-kimi (Kimi + Antigravity)
@@ -96,7 +96,7 @@ When you install with `--antigravity=yes`, the preset depends on other providers

### antigravity-mixed-openai (OpenAI + Antigravity)
- **Orchestrator**: Gemini 3 Flash (Antigravity)
- **Oracle**: GPT-5.2-codex
- **Oracle**: OpenAI model
- **Explorer/Librarian/Designer/Fixer**: Gemini 3 Flash (Antigravity)

### antigravity (Pure Antigravity)
@@ -106,22 +106,20 @@ When you install with `--antigravity=yes`, the preset depends on other providers

## Manual Configuration

If you prefer to configure manually, edit `~/.config/opencode/oh-my-opencode-slim.json`:
Edit `~/.config/opencode/oh-my-opencode-slim.json` (or `.jsonc`) and add the Antigravity preset:
If you prefer to configure manually, edit `~/.config/opencode/oh-my-opencode-slim.json` (or `.jsonc`) and add a pure Antigravity preset:

```json
{
"preset": "antigravity-mixed-both",
"preset": "antigravity",
"presets": {
"antigravity-mixed-both": {
"antigravity": {
"orchestrator": {
"model": "kimi-for-coding/k2p5",
"model": "google/antigravity-gemini-3-flash",
"skills": ["*"],
"mcps": ["websearch"]
},
"oracle": {
"model": "openai/gpt-5.2-codex",
"variant": "high",
"model": "google/antigravity-gemini-3-pro",
"skills": [],
"mcps": []
},
44 changes: 31 additions & 13 deletions docs/installation.md
@@ -24,19 +24,29 @@ bunx oh-my-opencode-slim@latest install
Or use non-interactive mode:

```bash
bunx oh-my-opencode-slim@latest install --no-tui --kimi=yes --openai=yes --antigravity=yes --tmux=no
bunx oh-my-opencode-slim@latest install --no-tui --kimi=yes --openai=yes --antigravity=yes --chutes=yes --opencode-free=yes --opencode-free-model=auto --tmux=no --skills=yes
```

### Provider Options

The installer supports multiple providers:
- **OpenCode Free Models**: Live-refreshed free `opencode/*` models
- **Kimi For Coding**: High-performance coding models
- **OpenAI**: GPT-4 and GPT-3.5 models
- **Antigravity (Google)**: Claude 4.5 and Gemini 3 models via Google's infrastructure
- **Chutes**: Free daily-capped models (`chutes/*`) with dynamic role-aware selection

When OpenCode free mode is enabled, the installer runs:

```bash
opencode models --refresh --verbose
```

It then filters to free `opencode/*` models only, picks a coding-first primary model, and picks a support model for search/implementation agents.
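
The primary/support split can be sketched in shell over a hypothetical pair of free ids; the real ranking is coding-first and lives in the installer, so the `head`/`tail` picks here are only illustrative:

```shell
# Hypothetical free opencode/* candidates (real ids come from the refresh).
free='opencode/glm-4.7-free
opencode/gpt-5-nano'
primary=$(printf '%s\n' "$free" | head -n 1)   # coding-first primary
support=$(printf '%s\n' "$free" | tail -n 1)   # search/implementation support
echo "$primary $support"
```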

Enable during installation:
```bash
bunx oh-my-opencode-slim install --kimi=yes --openai=yes --antigravity=yes
bunx oh-my-opencode-slim install --kimi=yes --openai=yes --antigravity=yes --chutes=yes --opencode-free=yes --opencode-free-model=auto
```

### After Installation
@@ -83,33 +93,41 @@ Ask these questions **one at a time**, waiting for responses:
1. "Do you have access to **Kimi For Coding**?" *(Provides Kimi k1.5 models)*
2. "Do you have access to **OpenAI** API?" *(Enables `openai/` models)*
3. "Do you have access to **Antigravity (Google)**?" *(Enables `google/` models via Antigravity)*
4. "Do you want to use **Chutes**?" *(Enables `chutes/` models with daily-cap aware selection)*
5. "Do you want to use **OpenCode free models**?" *(Refreshes and selects from free `opencode/*` models)*

Help the user understand the tradeoffs:
- Kimi For Coding provides powerful k1.5 models for coding tasks.
- OpenCode free mode discovers the latest free `opencode/*` models using `opencode models --refresh --verbose`.
- OpenCode-only mode can assign more than one OpenCode model across agents.
- Hybrid mode can combine OpenCode free models with OpenAI, Kimi, and/or Antigravity.
- In hybrid mode, `designer` remains on the external provider mapping.
- Chutes selection prioritizes stronger models for orchestrator/oracle and higher-cap models for support agents.
- Kimi For Coding provides powerful coding models.
- OpenAI enables `openai/` models.
- Antigravity (Google) provides Claude 4.5 and Gemini 3 models.
- If the user has **no providers**, the plugin still works using **OpenCode Zen** free models (`opencode/big-pickle`). They can switch to paid providers later by editing `~/.config/opencode/oh-my-opencode-slim.json`.
- OpenAI is optional; it enables `openai/` models.
- If the user has **no providers**, the plugin still works using **OpenCode Zen** free models (`opencode/big-pickle`). They can switch to paid providers later by editing `~/.config/opencode/oh-my-opencode-slim.json` (or `.jsonc`).
- Antigravity (Google) provides Claude and Gemini models via Google infrastructure.
- Chutes provides free daily-capped models and requires `CHUTES_API_KEY`.
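
Chutes reads its key from the environment; a sketch (the value is a placeholder, not a real key):

```shell
# Placeholder value; substitute your real Chutes key.
export CHUTES_API_KEY="your-key-here"
```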

### Step 3: Run the Installer

Based on answers, run:

```bash
bunx oh-my-opencode-slim@latest install --no-tui --kimi=<yes|no> --openai=<yes|no> --antigravity=<yes|no>
bunx oh-my-opencode-slim@latest install --no-tui --kimi=<yes|no> --openai=<yes|no> --antigravity=<yes|no> --chutes=<yes|no> --opencode-free=<yes|no> --opencode-free-model=<id|auto> --tmux=<yes|no> --skills=<yes|no>
```

**Examples:**
```bash
# Kimi + OpenAI + Antigravity
bunx oh-my-opencode-slim@latest install --no-tui --kimi=yes --openai=yes --antigravity=yes --tmux=no
bunx oh-my-opencode-slim@latest install --no-tui --kimi=yes --openai=yes --antigravity=yes --chutes=yes --opencode-free=yes --opencode-free-model=auto --tmux=no --skills=yes

# OpenAI only
bunx oh-my-opencode-slim@latest install --no-tui --kimi=no --openai=yes --antigravity=no --tmux=no
bunx oh-my-opencode-slim@latest install --no-tui --kimi=no --openai=yes --antigravity=no --chutes=no --opencode-free=no --tmux=no --skills=yes

# OpenCode free models only (auto-select)
bunx oh-my-opencode-slim@latest install --no-tui --kimi=no --openai=no --antigravity=no --chutes=no --opencode-free=yes --opencode-free-model=auto --tmux=no --skills=yes

# No providers (Zen free models only)
bunx oh-my-opencode-slim@latest install --no-tui --kimi=no --openai=no --antigravity=no --tmux=no
# OpenCode free models + OpenAI (manual primary model)
bunx oh-my-opencode-slim@latest install --no-tui --kimi=no --openai=yes --antigravity=no --chutes=no --opencode-free=yes --opencode-free-model=opencode/gpt-5-nano --tmux=no --skills=yes
```

The installer automatically:
@@ -224,4 +242,4 @@ See the [Quick Reference](quick-reference.md#tmux-integration) for more details.
```bash
npx skills remove simplify
npx skills remove agent-browser
```
128 changes: 128 additions & 0 deletions docs/provider-combination-matrix.md
@@ -0,0 +1,128 @@
# Provider Combination Test Matrix (2 to 6 Active)

This matrix tests 5 combinations across the 8 provider toggles in this project:

- `openai`
- `anthropic`
- `github-copilot`
- `zai-coding-plan`
- `kimi-for-coding`
- `google` (Antigravity/Gemini)
- `chutes`
- `opencode` free (`useOpenCodeFreeModels`)

## How this was determined

I generated outputs directly from `generateLiteConfig` in `src/cli/providers.ts` using fixed deterministic inputs:

- `selectedOpenCodePrimaryModel = opencode/glm-4.7-free`
- `selectedOpenCodeSecondaryModel = opencode/gpt-5-nano`
- `selectedChutesPrimaryModel = chutes/kimi-k2.5`
- `selectedChutesSecondaryModel = chutes/minimax-m2.1`

This represents the config output shape written by the installer when those selected models are available.
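
As an illustration, Scenario S1 below would produce a config roughly shaped like this; the `fallbacks` key name is an assumption made for this sketch, and the authoritative shape comes from `generateLiteConfig`:

```json
{
  "preset": "openai",
  "presets": {
    "openai": {
      "orchestrator": {
        "model": "openai/gpt-5.3-codex",
        "fallbacks": ["opencode/glm-4.7-free", "opencode/big-pickle"]
      },
      "explorer": {
        "model": "opencode/gpt-5-nano",
        "fallbacks": ["openai/gpt-5.1-codex-mini", "opencode/big-pickle"]
      }
    }
  }
}
```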

## Scenario S1 - 2 providers

Active providers: OpenAI + OpenCode Free

- Preset: `openai`
- Agents:
- `orchestrator`: `openai/gpt-5.3-codex`
- `oracle`: `openai/gpt-5.3-codex` (`high`)
- `designer`: `openai/gpt-5.1-codex-mini` (`medium`)
- `explorer`: `opencode/gpt-5-nano`
- `librarian`: `opencode/gpt-5-nano`
- `fixer`: `opencode/gpt-5-nano`
- Fallback chains:
- `orchestrator`: `openai/gpt-5.3-codex -> opencode/glm-4.7-free -> opencode/big-pickle`
- `oracle`: `openai/gpt-5.3-codex -> opencode/glm-4.7-free -> opencode/big-pickle`
- `designer`: `openai/gpt-5.1-codex-mini -> opencode/glm-4.7-free -> opencode/big-pickle`
- `explorer`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> opencode/big-pickle`
- `librarian`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> opencode/big-pickle`
- `fixer`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> opencode/big-pickle`
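
Each chain is an ordered preference list: the first model whose provider is actually available wins. A minimal shell sketch (the availability set is hypothetical; real resolution happens inside the plugin):

```shell
chain='openai/gpt-5.3-codex opencode/glm-4.7-free opencode/big-pickle'
available='opencode/glm-4.7-free opencode/big-pickle'   # hypothetical availability
for m in $chain; do
  # Emit the first chain entry that appears in the available set, then stop.
  case " $available " in
    *" $m "*) echo "$m"; break ;;
  esac
done
```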

## Scenario S2 - 3 providers

Active providers: OpenAI + Chutes + OpenCode Free

- Preset: `openai`
- Agents:
- `orchestrator`: `openai/gpt-5.3-codex`
- `oracle`: `openai/gpt-5.3-codex` (`high`)
- `designer`: `openai/gpt-5.1-codex-mini` (`medium`)
- `explorer`: `opencode/gpt-5-nano`
- `librarian`: `opencode/gpt-5-nano`
- `fixer`: `opencode/gpt-5-nano`
- Fallback chains:
- `orchestrator`: `openai/gpt-5.3-codex -> chutes/kimi-k2.5 -> opencode/glm-4.7-free -> opencode/big-pickle`
- `oracle`: `openai/gpt-5.3-codex -> chutes/kimi-k2.5 -> opencode/glm-4.7-free -> opencode/big-pickle`
- `designer`: `openai/gpt-5.1-codex-mini -> chutes/kimi-k2.5 -> opencode/glm-4.7-free -> opencode/big-pickle`
- `explorer`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> chutes/minimax-m2.1 -> opencode/big-pickle`
- `librarian`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> chutes/minimax-m2.1 -> opencode/big-pickle`
- `fixer`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> chutes/minimax-m2.1 -> opencode/big-pickle`

## Scenario S3 - 4 providers

Active providers: OpenAI + Copilot + ZAI Plan + OpenCode Free

- Preset: `openai`
- Agents:
- `orchestrator`: `openai/gpt-5.3-codex`
- `oracle`: `openai/gpt-5.3-codex` (`high`)
- `designer`: `openai/gpt-5.1-codex-mini` (`medium`)
- `explorer`: `opencode/gpt-5-nano`
- `librarian`: `opencode/gpt-5-nano`
- `fixer`: `opencode/gpt-5-nano`
- Fallback chains:
- `orchestrator`: `openai/gpt-5.3-codex -> github-copilot/grok-code-fast-1 -> zai-coding-plan/glm-4.7 -> opencode/glm-4.7-free -> opencode/big-pickle`
- `oracle`: `openai/gpt-5.3-codex -> github-copilot/grok-code-fast-1 -> zai-coding-plan/glm-4.7 -> opencode/glm-4.7-free -> opencode/big-pickle`
- `designer`: `openai/gpt-5.1-codex-mini -> github-copilot/grok-code-fast-1 -> zai-coding-plan/glm-4.7 -> opencode/glm-4.7-free -> opencode/big-pickle`
- `explorer`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> github-copilot/grok-code-fast-1 -> zai-coding-plan/glm-4.7 -> opencode/big-pickle`
- `librarian`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> github-copilot/grok-code-fast-1 -> zai-coding-plan/glm-4.7 -> opencode/big-pickle`
- `fixer`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> github-copilot/grok-code-fast-1 -> zai-coding-plan/glm-4.7 -> opencode/big-pickle`

## Scenario S4 - 5 providers

Active providers: OpenAI + Gemini + Chutes + Copilot + OpenCode Free

- Preset: `antigravity-mixed-openai`
- Agents:
- `orchestrator`: `chutes/kimi-k2.5`
- `oracle`: `openai/gpt-5.3-codex` (`high`)
- `designer`: `chutes/kimi-k2.5` (`medium`)
- `explorer`: `opencode/gpt-5-nano`
- `librarian`: `opencode/gpt-5-nano`
- `fixer`: `opencode/gpt-5-nano`
- Fallback chains:
- `orchestrator`: `chutes/kimi-k2.5 -> openai/gpt-5.3-codex -> github-copilot/grok-code-fast-1 -> google/antigravity-gemini-3-flash -> opencode/glm-4.7-free -> opencode/big-pickle`
- `oracle`: `openai/gpt-5.3-codex -> github-copilot/grok-code-fast-1 -> google/antigravity-gemini-3-pro -> chutes/kimi-k2.5 -> opencode/glm-4.7-free -> opencode/big-pickle`
- `designer`: `chutes/kimi-k2.5 -> openai/gpt-5.1-codex-mini -> github-copilot/grok-code-fast-1 -> google/antigravity-gemini-3-flash -> opencode/glm-4.7-free -> opencode/big-pickle`
- `explorer`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> github-copilot/grok-code-fast-1 -> google/antigravity-gemini-3-flash -> chutes/minimax-m2.1 -> opencode/big-pickle`
- `librarian`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> github-copilot/grok-code-fast-1 -> google/antigravity-gemini-3-flash -> chutes/minimax-m2.1 -> opencode/big-pickle`
- `fixer`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> github-copilot/grok-code-fast-1 -> google/antigravity-gemini-3-flash -> chutes/minimax-m2.1 -> opencode/big-pickle`

## Scenario S5 - 6 providers

Active providers: OpenAI + Anthropic + Copilot + ZAI Plan + Chutes + OpenCode Free

- Preset: `openai`
- Agents:
- `orchestrator`: `openai/gpt-5.3-codex`
- `oracle`: `openai/gpt-5.3-codex` (`high`)
- `designer`: `openai/gpt-5.1-codex-mini` (`medium`)
- `explorer`: `opencode/gpt-5-nano`
- `librarian`: `opencode/gpt-5-nano`
- `fixer`: `opencode/gpt-5-nano`
- Fallback chains:
- `orchestrator`: `openai/gpt-5.3-codex -> anthropic/claude-opus-4-6 -> github-copilot/grok-code-fast-1 -> zai-coding-plan/glm-4.7 -> chutes/kimi-k2.5 -> opencode/glm-4.7-free -> opencode/big-pickle`
- `oracle`: `openai/gpt-5.3-codex -> anthropic/claude-opus-4-6 -> github-copilot/grok-code-fast-1 -> zai-coding-plan/glm-4.7 -> chutes/kimi-k2.5 -> opencode/glm-4.7-free -> opencode/big-pickle`
- `designer`: `openai/gpt-5.1-codex-mini -> anthropic/claude-sonnet-4-5 -> github-copilot/grok-code-fast-1 -> zai-coding-plan/glm-4.7 -> chutes/kimi-k2.5 -> opencode/glm-4.7-free -> opencode/big-pickle`
- `explorer`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> anthropic/claude-haiku-4-5 -> github-copilot/grok-code-fast-1 -> zai-coding-plan/glm-4.7 -> chutes/minimax-m2.1 -> opencode/big-pickle`
- `librarian`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> anthropic/claude-sonnet-4-5 -> github-copilot/grok-code-fast-1 -> zai-coding-plan/glm-4.7 -> chutes/minimax-m2.1 -> opencode/big-pickle`
- `fixer`: `opencode/gpt-5-nano -> openai/gpt-5.1-codex-mini -> anthropic/claude-sonnet-4-5 -> github-copilot/grok-code-fast-1 -> zai-coding-plan/glm-4.7 -> chutes/minimax-m2.1 -> opencode/big-pickle`

## Notes

- This matrix shows deterministic `generateLiteConfig` output for the selected combinations.
- If the dynamic planner is used during full install (live model catalog), the generated `dynamic` preset may differ based on discovered models and capabilities.
27 changes: 25 additions & 2 deletions docs/quick-reference.md
@@ -17,6 +17,28 @@ Complete reference for oh-my-opencode-slim configuration and capabilities.

Presets are pre-configured agent model mappings for different provider combinations. The installer generates these automatically based on your available providers, and you can switch between them instantly.

### OpenCode Free Discovery

The installer can discover the latest OpenCode free models by running:

```bash
opencode models --refresh --verbose
```

Selection rules:
- Only free `opencode/*` models are considered.
- A coding-first primary model is selected for orchestration/strategy workloads.
- A support model is selected for research/implementation workloads.
- OpenCode-only mode can assign multiple OpenCode models across agents.
- Hybrid mode can combine OpenCode free models with OpenAI/Kimi/Antigravity; `designer` remains on the external provider mapping.

Useful flags:

```bash
--opencode-free=yes|no
--opencode-free-model=<id|auto>
```

### Switching Presets

**Method 1: Edit Config File**
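
For example, pointing the top-level `preset` key at another generated preset name switches the whole agent mapping (the name shown is one of the presets from these docs):

```json
{
  "preset": "antigravity"
}
```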
@@ -66,13 +88,14 @@ Access Claude 4.5 and Gemini 3 models through Google's Antigravity infrastructure

**Installation:**
```bash
bunx oh-my-opencode-slim install --antigravity=yes
bunx oh-my-opencode-slim install --antigravity=yes --opencode-free=yes --opencode-free-model=auto
```

**Agent Mapping:**
- Orchestrator: Kimi (if available)
- Oracle: GPT (if available)
- Explorer/Librarian/Designer/Fixer: Gemini 3 Flash via Antigravity
- If OpenCode free mode is enabled, Explorer/Librarian/Fixer may use the selected free `opencode/*` support model while `designer` stays on the external mapping

**Authentication:**
```bash
@@ -476,4 +499,4 @@ The installer generates this file based on your providers. You can manually customize it.
| `tmux.main_pane_size` | number | `60` | Main pane size as percentage (20-80) |
| `disabled_mcps` | string[] | `[]` | MCP server IDs to disable globally (e.g., `"websearch"`) |
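
Putting a few of these options together (this sketch assumes `main_pane_size` nests under a `tmux` object, as the dotted key suggests):

```json
{
  "preset": "antigravity",
  "tmux": { "main_pane_size": 60 },
  "disabled_mcps": ["websearch"]
}
```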

> **Note:** Agent configuration should be defined within `presets`. The root-level `agents` field is deprecated.
2 changes: 1 addition & 1 deletion package.json
@@ -1,6 +1,6 @@
{
"name": "oh-my-opencode-slim",
"version": "0.6.4",
"version": "0.7.0",
"description": "Lightweight agent orchestration plugin for OpenCode - a slimmed-down fork of oh-my-opencode",
"main": "dist/index.js",
"types": "dist/index.d.ts",