feat: add defaultModelTier for per-expert provider-aware model selection#623

Merged
FL4TLiN3 merged 7 commits into main from feat/default-model-tier
Feb 25, 2026
Conversation

@FL4TLiN3
Contributor

Summary

  • Add defaultModelTier field ("low" / "middle" / "high") to expert config in perstack.toml
  • Tier is automatically resolved to the appropriate model for the current provider (e.g., "low" → claude-haiku-4-5 for Anthropic, gpt-5-nano for OpenAI)
  • Supports all 8 providers: anthropic, google, openai, deepseek, ollama, azure-openai, amazon-bedrock, google-vertex
  • CLI --model flag takes priority over expert-level tier
  • Delegation respects per-expert tiers (delegates can use different models than the parent)
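The precedence between the CLI flag and the expert-level tier can be sketched as follows; `selectModel` and its parameter names are illustrative, not the PR's actual API:

```typescript
// Hypothetical helper: first defined value wins.
// Precedence: explicit --model > expert's tier-resolved model > provider default.
function selectModel(
  cliModel: string | undefined, // from --model, if passed
  tierModel: string | undefined, // resolved from defaultModelTier, if set
  providerDefault: string, // provider's built-in default
): string {
  return cliModel ?? tierModel ?? providerDefault;
}

// With --model set, the expert's tier is ignored:
selectModel("gpt-5", "gpt-5-nano", "gpt-5.1"); // → "gpt-5"
// Without it, the tier-resolved model applies:
selectModel(undefined, "gpt-5-nano", "gpt-5.1"); // → "gpt-5-nano"
```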

Example usage

[experts."game-producer"]
defaultModelTier = "high"       # claude-opus-4-6 / gpt-5 / gemini-2.5-pro
instruction = "..."
delegates = ["pixel-artist"]

[experts."pixel-artist"]
defaultModelTier = "low"        # claude-haiku-4-5 / gpt-5-nano / gemini-2.5-flash-lite
instruction = "..."

Test plan

  • Unit tests for resolveModelTier() covering all providers and tiers
  • Unit tests for delegation executor verifying tier resolution and fallback
  • Typecheck passes (23/23 packages)
  • All 214 tests pass
  • Lint clean

🤖 Generated with Claude Code

FL4TLiN3 and others added 6 commits February 25, 2026 05:29
…selection

Allows each expert in perstack.toml to specify a model tier ("low", "middle",
"high") instead of a concrete model name. The tier is automatically resolved
to the appropriate model for the current provider (e.g., "low" → claude-haiku-4-5
for Anthropic, gpt-5-nano for OpenAI). CLI --model flag takes priority over tier.
Delegation also respects per-expert tiers.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Instead of a separate modelTierMap, each model in knownModels now has a
tier field. resolveModelTier() finds the first matching model for the
given provider and tier. Cloud-hosted providers fall back to their base
provider (azure-openai→openai, amazon-bedrock→anthropic, google-vertex→google).
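The lookup described in this commit message can be sketched as below; the `knownModels` and `resolveModelTier` names follow the text, but the record shapes and the model subset are assumptions:

```typescript
type Tier = "low" | "middle" | "high";

interface KnownModel {
  id: string;
  provider: string;
  tier: Tier; // tier now lives on the model entry, not a separate map
}

// Illustrative subset; the real table covers all 8 providers.
const knownModels: KnownModel[] = [
  { id: "claude-opus-4-6", provider: "anthropic", tier: "high" },
  { id: "claude-haiku-4-5", provider: "anthropic", tier: "low" },
  { id: "gpt-5", provider: "openai", tier: "high" },
  { id: "gpt-5-nano", provider: "openai", tier: "low" },
  { id: "gemini-2.5-pro", provider: "google", tier: "high" },
];

// Cloud-hosted providers fall back to their base provider's models.
const baseProvider: Record<string, string> = {
  "azure-openai": "openai",
  "amazon-bedrock": "anthropic",
  "google-vertex": "google",
};

function resolveModelTier(provider: string, tier: Tier): string | undefined {
  const base = baseProvider[provider] ?? provider;
  // First matching model wins, so ordering in knownModels sets the default.
  return knownModels.find((m) => m.provider === base && m.tier === tier)?.id;
}
```

Returning `undefined` for an unknown provider/tier pair is a sketch-level choice; the real resolver may fall back or throw instead.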

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
- gpt-5.2: high → middle (Instant variant, mid-tier)
- gpt-5.1: high → middle (mid-tier, gpt-4.1 successor)
- Reorder OpenAI models to put gpt-5.2-pro first (latest high-tier default)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Replace hardcoded model names with defaultModelTier across all e2e TOML
configs and remove CLI --model injection from test infrastructure. Model
selection is now handled by perstack.toml tier resolution.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
@FL4TLiN3 FL4TLiN3 force-pushed the feat/default-model-tier branch from aa67ba3 to e3b2b54 Compare February 25, 2026 05:30
@FL4TLiN3 FL4TLiN3 enabled auto-merge (squash) February 25, 2026 06:27
@FL4TLiN3 FL4TLiN3 merged commit bf1b08a into main Feb 25, 2026
11 checks passed
@FL4TLiN3 FL4TLiN3 mentioned this pull request Feb 25, 2026
@FL4TLiN3 FL4TLiN3 deleted the feat/default-model-tier branch February 25, 2026 13:37