feat: add `defaultModelTier` for per-expert provider-aware model selection (#623)
Merged
Allows each expert in `perstack.toml` to specify a model tier (`"low"`, `"middle"`, or `"high"`) instead of a concrete model name. The tier is automatically resolved to the appropriate model for the current provider (e.g., `"low"` → `claude-haiku-4-5` for Anthropic, `gpt-5-nano` for OpenAI). The CLI `--model` flag takes priority over the tier. Delegation also respects per-expert tiers.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Instead of a separate `modelTierMap`, each model in `knownModels` now has a `tier` field. `resolveModelTier()` finds the first matching model for the given provider and tier. Cloud-hosted providers fall back to their base provider (azure-openai → openai, amazon-bedrock → anthropic, google-vertex → google).

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
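The commit message above describes the lookup but not the code. A minimal sketch of how it might work, assuming the `knownModels` and `resolveModelTier` names from the commit (the model entries and tier assignments below are illustrative, not the repo's actual list):

```typescript
type Tier = "low" | "middle" | "high";

interface KnownModel {
  provider: string;
  name: string;
  tier: Tier;
}

// Illustrative subset; the real knownModels list lives in the repo.
// Order matters: the first match for a (provider, tier) pair wins,
// so the preferred default for each tier is listed first.
const knownModels: KnownModel[] = [
  { provider: "anthropic", name: "claude-haiku-4-5", tier: "low" },
  { provider: "openai", name: "gpt-5.2-pro", tier: "high" },
  { provider: "openai", name: "gpt-5-nano", tier: "low" },
];

// Cloud-hosted providers fall back to their base provider.
const baseProvider: Record<string, string> = {
  "azure-openai": "openai",
  "amazon-bedrock": "anthropic",
  "google-vertex": "google",
};

function resolveModelTier(provider: string, tier: Tier): string | undefined {
  const effective = baseProvider[provider] ?? provider;
  return knownModels.find(
    (m) => m.provider === effective && m.tier === tier,
  )?.name;
}
```

Keeping the tier on each model entry, rather than in a parallel map, means adding a model and classifying it is a single edit, and the fallback table is the only extra indirection.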
- gpt-5.2: high → middle (Instant variant, mid-tier)
- gpt-5.1: high → middle (mid-tier, gpt-4.1 successor)
- Reorder OpenAI models to put gpt-5.2-pro first (latest high-tier default)

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Replace hardcoded model names with `defaultModelTier` across all e2e TOML configs and remove CLI `--model` injection from test infrastructure. Model selection is now handled by `perstack.toml` tier resolution.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Summary
- Adds a `defaultModelTier` field (`"low"`/`"middle"`/`"high"`) to expert config in `perstack.toml`
- Tiers resolve to provider-specific models (e.g., `"low"` → `claude-haiku-4-5` for Anthropic, `gpt-5-nano` for OpenAI)
- CLI `--model` flag takes priority over expert-level tier

Example usage
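The example block from the original PR body did not survive this scrape. A plausible `perstack.toml` snippet, assuming a hypothetical `[experts.*]` table layout and expert names (only the `defaultModelTier` field itself is confirmed by the PR):

```toml
# Hypothetical experts; only defaultModelTier is from the PR description.
[experts.triage]
defaultModelTier = "low"    # e.g., claude-haiku-4-5 on Anthropic, gpt-5-nano on OpenAI

[experts.architect]
defaultModelTier = "high"   # resolved per provider; CLI --model still overrides
```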
Test plan
- Unit tests for `resolveModelTier()` covering all providers and tiers

🤖 Generated with Claude Code