Merged
1 change: 0 additions & 1 deletion README.md

@@ -146,7 +146,6 @@ import { run } from "@perstack/runtime"

 const checkpoint = await run({
   setting: {
-    model: "claude-sonnet-4-5",
     providerConfig: { providerName: "anthropic", apiKey: env.ANTHROPIC_API_KEY },
     expertKey: "my-expert",
     input: { text: query },
2 changes: 0 additions & 2 deletions docs/getting-started/walkthrough.mdx

@@ -276,7 +276,6 @@ import { run } from "@perstack/runtime"

 const checkpoint = await run({
   setting: {
-    model: "claude-sonnet-4-5-20250929",
     providerConfig: { providerName: "anthropic" },
     expertKey: "fitness-assistant",
     input: { text: "Start today's session" },
@@ -291,7 +290,6 @@ import { run } from "@perstack/runtime"

 const checkpoint = await run({
   setting: {
-    model: "claude-sonnet-4-5-20250929",
     providerConfig: { providerName: "anthropic" },
     expertKey: "fitness-assistant",
     input: { text: "Start today's session" },
1 change: 0 additions & 1 deletion docs/operating-experts/deployment.md

@@ -92,7 +92,6 @@ export default {
   await run(
     {
       setting: {
-        model: "claude-sonnet-4-5",
         providerConfig: { providerName: "anthropic", apiKey: env.ANTHROPIC_API_KEY },
         expertKey,
         input: { text: query },
2 changes: 1 addition & 1 deletion docs/references/cli.md

@@ -74,7 +74,7 @@ Both `start` and `run` accept the following options:
 | Option | Default | Description |
 | ----------------------------- | ----------- | --------------------------------------------------------------------- |
 | `--provider <provider>` | `anthropic` | LLM provider |
-| `--model <model>` | `claude-sonnet-4-5` | Model name |
+| `--model <model>` | `auto` | Model name (auto-resolved from expert tier or provider's middle tier) |
 | `--reasoning-budget <budget>` | - | Reasoning budget (`minimal`, `low`, `medium`, `high`, or token count) |

 Providers: `anthropic`, `google`, `openai`, `deepseek`, `ollama`, `azure-openai`, `amazon-bedrock`, `google-vertex`
8 changes: 4 additions & 4 deletions docs/references/providers-and-models.md

@@ -8,12 +8,12 @@ Perstack supports multiple LLM providers. Configure via CLI options, environment

 ## Default Model

-Perstack uses `claude-sonnet-4-5` as the default model, selected based on:
+When no model is specified, Perstack resolves it automatically using a tier-based system:

-- **Standard tier pricing** — not a flagship model, economically sustainable for extended runs
-- **High agentic performance** — demonstrated tool use capability in benchmarks like [𝜏²-Bench](https://artificialanalysis.ai/evaluations/tau2-bench)
+1. **Expert's `defaultModelTier`** — if the expert definition sets a `defaultModelTier` (e.g. `high`, `middle`, `low`), the runtime picks the corresponding model from the provider
+2. **Provider's middle tier** — if no tier is set, falls back to the provider's "middle" tier (e.g. `claude-sonnet-4-5` for Anthropic)

-The default balances cost efficiency with reliable agent behavior. As new models are released, the default may change based on these criteria.
+This ensures cost-efficient defaults while letting experts request more capable models when needed. You can always override the resolved model with `--model` via the CLI or `model =` in `perstack.toml`.

 To override the default, specify in `perstack.toml`:
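The tier-based resolution this PR documents can be sketched as follows. This is a hypothetical illustration, not the actual Perstack runtime code: the `resolveModel` function, the shape of the tier table, and the `high`/`low` model names are assumptions; only `claude-sonnet-4-5` as Anthropic's middle tier is confirmed by the docs above.

```typescript
type Tier = "high" | "middle" | "low";

// Assumed per-provider tier table. Only the Anthropic middle tier
// (claude-sonnet-4-5) comes from the docs; the other entries are
// placeholders for illustration.
const tierModels: Record<string, Record<Tier, string>> = {
  anthropic: {
    high: "example-high-tier-model",   // illustrative
    middle: "claude-sonnet-4-5",       // documented default
    low: "example-low-tier-model",     // illustrative
  },
};

function resolveModel(
  providerName: string,
  explicitModel?: string, // from --model or `model =` in perstack.toml
  expertTier?: Tier,      // from the expert's defaultModelTier
): string {
  // An explicit model always wins over tier resolution.
  if (explicitModel) return explicitModel;
  // Otherwise use the expert's tier, falling back to the provider's middle tier.
  const tier = expertTier ?? "middle";
  return tierModels[providerName][tier];
}
```

Under these assumptions, `resolveModel("anthropic")` yields `claude-sonnet-4-5`, matching the `auto` default described in the CLI reference.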
1 change: 0 additions & 1 deletion packages/runtime/README.md

@@ -18,7 +18,6 @@ import { run } from "@perstack/runtime"

 const checkpoint = await run(
   {
     setting: {
-      model: "claude-sonnet-4-5",
       providerConfig: { providerName: "anthropic", apiKey: "..." },
       jobId: "job-123",
       runId: "run-123",