feat: add Astrai as intelligent LLM router provider #30
beee003 wants to merge 1 commit into ygwyg:main
Conversation
Adds a 4th LLM provider mode ("astrai") that routes requests through
the Astrai inference router for automatic model/provider selection
based on cost, latency, and task complexity.
Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: b29234e290
```typescript
  });
}

case "astrai": {
```
Wire astrai through agent config validation

The new `astrai` branch in `createLLMProvider` is not reachable from the harness configuration flow: `handleUpdateConfig` validates updates with `safeValidateAgentConfig` (`src/durable-objects/mahoraga-harness.ts:1193`), but `AgentConfigSchema` still only allows `"openai-raw" | "ai-sdk" | "cloudflare-gateway"` (`src/schemas/agent-config.ts:28`). Because `initializeLLM` prefers `state.config.llm_provider` over env, users cannot persist `astrai` via the existing config API, so the new provider mode is effectively unavailable in that workflow.
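A minimal sketch of the missing piece, assuming the provider mode is validated against a string union (the real `AgentConfigSchema` in `src/schemas/agent-config.ts` may use a schema library; the names below are illustrative):

```typescript
// Hypothetical: the allowed provider modes, now including "astrai".
// The actual schema in src/schemas/agent-config.ts may differ in shape.
const LLM_PROVIDERS = ["openai-raw", "ai-sdk", "cloudflare-gateway", "astrai"] as const;

type LLMProvider = (typeof LLM_PROVIDERS)[number];

// Narrowing guard a validator could use before persisting config.
function isLLMProvider(value: string): value is LLMProvider {
  return (LLM_PROVIDERS as readonly string[]).includes(value);
}
```

Until `"astrai"` is added to whatever enum backs `AgentConfigSchema`, a config update with `llm_provider: "astrai"` will fail validation before it ever reaches the factory.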
Summary
- Adds a new provider mode (`LLM_PROVIDER=astrai`) alongside `openai-raw`, `ai-sdk`, and `cloudflare-gateway`
- Astrai exposes an OpenAI-compatible endpoint (`/v1/chat/completions`), so it drops in cleanly alongside the existing providers

Why this is useful for MAHORAGA
MAHORAGA already supports 5+ LLM providers, but users still have to manually pick one. Astrai adds intelligent routing: set `LLM_MODEL=auto` and it picks the best model per request. For a trading agent that makes hundreds of LLM calls daily (research, classification, analysis), this can significantly reduce costs by routing simple sentiment classification to cheap models (Groq/Llama) while escalating complex analysis to frontier models (GPT-4o/Claude).

Three routing strategies:

- `balanced`
- `cheapest`
- `fastest`

Changes
- `src/providers/llm/astrai.ts`: new Astrai provider (OpenAI-compatible `/v1/chat/completions`)
- `src/providers/llm/factory.ts`: add `"astrai"` case to the provider factory + `isLLMConfigured`
- `src/providers/llm/index.ts`: export `AstraiProvider`
- `src/env.d.ts`: add `ASTRAI_API_KEY`, `ASTRAI_STRATEGY` env vars
- `README.md`

Usage
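The Usage section appears truncated in this page; a hypothetical configuration sketch based only on the env vars this PR introduces (values are placeholders):

```shell
# Route all LLM calls through Astrai (env vars added by this PR)
LLM_PROVIDER=astrai
LLM_MODEL=auto            # let the router pick a model per request
ASTRAI_API_KEY=your-key-here
ASTRAI_STRATEGY=balanced  # or: cheapest, fastest
```

The default strategy when `ASTRAI_STRATEGY` is unset is not stated in this page; `balanced` is shown here only as an example value.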
Test plan

- Existing providers (`openai-raw`, `ai-sdk`, `cloudflare-gateway`) still work unchanged
- `LLM_PROVIDER=astrai` with a valid API key
- `LLM_MODEL=auto` for automatic model selection
- Behavior when `ASTRAI_API_KEY` is not set (should return `null` gracefully)
- Tests pass (`npm test`)

🤖 Generated with Claude Code
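The graceful-null behavior in the test plan can be sketched as follows (hypothetical shapes; the real factory branch in `src/providers/llm/factory.ts` may differ):

```typescript
// Hypothetical env shape covering only the two vars this PR adds.
interface Env {
  ASTRAI_API_KEY?: string;
  ASTRAI_STRATEGY?: "balanced" | "cheapest" | "fastest";
}

// Sketch of the "astrai" factory branch: return null when the key is
// missing instead of throwing, matching the test plan above.
function createAstraiProvider(env: Env) {
  if (!env.ASTRAI_API_KEY) return null;
  return {
    name: "astrai" as const,
    strategy: env.ASTRAI_STRATEGY ?? "balanced", // default is an assumption
  };
}
```

Returning `null` rather than throwing lets `isLLMConfigured` and callers fall back to another provider mode without a try/catch.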