
feat: add Astrai as intelligent LLM router provider#30

Open
beee003 wants to merge 1 commit into ygwyg:main from beee003:feat/astrai-router-provider

Conversation


@beee003 beee003 commented Feb 19, 2026

Summary

  • Adds Astrai as a 4th LLM provider mode (LLM_PROVIDER=astrai) alongside openai-raw, ai-sdk, and cloudflare-gateway
  • Astrai is an AI inference router that automatically selects the optimal model/provider for each request based on cost, latency, and task complexity
  • Instead of locking into a single provider, Astrai routes across OpenAI, Anthropic, Google, Groq, DeepInfra, and more — finding the cheapest equivalent that meets quality requirements
  • OpenAI-compatible API (/v1/chat/completions), so it drops in cleanly alongside the existing providers
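Because the API is OpenAI-compatible, a call to the router looks like any other `/v1/chat/completions` request. A minimal sketch in TypeScript — the base URL `api.astrai.example` and the exact response shape are illustrative assumptions, not confirmed Astrai API details:

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Build the OpenAI-style request body; "auto" defers model choice to the router.
function buildChatRequest(messages: ChatMessage[], model = "auto") {
  return { model, messages };
}

// Hypothetical base URL -- substitute the real Astrai endpoint.
const ASTRAI_BASE_URL = "https://api.astrai.example";

async function chatCompletion(
  apiKey: string,
  messages: ChatMessage[],
): Promise<string> {
  const res = await fetch(`${ASTRAI_BASE_URL}/v1/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildChatRequest(messages)),
  });
  if (!res.ok) throw new Error(`Astrai request failed: ${res.status}`);
  const data = (await res.json()) as {
    choices: { message: { content: string } }[];
  };
  return data.choices[0].message.content;
}
```

Since the request body is identical to OpenAI's, the existing `openai-raw` plumbing can be reused with only the base URL and key swapped.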

Why this is useful for MAHORAGA

MAHORAGA already supports 5+ LLM providers, but users still have to manually pick one. Astrai adds intelligent routing — set LLM_MODEL=auto and it picks the best model per request. For a trading agent that makes hundreds of LLM calls daily (research, classification, analysis), this can significantly reduce costs by routing simple sentiment classification to cheap models (Groq/Llama) while escalating complex analysis to frontier models (GPT-4o/Claude).

Three routing strategies:

| Strategy | Description |
| --- | --- |
| `balanced` | Balance cost and quality (default) |
| `cheapest` | Minimize cost while maintaining quality |
| `fastest` | Minimize latency |
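Validating the strategy value is a one-liner. A dependency-free sketch, assuming only the three strategies in the table above, with `balanced` as the stated default:

```typescript
type AstraiStrategy = "balanced" | "cheapest" | "fastest";

const STRATEGIES: readonly AstraiStrategy[] = ["balanced", "cheapest", "fastest"];

// Fall back to the documented default when ASTRAI_STRATEGY is unset or invalid.
function resolveStrategy(raw: string | undefined): AstraiStrategy {
  return raw !== undefined && (STRATEGIES as readonly string[]).includes(raw)
    ? (raw as AstraiStrategy)
    : "balanced";
}
```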

Changes

| File | Change |
| --- | --- |
| `src/providers/llm/astrai.ts` | New provider: OpenAI-compatible client for Astrai's `/v1/chat/completions` |
| `src/providers/llm/factory.ts` | Add `"astrai"` case to provider factory and `isLLMConfigured` |
| `src/providers/llm/index.ts` | Re-export `AstraiProvider` |
| `src/env.d.ts` | Add `ASTRAI_API_KEY`, `ASTRAI_STRATEGY` env vars |
| `README.md` | Document the new provider mode with config examples |
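The factory change can be pictured as below. This is a sketch under assumptions: the `AstraiProvider` constructor signature is illustrative, and the real `createLLMProvider` in `src/providers/llm/factory.ts` also handles the other three modes:

```typescript
// Illustrative stand-in for the provider class in src/providers/llm/astrai.ts.
class AstraiProvider {
  constructor(
    readonly apiKey: string,
    readonly strategy = "balanced",
  ) {}
}

interface ProviderEnv {
  LLM_PROVIDER?: string;
  ASTRAI_API_KEY?: string;
  ASTRAI_STRATEGY?: string;
}

// Returns null when the mode is selected but not configured, matching the
// graceful-fallback behavior called out in the test plan.
function createLLMProvider(env: ProviderEnv): AstraiProvider | null {
  switch (env.LLM_PROVIDER) {
    case "astrai":
      return env.ASTRAI_API_KEY
        ? new AstraiProvider(env.ASTRAI_API_KEY, env.ASTRAI_STRATEGY)
        : null;
    default:
      return null; // openai-raw / ai-sdk / cloudflare-gateway elided in this sketch
  }
}
```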

Usage

```shell
npx wrangler secret put LLM_PROVIDER     # "astrai"
npx wrangler secret put LLM_MODEL        # "auto" or specific model
npx wrangler secret put ASTRAI_API_KEY   # sk-astrai-...
npx wrangler secret put ASTRAI_STRATEGY  # "balanced" (default), "cheapest", or "fastest"
```

Test plan

  • Verify existing providers (openai-raw, ai-sdk, cloudflare-gateway) still work unchanged
  • Test LLM_PROVIDER=astrai with a valid API key
  • Test LLM_MODEL=auto for automatic model selection
  • Test fallback when ASTRAI_API_KEY is not set (should return null gracefully)
  • Run existing test suite (npm test)

🤖 Generated with Claude Code

Adds a 4th LLM provider mode ("astrai") that routes requests through
the Astrai inference router for automatic model/provider selection
based on cost, latency, and task complexity.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>

@greptile-apps greptile-apps Bot left a comment


Your free trial has ended. If you'd like to continue receiving code reviews, you can add a payment method here.


@chatgpt-codex-connector chatgpt-codex-connector Bot left a comment


💡 Codex Review

Here are some automated review suggestions for this pull request.

Reviewed commit: b29234e290


```typescript
  });
}

case "astrai": {
```

P1: Wire `astrai` through agent config validation

The new `astrai` branch in `createLLMProvider` is not reachable from the harness configuration flow: `handleUpdateConfig` validates updates with `safeValidateAgentConfig` (`src/durable-objects/mahoraga-harness.ts:1193`), but `AgentConfigSchema` still only allows `"openai-raw" | "ai-sdk" | "cloudflare-gateway"` (`src/schemas/agent-config.ts:28`). Because `initializeLLM` prefers `state.config.llm_provider` over `env`, users cannot persist `astrai` via the existing config API, so the new provider mode is effectively unavailable in that workflow.
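The fix this review points at is small: add `"astrai"` to the provider list that the config validator accepts. A dependency-free sketch of the idea — the real `AgentConfigSchema` in `src/schemas/agent-config.ts` uses its own validation machinery, and `isAllowedProvider` is a hypothetical helper name:

```typescript
// All provider modes the config API should accept, including the new one.
const ALLOWED_LLM_PROVIDERS = [
  "openai-raw",
  "ai-sdk",
  "cloudflare-gateway",
  "astrai",
] as const;

type LLMProviderMode = (typeof ALLOWED_LLM_PROVIDERS)[number];

// Narrowing check a config validator could apply before persisting updates,
// so llm_provider = "astrai" survives the safeValidateAgentConfig step.
function isAllowedProvider(value: string): value is LLMProviderMode {
  return (ALLOWED_LLM_PROVIDERS as readonly string[]).includes(value);
}
```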


