Conversation

@82deutschmark

This PR adds direct OpenAI Responses API support for modern reasoning models.

Key features:

  • OpenAIProvider class - Direct OpenAI provider using the /v1/responses endpoint (a rough sketch follows this list)
  • Intelligent routing - GPT-5 and o-series models (o3, o4) are automatically routed directly to OpenAI
  • Reasoning defaults - Automatically sets reasoning.effort='medium' and reasoning.summary='detailed' for reasoning models
  • Model name normalization - Handles 'openai/gpt-4o' style namespaces gracefully
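
A minimal sketch of what such a provider could look like, based only on the features listed above. The name OpenAIProvider and the reasoning defaults come from this description; the helper names, prefix list, and request details are illustrative assumptions, not the PR's actual code.

```python
import os
import requests

# Assumed prefixes for models that should get reasoning defaults.
REASONING_MODEL_PREFIXES = ("gpt-5", "o3", "o4")


def normalize_model_name(model: str) -> str:
    """Strip an 'openai/' namespace, e.g. 'openai/gpt-4o' -> 'gpt-4o'."""
    return model.split("/", 1)[1] if model.startswith("openai/") else model


def is_reasoning_model(model: str) -> bool:
    """True for GPT-5 and o-series models that need reasoning defaults."""
    return normalize_model_name(model).startswith(REASONING_MODEL_PREFIXES)


class OpenAIProvider:
    """Direct-to-OpenAI provider targeting the /v1/responses endpoint."""

    BASE_URL = "https://api.openai.com/v1/responses"

    def __init__(self, api_key: str | None = None):
        self.api_key = api_key or os.environ["OPENAI_API_KEY"]

    def create_response(self, model: str, input_text: str, **kwargs) -> dict:
        model = normalize_model_name(model)
        payload = {"model": model, "input": input_text, **kwargs}
        if is_reasoning_model(model):
            # Reasoning defaults named in the PR description.
            payload.setdefault("reasoning", {"effort": "medium", "summary": "detailed"})
        resp = requests.post(
            self.BASE_URL,
            headers={"Authorization": f"Bearer {self.api_key}"},
            json=payload,
            timeout=120,
        )
        resp.raise_for_status()
        return resp.json()
```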

Why this matters:
OpenRouter's Responses API proxy returns empty output for GPT-5 and o-series models. This PR ensures these models work correctly by routing them directly to OpenAI when OPENAI_API_KEY is available.
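
For illustration, the routing decision could look like the snippet below. It builds on the sketch above; OpenRouterProvider stands in for whatever fallback provider the codebase already uses and is purely hypothetical here.

```python
def select_provider(model: str):
    """Route GPT-5 / o-series models directly to OpenAI when a key is set."""
    if is_reasoning_model(model) and os.environ.get("OPENAI_API_KEY"):
        return OpenAIProvider()
    # Fall back to the existing proxy path (OpenRouterProvider is assumed).
    return OpenRouterProvider()
```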

Battle-tested in production.
