Question about minimal API codebase change: Making openAIAdapterStream protected (currently private) #9145
Closed
dixoxib started this conversation in Models + Providers
Replies: 0 comments
File in question: core/llm/index.ts

Change: `private openAIAdapterStream` → `protected openAIAdapterStream`

Reason: Need access to the raw API `finish_reason` for full model feature integration. Currently there is no way to access `finish_reason` from OpenAI-compatible adapters at the LLM implementation level. The cleanest solution for provider-specific streaming logic seems to be overriding `openAIAdapterStream`.

Any objections to this visibility change? If so, what's the preferred approach for implementing provider-specific streaming logic that needs raw API response data?
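To illustrate the intent, here is a minimal sketch of what such an override could look like. Note that `BaseLLM`, the `openAIAdapterStream` signature, and the `RawChunk` type below are hypothetical stand-ins for the code in core/llm/index.ts, not the actual Continue API:

```typescript
// Sketch only: BaseLLM, the openAIAdapterStream signature, and RawChunk are
// assumptions for illustration, not the real types in core/llm/index.ts.

interface RawChunk {
  choices: Array<{
    delta?: { content?: string };
    finish_reason?: string | null;
  }>;
}

class BaseLLM {
  // With `protected` (instead of `private`), subclasses can override this.
  protected async *openAIAdapterStream(
    body: unknown,
    signal: AbortSignal,
  ): AsyncGenerator<RawChunk> {
    // Stand-in for the base implementation that streams chunks from an
    // OpenAI-compatible endpoint.
    yield { choices: [{ delta: { content: "hello" }, finish_reason: "stop" }] };
  }
}

class MyProvider extends BaseLLM {
  protected override async *openAIAdapterStream(
    body: unknown,
    signal: AbortSignal,
  ): AsyncGenerator<RawChunk> {
    for await (const chunk of super.openAIAdapterStream(body, signal)) {
      // The raw finish_reason is visible here, before any normalization
      // strips it from the stream.
      const finishReason = chunk.choices[0]?.finish_reason;
      if (finishReason != null) {
        // Provider-specific handling, e.g. reacting to "tool_calls" or "length".
      }
      yield chunk;
    }
  }
}
```

With the visibility change, a provider subclass could inspect each raw chunk before yielding it, rather than duplicating the adapter's streaming logic.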