fix: prevent duplicate listener clash on port 12001 and schema validation #790
base: main
```diff
@@ -172,14 +172,7 @@ properties:
         type: string
       provider_interface:
         type: string
-        enum:
-          - arch
-          - claude
-          - deepseek
-          - groq
-          - mistral
-          - openai
-          - gemini
+        description: "The provider interface/name. For supported providers (openai, anthropic, gemini, etc.), specify as part of model name (e.g., 'openai/gpt-4'). For custom providers, provide the interface name here."
```
Contributor

I can appreciate this change, but I think for all first-class providers we support, we write out their upstream endpoints based on this enum. So if the developer specifies a first-class provider that we don't support today, we would break because we wouldn't know which endpoint/cluster to send traffic to. We also support developers supplying custom providers, and that shouldn't conflict with this list. Although I think you are correct in stating that the provider_interface field is probably misleading: it's only really used for custom model providers and not for first-class providers at all. I think there is a fix for this, but it's not quite stripping out the enum changes. I'll have to think about that a bit more.
```diff
       routing_preferences:
         type: array
         items:
```
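The distinction the reviewer draws above, first-class providers resolved from the model-name prefix versus custom providers named via `provider_interface`, can be sketched as a config fragment. This is an illustrative assumption only: the model names and surrounding structure are hypothetical, not taken from the project.

```yaml
# First-class provider (hypothetical model name): the interface is
# inferred from the "openai/" prefix, so no provider_interface is set.
- model: openai/gpt-4

# Custom provider (hypothetical): the gateway cannot infer the wire
# protocol from the name, so provider_interface declares it explicitly,
# here assumed to be an OpenAI-compatible API.
- model: my-internal-llm
  provider_interface: openai
```

Under this reading, the enum would constrain only the custom-provider case, which is why removing it entirely risks breaking endpoint resolution for first-class providers.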
```diff
@@ -219,14 +212,7 @@ properties:
         type: string
       provider_interface:
         type: string
-        enum:
-          - arch
-          - claude
-          - deepseek
-          - groq
-          - mistral
-          - openai
-          - gemini
+        description: "The provider interface/name. For supported providers (openai, anthropic, gemini, etc.), specify as part of model name (e.g., 'openai/gpt-4'). For custom providers, provide the interface name here."
       routing_preferences:
         type: array
         items:
```
Technically, port 12001 is an internal port where we plug in all the WASM filters needed to process, mutate, and handle LLM traffic. The user's port defaults to 12000, and the config allows you to update that. Because 12001 is a reserved port, there is no easy fix except to expose this port to the user via the config, or to pick a port that is highly unlikely to conflict with a local dev environment.
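As a local workaround while the internal port stays reserved, a developer can check whether anything on the machine is already bound to 12001 before starting the gateway. This is a hypothetical helper using only the standard library, not part of the project:

```python
import socket

def port_in_use(port: int, host: str = "127.0.0.1") -> bool:
    """Return True if something is already listening on host:port."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as probe:
        # connect_ex returns 0 when the TCP connection succeeds,
        # i.e. when a listener is already bound to that port.
        return probe.connect_ex((host, port)) == 0

if port_in_use(12001):
    print("port 12001 is already taken; expect the internal listener to clash")
```

Exposing the internal port in the config, as the comment suggests, would make this kind of pre-flight check unnecessary.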