Copilot Proxy: A lightweight proxy that uses the GitHub Copilot CLI SDK to provide an OpenAI-compatible API endpoint on the local host. #218
rezrov started this conversation in Show and tell
https://github.com/rezrov/copilot-proxy
I put together this short script (roughly 270 lines of Node.js code) that uses this SDK to provide an OpenAI-compatible API endpoint to local processes. My organization currently does not allow direct access to LLM APIs, so this script is a convenient way to use tools like Fabric. The proxy does not require authentication, so the port is bound to the loopback interface only; exposing it to remote connections would be unwise. I've only been using it for about a day so far, but at this point it seems stable and performant enough for light local use.
Right now, the list of available models is hardcoded. I'd like the script to generate the list of available models dynamically, but I didn't see any examples showing how to do this via the SDK. If anyone has any suggestions on how to accomplish this, I'd like to make that change.
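For context, "hardcoded" here means serving a static OpenAI-style `/v1/models` payload. A sketch of that shape (the model IDs below are illustrative, not the repo's actual list):

```javascript
// Hardcoded model list served at /v1/models in the OpenAI format.
// The IDs are illustrative examples, not the repo's actual list;
// the goal would be to populate MODELS from the SDK at startup.
const MODELS = ["gpt-4o", "gpt-4o-mini"];

function listModels() {
  return {
    object: "list",
    data: MODELS.map((id) => ({
      id,
      object: "model",
      owned_by: "github-copilot",
    })),
  };
}
```

Swapping the static `MODELS` array for a dynamic query at startup would be the only change needed once the SDK call is known.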