NanoProxy is a local OpenAI-compatible bridge for NanoGPT. It makes tool-enabled clients work more reliably by rewriting tool-enabled requests into a stricter upstream bridge protocol, then translating the model output back into standard OpenAI-style content, reasoning, and tool_calls for the client.
It supports both:
- an OpenCode plugin
- a standalone local server for OpenAI-compatible tools such as Roo Code, Kilo Code, Zed, Cline-style clients, and similar editors or agents
By default NanoProxy uses an object bridge that asks the upstream model to emit one structured JSON turn object inside normal content. NanoProxy incrementally parses that object and converts it back into native client fields. An XML bridge is also available as an alternative protocol when needed.
It also supports optional native-first fallback for selected models through BRIDGE_MODELS.
For tool-enabled requests, NanoProxy:
- rewrites the upstream request into the selected bridge protocol
- preserves streaming where possible for reasoning, visible content, and tool calls
- incrementally parses the bridged model output
- converts it back into normal OpenAI-style response fields
- retries once for the specific invalid-empty bridged-turn case
Requests without tools pass through normally.
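The routing rule above can be sketched as a small predicate. The names and request shape here are illustrative, not NanoProxy's actual internals:

```javascript
// Illustrative sketch of the routing rule: only requests that declare
// tools are rewritten into the bridge protocol; everything else passes
// through unchanged. `route` and `isToolEnabled` are hypothetical names.
function isToolEnabled(request) {
  return Array.isArray(request.tools) && request.tools.length > 0;
}

function route(request) {
  return isToolEnabled(request) ? 'bridge' : 'passthrough';
}
```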
NanoProxy supports two bridge protocols for tool-enabled requests. The object bridge is the current default.
NanoProxy asks the model to return one JSON turn object shaped like this:

```json
{
  "v": 1,
  "mode": "tool",
  "message": "I will inspect the relevant files now.",
  "tool_calls": [
    {
      "name": "read",
      "arguments": {
        "path": "src/index.js"
      }
    }
  ]
}
```

Field meaning:
- `v`: protocol version
- `mode`: `tool`, `final`, or `clarify`
- `message`: user-visible assistant text
- `tool_calls`: tool requests when `mode` is `tool`
When the provider exposes reasoning separately, NanoProxy passes that through separately as reasoning content.
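The conversion back into native client fields can be sketched as follows. The function name and id scheme are illustrative, not NanoProxy's actual API:

```javascript
// Hypothetical sketch of converting one bridged turn object back into an
// OpenAI-style assistant message, following the protocol fields above.
function turnToMessage(turn) {
  if (turn.v !== 1) throw new Error(`unsupported bridge version: ${turn.v}`);
  const message = { role: 'assistant', content: turn.message ?? '' };
  if (turn.mode === 'tool') {
    message.tool_calls = (turn.tool_calls ?? []).map((call, i) => ({
      id: `call_${i}`, // a real implementation would generate unique ids
      type: 'function',
      function: {
        name: call.name,
        arguments: JSON.stringify(call.arguments ?? {}),
      },
    }));
  }
  return message;
}
```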
The XML bridge is also available as an alternative protocol:

```powershell
$env:BRIDGE_PROTOCOL = "xml"
node server.js
```

In XML mode, NanoProxy asks the model to emit tool actions in a narrow XML format inside normal content.
Example:

```xml
<open>I will inspect the relevant files now.</open>
<read>
  <path>src/index.js</path>
</read>
```

How it works:
- `<open>...</open>` carries visible user-facing text
- tool calls are emitted as XML tags named after the tool
- tool arguments are passed as child tags inside the tool tag
- NanoProxy parses those XML tool calls incrementally and converts them back into standard OpenAI-style `tool_calls`
- when batching is allowed, multiple tool tags can appear in the same response
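The mapping above can be sketched with a simple non-incremental parser. This is illustrative only: the real parser works incrementally on a stream, and the function name is hypothetical:

```javascript
// Illustrative sketch of the XML bridge mapping: <open> carries visible
// text; any other top-level tag is a tool call whose child tags become
// the arguments. Nested structures are out of scope for this sketch.
function parseXmlBridge(text) {
  const calls = [];
  let visible = '';
  const tagRe = /<(\w+)>([\s\S]*?)<\/\1>/g;
  for (const [, tag, body] of text.matchAll(tagRe)) {
    if (tag === 'open') {
      visible += body;
      continue;
    }
    const args = {};
    for (const [, arg, value] of body.matchAll(/<(\w+)>([\s\S]*?)<\/\1>/g)) {
      args[arg] = value.trim();
    }
    calls.push({ name: tag, arguments: args });
  }
  return { visible, calls };
}
```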
BRIDGE_PROTOCOL selects which bridge protocol NanoProxy uses after it decides to bridge a tool-enabled request.
- not set: use `object`
- `object`: use the object bridge
- `xml`: use the XML bridge
Examples:
```powershell
$env:BRIDGE_PROTOCOL = "object"
opencode
```

```powershell
$env:BRIDGE_PROTOCOL = "xml"
node server.js
```

```sh
BRIDGE_PROTOCOL=object node server.js
```

BRIDGE_MODELS decides which models bridge immediately and which models try native mode first.
- not set: all tool-enabled requests bridge immediately
- set to an empty string: all tool-enabled requests try native-first, then fall back to the selected bridge protocol if needed
- set to a comma-separated list: matching models bridge immediately, other tool-enabled requests use native-first
Examples:
```powershell
$env:BRIDGE_MODELS = ""
node server.js
```

```powershell
$env:BRIDGE_MODELS = "glm-5,kimi-k2.5"
opencode
```

```sh
BRIDGE_MODELS="glm-5,kimi-k2.5" node server.js
```

Matching is substring-based against the model id.
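The three BRIDGE_MODELS cases above can be sketched as one small function. The environment-variable parsing here is illustrative; NanoProxy's internals may differ:

```javascript
// Sketch of the BRIDGE_MODELS selection rules:
//   undefined       -> always bridge immediately
//   empty string    -> everything tries native-first
//   comma list      -> substring match against the model id
function bridgeImmediately(modelId, bridgeModelsEnv /* string | undefined */) {
  if (bridgeModelsEnv === undefined) return true; // unset: bridge immediately
  const patterns = bridgeModelsEnv
    .split(',')
    .map((s) => s.trim())
    .filter(Boolean); // empty string yields an empty pattern list
  return patterns.some((p) => modelId.includes(p));
}
```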
Configure OpenCode to load the plugin:
```json
{
  "plugin": [
    "file:///path/to/NanoProxy/src/plugin.mjs"
  ]
}
```

Windows example:

```json
{
  "plugin": [
    "file:///C:/Users/you/path/to/NanoProxy/src/plugin.mjs"
  ]
}
```

Then restart OpenCode.
Plugin logging is off by default.
Enable the structured session log for one run:
```powershell
$env:NANOPROXY_DEBUG = "1"
opencode
```

Enable raw request and response artifacts too:

```powershell
$env:NANOPROXY_DEBUG = "1"
$env:NANOPROXY_RAW_LOGS = "1"
opencode
```

Optional override:
- `NANOPROXY_LOG_DIR` for the plugin log directory

Default plugin log locations:
- session logs: system temp under `nanoproxy-plugin-logs`
- raw artifacts: system temp under `nanoproxy-plugin-logs/raw` when raw logging is enabled
A `.debug-logging` file also enables debug logging.
Start the server:
```sh
node server.js
```

Default address: `http://127.0.0.1:8787`

Environment variables:

```sh
UPSTREAM_BASE_URL=https://nano-gpt.com/api/v1
PROXY_HOST=127.0.0.1
PROXY_PORT=8787
BRIDGE_PROTOCOL=object
BRIDGE_MODELS="glm-5,kimi-k2.5"
node server.js
```

Server logging is off by default. Enable for one run:

```powershell
$env:NANO_PROXY_DEBUG = "1"
node server.js
```

Or toggle persistently on Windows:

```powershell
./toggle-debug.ps1
```

Server logs are written to `Logs/` as one structured session log per server run.
Health check:

```sh
curl http://127.0.0.1:8787/health
```

Example response:

```json
{
  "ok": true,
  "mode": "object-bridge",
  "port": 8787,
  "upstream": "https://nano-gpt.com/api/v1",
  "debugLogs": false
}
```

When debug logs are enabled, the response also includes `logDir`.
NanoProxy server mode works in Docker.
Build and run:

```sh
docker build -t nanoproxy .
docker run --rm -p 8787:8787 nanoproxy
```

Or with Compose:

```sh
docker compose up --build
```

Compose uses the same environment model as the server, so you can add values like BRIDGE_PROTOCOL, BRIDGE_MODELS, or NANO_PROXY_DEBUG there when needed.
Plugin logging:
- off by default
- enabled by `NANOPROXY_DEBUG=1|true` or `.debug-logging`
- raw artifacts additionally enabled by `NANOPROXY_RAW_LOGS=1|true`
- logs go to the temp folder under `nanoproxy-plugin-logs`
- one structured session log per plugin run

Server logging:
- off by default
- enabled by `NANO_PROXY_DEBUG=1|true` or `.debug-logging`
- logs go to `Logs/`
- one structured session log per server run
Key behavior:
- bridge activates only for tool-enabled requests
- requests without tools pass through unchanged
- object bridge is the default and the XML bridge remains available as an alternative protocol
- bridged output is converted back into normal OpenAI-style response fields
- invalid empty bridged turns are treated as protocol failures, not silent successes
- NanoProxy performs one retry for the specific invalid-empty bridged-turn case: no visible content and no tool call
- native-first passthrough is accepted only when the upstream response already looks structurally valid
- idle bridged SSE streams send keepalive comment frames so clients do not time out as quickly
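The keepalive behavior in the last point relies on the server-sent events rule that any line starting with `:` is a comment the client must ignore. A minimal sketch, with an illustrative frame text and interval (NanoProxy's actual values may differ):

```javascript
// An SSE comment frame: ignored by EventSource-style clients, but it
// keeps the connection visibly alive while the upstream model is idle.
function keepaliveFrame() {
  return ': keepalive\n\n';
}

// Emit a comment frame every `intervalMs` until the returned stop() is called.
function startKeepalive(write, intervalMs = 15000) {
  const timer = setInterval(() => write(keepaliveFrame()), intervalMs);
  return () => clearInterval(timer);
}
```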
```
NanoProxy/
|-- Dockerfile
|-- docker-compose.yml
|-- package.json
|-- README.md
|-- selftest.js
|-- server.js
|-- src/
|   |-- core.js
|   |-- object_bridge.js
|   `-- plugin.mjs
`-- toggle-debug.ps1
```
```sh
node --check src/core.js
node --check src/object_bridge.js
node --check src/plugin.mjs
node --check server.js
node selftest.js
```

Or:

```sh
npm run check
```