@wandgx/comfy-bridge is a standalone TypeScript router/adaptor layer for submitting ComfyUI workflows to:
- local ComfyUI instances
- ComfyUI Cloud
It keeps provider-specific transport details behind a stable package API and returns routing metadata that UI layers can surface directly.
This package is intentionally transport-focused.
It does:
- choose between `local`, `cloud`, and `auto` modes
- prefer a selected local instance when one is configured
- perform local-first routing in `auto`
- fall back to cloud for specific local failures when allowed
- normalize uploads, results, and errors across providers
- expose GUI-friendly metadata about which provider was actually used
It does not:
- generate workflows for you
- manage billing, projects, or WandGx business logic
- orchestrate load balancing or advanced scheduling
- guarantee identical semantics across every ComfyUI server build
- Node `>=18`
```bash
npm install @wandgx/comfy-bridge
```

This repo is designed to be a production-leaning bridge package, not a speculative demo.
The implementation and tests in this repo currently verify:
- local-first routing and cloud fallback rules
- stable `providerRequested` / `providerUsed` / `fallbackTriggered` / `fallbackReason` metadata
- local instance tracking through `localInstanceId`
- upload-to-workflow rewriting before submission
- normalized output parsing for local history and cloud job results
- normalized error classification for connection, auth, timeout, upload, cancel, and execution failures
- local progress via websocket with polling fallback
- cloud progress via polling, with websocket treated as best-effort
What is still intentionally bounded is listed in Known limitations.
This package currently relies on these local endpoints/behaviors:
- `GET /system_stats`
- `POST /prompt`
- `GET /history/:prompt_id`
- `POST /upload/image`
- `GET /view`
- `GET /ws?clientId=...`
Important local behavior in this package:
- submissions include a generated `client_id`
- websocket progress does not send an undocumented subscribe message
- if websocket progress fails, the adapter falls back to polling `history`
- file uploads are routed through `/upload/image` with `type=input` rather than relying on `/upload/file`
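The submit-time `client_id` behavior above can be sketched as follows. The helper and type names are illustrative, not part of the package API; only the payload field names (`prompt`, `client_id`) mirror the stock ComfyUI `/prompt` endpoint:

```typescript
import { randomUUID } from 'node:crypto';

interface LocalPromptBody {
  prompt: Record<string, unknown>;
  client_id: string;
}

// Illustrative helper: builds a local /prompt submission body.
function buildLocalPromptBody(workflow: Record<string, unknown>): LocalPromptBody {
  return {
    prompt: workflow,
    // A fresh client_id per submission lets websocket progress events
    // be correlated back to this submit.
    client_id: randomUUID(),
  };
}
```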
This package now targets the documented Comfy Cloud base URL and auth shape:
- base URL: `https://cloud.comfy.org`
- auth header: `X-API-Key`
This package currently uses these cloud endpoints:
- `GET /api/queue` for health/auth reachability
- `POST /api/prompt`
- `GET /api/job/:id/status`
- `GET /api/jobs/:id`
- `POST /api/upload/image`
- `GET /api/view`
- `POST /api/queue` for queued-job deletion
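As a sketch of the base URL default and auth header shape described above (the helper name and return shape are illustrative, not the package's internals):

```typescript
// Default matches the documented Comfy Cloud base URL above.
const DEFAULT_CLOUD_BASE_URL = 'https://cloud.comfy.org';

// Illustrative helper: resolves a cloud endpoint URL and the X-API-Key header.
function cloudRequestInit(
  path: string,
  apiKey: string,
  baseUrl: string = DEFAULT_CLOUD_BASE_URL,
): { url: string; headers: Record<string, string> } {
  return {
    url: new URL(path, baseUrl).toString(),
    headers: { 'X-API-Key': apiKey },
  };
}
```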
| Mode | Behavior |
|---|---|
| `local` | Use local only. Fail if a usable local instance is not configured. |
| `cloud` | Use cloud only. Fail if cloud is not configured or auth fails. |
| `auto` | Prefer local, then fall back to cloud only when configured and allowed. |
`local` mode:

- no cloud preflight
- no fallback
- `localInstanceId` is the resolved local instance id

`cloud` mode:

- no local preflight
- no fallback
- `localInstanceId` is `undefined`
The router does this:
- resolve the preferred enabled local instance
- health-check that local instance
- use local if healthy
- otherwise fall back to cloud only if `fallbackToCloud` is enabled and cloud is configured
The router can also retry on cloud after local submission fails with a connection error when:
- `mode` is `auto`
- `retryOnConnectionFailure` is `true`
- cloud is configured
The currently emitted fallback reasons are:
- `local_unhealthy`
- `local_connection_failed`
- `local_timeout`
- `local_submission_error`
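The `auto` routing rules above can be restated as a small pure function. This is an illustrative reimplementation of the documented decision, not the package's internal code, and it only models the health-check path (the `local_unhealthy` reason); the connection/timeout/submission-error reasons arise later, during actual submission:

```typescript
type FallbackReason =
  | 'local_unhealthy'
  | 'local_connection_failed'
  | 'local_timeout'
  | 'local_submission_error';

interface RouteDecision {
  providerUsed: 'local' | 'cloud';
  fallbackTriggered: boolean;
  fallbackReason?: FallbackReason;
}

// Illustrative sketch of the auto-mode decision after the local health check.
function routeAuto(opts: {
  localHealthy: boolean;
  fallbackToCloud: boolean;
  cloudConfigured: boolean;
}): RouteDecision {
  if (opts.localHealthy) {
    // Healthy local wins; no fallback metadata is set.
    return { providerUsed: 'local', fallbackTriggered: false };
  }
  if (opts.fallbackToCloud && opts.cloudConfigured) {
    return {
      providerUsed: 'cloud',
      fallbackTriggered: true,
      fallbackReason: 'local_unhealthy',
    };
  }
  throw new Error('no usable provider: local unhealthy and cloud fallback unavailable');
}
```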
```typescript
import { createComfyBridge } from '@wandgx/comfy-bridge';

const bridge = createComfyBridge({
  mode: 'auto',
  preferredLocalInstanceId: 'main-gpu',
  fallbackToCloud: true,
  retryOnConnectionFailure: true,
  localTimeoutMs: 60_000,
  localInstances: [
    {
      id: 'main-gpu',
      name: 'Main GPU',
      baseUrl: 'http://127.0.0.1:8188',
    },
  ],
  cloud: {
    apiKey: process.env.COMFY_CLOUD_API_KEY,
  },
});
```

- `cloud.baseUrl` is optional and defaults to `https://cloud.comfy.org`
- keep your cloud API key in environment variables or other secure server-side config
- the bridge does not inject secrets into frontend code for you
```typescript
import { createComfyBridge } from '@wandgx/comfy-bridge';

const bridge = createComfyBridge({
  mode: 'auto',
  fallbackToCloud: true,
  retryOnConnectionFailure: true,
  localTimeoutMs: 60_000,
  localInstances: [
    { id: 'local-1', name: 'Local', baseUrl: 'http://127.0.0.1:8188' },
  ],
  cloud: {
    apiKey: process.env.COMFY_CLOUD_API_KEY,
  },
});

const submission = await bridge.submitWorkflow({
  workflow: {
    '3': {
      class_type: 'KSampler',
      inputs: {},
    },
  },
});

console.log(submission.promptId);
console.log(submission.usage.providerUsed);
console.log(submission.usage.fallbackTriggered);
console.log(submission.usage.fallbackReason);
```

Uploads are not cosmetic in this package.
Before submission:
- files and images are uploaded through the selected provider adapter
- provider upload responses are normalized to `{ filename, subfolder?, type? }`
- matching references inside the workflow JSON are rewritten before submit
That means this package does not upload a file and then submit stale workflow references.
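The rewrite step can be sketched like this, assuming uploads have already been normalized to the shape above. The function name and the exact matching rule (string equality against the original client-side name) are illustrative, not the package's internal algorithm:

```typescript
interface NormalizedUpload {
  filename: string;
  subfolder?: string;
  type?: string;
}

// Illustrative sketch: replace workflow inputs that still reference the
// original client-side file name with the provider's normalized filename.
function rewriteUploadRefs(
  workflow: Record<string, { inputs?: Record<string, unknown> }>,
  originalName: string,
  upload: NormalizedUpload,
): void {
  for (const node of Object.values(workflow)) {
    if (!node.inputs) continue;
    for (const [key, value] of Object.entries(node.inputs)) {
      if (value === originalName) {
        // Prefix the subfolder when the provider returned one.
        node.inputs[key] = upload.subfolder
          ? `${upload.subfolder}/${upload.filename}`
          : upload.filename;
      }
    }
  }
}
```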
Local:

- preferred path: websocket progress tied to the submit-time `client_id`
- fallback path: polling `GET /history/:prompt_id`

Cloud:

- reliable path in this package: polling `GET /api/job/:id/status`
- websocket support is attempted when a runtime `WebSocket` implementation exists, but polling remains the trusted fallback path
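A minimal sketch of which transport progress is trusted from, per the rules above. The function name is illustrative; for cloud, a best-effort websocket may additionally be attempted, but polling stays authoritative:

```typescript
// Illustrative sketch of the trusted progress transport per provider.
function trustedProgressTransport(
  provider: 'local' | 'cloud',
  hasWebSocket: boolean,
): 'websocket' | 'polling' {
  // Cloud progress is trusted via status polling regardless of websocket
  // availability; websocket is only best-effort there.
  if (provider === 'cloud') return 'polling';
  // Local prefers websocket when a runtime implementation exists.
  return hasWebSocket ? 'websocket' : 'polling';
}
```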
Use this when you want the simpler public input shape.
```typescript
const result = await bridge.submitWorkflow(
  {
    workflow,
    files: [
      {
        name: 'input.png',
        data: imageBlob,
        contentType: 'image/png',
      },
    ],
  },
  {
    mode: 'auto',
  }
);

console.log(result.promptId);
console.log(result.usage.providerUsed);
```

Use this when you want explicit `images` and `files` arrays.
```typescript
const result = await bridge.submit({
  workflow,
  images: [
    {
      filename: 'input.png',
      data: imageBlob,
      contentType: 'image/png',
    },
  ],
});
```

```typescript
const result = await bridge.submitAndWait(
  { workflow },
  {
    onProgress(progress) {
      console.log(progress.currentNode, progress.progress);
    },
  }
);

if (result.status === 'completed') {
  console.log(result.outputs);
}
```

```typescript
const status = await bridge.getStatus('job-123', 'local', 'local-1');

console.log(status.state);
console.log(status.usage?.localInstanceId);
```

Important:
- `getStatus` is stateless
- pass the provider and local instance you actually used, usually from prior usage metadata
- `GenerationStatus.state` does not have a `cancelled` variant, so cancelled jobs are surfaced as `failed` at this layer
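Because `getStatus` is stateless, one pattern is to record the usage metadata from a submission and replay it later. The helper below is an illustrative way to turn recorded metadata into `getStatus` arguments; the name and tuple shape are not part of the package API:

```typescript
interface RecordedUsage {
  providerUsed: 'local' | 'cloud';
  localInstanceId?: string;
}

// Illustrative sketch: derive getStatus(promptId, provider, localInstanceId?)
// arguments from recorded submission metadata.
function statusArgs(
  promptId: string,
  usage: RecordedUsage,
): [string, 'local' | 'cloud', string | undefined] {
  return [
    promptId,
    usage.providerUsed,
    // Only pass a local instance id when local actually ran the job.
    usage.providerUsed === 'local' ? usage.localInstanceId : undefined,
  ];
}
```

Usage would then look like `bridge.getStatus(...statusArgs(submission.promptId, submission.usage))`.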
```typescript
const result = await bridge.getResult('job-123', 'cloud');

console.log(result.status);
console.log(result.outputs);
```

```typescript
await bridge.watchProgress('job-123', 'local', (progress) => {
  console.log(progress.stepsCompleted, progress.totalSteps, progress.progress);
}, 'local-1');
```

```typescript
await bridge.cancel('job-123', 'local', 'local-1');
```

Every submission returns GUI-friendly usage data.
```typescript
interface ProviderUsageMetadata {
  providerRequested: 'local' | 'cloud' | 'auto';
  providerUsed: 'local' | 'cloud';
  fallbackTriggered: boolean;
  fallbackReason?: string;
  localInstanceId?: string;
}
```

Interpretation:

- `providerRequested`: what you asked the router to do
- `providerUsed`: what actually ran the job
- `fallbackTriggered`: whether cloud was used as a fallback rather than the initial route
- `fallbackReason`: why that fallback happened
- `localInstanceId`: the resolved local instance id, when local routing was involved
For direct `getStatus()` / `getResult()` calls, the bridge uses the provider and local instance id you pass in rather than trying to reconstruct original submission routing.
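Since this metadata is meant to be GUI-friendly, a switcher UI might collapse it into a short label. This sketch repeats the `ProviderUsageMetadata` interface for self-containment; the label wording is an illustrative choice, not something the package emits:

```typescript
interface ProviderUsageMetadata {
  providerRequested: 'local' | 'cloud' | 'auto';
  providerUsed: 'local' | 'cloud';
  fallbackTriggered: boolean;
  fallbackReason?: string;
  localInstanceId?: string;
}

// Illustrative sketch: render usage metadata as a one-line UI label.
function usageLabel(usage: ProviderUsageMetadata): string {
  if (usage.fallbackTriggered) {
    return `cloud (fallback: ${usage.fallbackReason ?? 'unknown'})`;
  }
  return usage.providerUsed === 'local'
    ? `local (${usage.localInstanceId ?? 'unknown instance'})`
    : 'cloud';
}
```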
All thrown package errors are normalized into a `ComfyBridgeError` shape.
```typescript
interface ComfyBridgeError {
  code: ErrorCode;
  message: string;
  provider?: 'local' | 'cloud';
  cause?: Error;
  context?: Record<string, unknown>;
}
```

Common error codes include:

- `NO_LOCAL_PROVIDER`
- `LOCAL_UNHEALTHY`
- `CLOUD_UNAVAILABLE`
- `AUTH_ERROR`
- `CONNECTION_ERROR`
- `TIMEOUT_ERROR`
- `INVALID_WORKFLOW`
- `UPLOAD_ERROR`
- `JOB_NOT_FOUND`
- `CANCEL_ERROR`
- `EXECUTION_ERROR`
- `INVALID_RESPONSE`
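One way a caller might classify these codes for retry UX. The grouping below is an illustrative judgment call on the caller's side, not something the package prescribes:

```typescript
// Illustrative sketch: codes that typically indicate a transient condition.
const RETRYABLE_CODES = new Set([
  'CONNECTION_ERROR',
  'TIMEOUT_ERROR',
  'LOCAL_UNHEALTHY',
  'CLOUD_UNAVAILABLE',
]);

function isRetryable(code: string): boolean {
  // Auth, validation, and execution failures usually need user action,
  // not an automatic retry.
  return RETRYABLE_CODES.has(code);
}
```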
Example:
```typescript
import { isComfyBridgeError } from '@wandgx/comfy-bridge';

try {
  await bridge.submitWorkflow({ workflow });
} catch (error) {
  if (isComfyBridgeError(error)) {
    console.error(error.code, error.provider, error.message, error.context);
  }
}
```

This package exposes two small helpers intended for provider-switcher UIs.
Returns configuration-facing state:
```typescript
const state = bridge.getUISwitcherState();
```

Shape:

- `mode`
- `fallbackEnabled`
- `preferredLocalUrl`
Returns runtime-facing status:
```typescript
const runtime = await bridge.getUISwitcherRuntimeInfo();
```

Shape:

- `providerUsed`
- `statusBadge`
- `fallbackReason`
- `lastChecked`
- cloud progress is trusted through status polling; websocket progress is best-effort
- cloud running-job cancellation is intentionally not a targeted interrupt: the documented cloud API exposes queue deletion for queued jobs, but no documented per-job interrupt for in-progress execution
- local queued-job cancellation uses the Comfy-compatible `/queue` deletion shape; exact behavior can still depend on the server build in front of you
- this package does not preserve original submission routing automatically if you later call `getStatus()` or `getResult()` without the recorded provider metadata
- output normalization covers common image/audio/video collections, not every possible custom node payload shape
```bash
npm run typecheck
npm test
npm run build
```

Current repository checks exercised during this hardening pass:

- `npm run typecheck`
- `npm test`
- WandGx billing logic
- WandGx project management logic
- workflow authoring UX
- template systems
- multi-instance load balancing
- advanced scheduler/orchestrator behavior
MIT