Multi-agent workflow orchestration extension for pi coding agent.
Orchestrate complex multi-agent AI workflows directly from your pi terminal session. Define task pipelines with dependency ordering, run structured discovery interviews, generate execution plans, and dispatch parallel subagents — all as pi slash commands.
Built on patterns from oh-my-pi (swarm pipelines, DAG execution) and tallow (teams tool, subagent dispatch).
```sh
# As a pi package
pi install @sandalsoft/dorkestrator

# Or place in your project's extensions directory
cp -r src/ .pi/extensions/dorkestrator/
```

| Command | Description |
|---|---|
| `/interview [topic]` | Run a structured discovery Q&A to gather project requirements |
| `/plan [description]` | Generate a multi-agent execution plan from interview answers or context |
| `/review` | Review the current plan (approve / modify / reject) |
| `/orchestrate` | Execute the approved plan using parallel subagents with DAG ordering |
| `/swarm run <file.yaml>` | Execute a YAML-defined multi-agent pipeline |
| `/swarm status` | Show current swarm execution status |
| `/dork-status` | Show lifecycle state, context, and plan |
| `/dork-reset` | Reset state for a new workflow |
| `/dork-load-plan <file.json>` | Load an execution plan from JSON |
The core workflow runs `/interview` → `/plan` → `/review` → `/orchestrate`:
- Interview — Structured Q&A collects project requirements
- Plan — LLM generates a task dependency graph
- Review — Approve, modify, or reject the plan
- Orchestrate — Conductor topologically sorts tasks into waves and dispatches subagents in parallel
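The four phases above form a forward-only state machine. The sketch below is one illustrative reading of that progression, not the extension's actual `DEFAULT_PHASE_GRAPH`; in particular, the review-to-planning loop (for "modify") is an assumption:

```typescript
// Illustrative phase machine; the real DEFAULT_PHASE_GRAPH may differ.
const PHASES = ["init", "interview", "planning", "review", "executing", "completed"] as const;
type Phase = (typeof PHASES)[number];

function canAdvance(from: Phase, to: Phase): boolean {
  const i = PHASES.indexOf(from);
  const j = PHASES.indexOf(to);
  // Forward one step at a time; review may loop back to planning on "modify"
  // (an assumption for this sketch).
  return j === i + 1 || (from === "review" && to === "planning");
}
```

An event-sourced engine would record each transition as an event and derive the current phase by replaying them, which is what makes `/dork-status` and `/dork-reset` cheap to implement.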
Define multi-agent pipelines declaratively:

```yaml
swarm:
  name: "research-pipeline"
  mode: pipeline # pipeline | parallel | sequential
  agents:
    researcher:
      role: "Research Analyst"
      task: "Research the topic thoroughly"
    analyst:
      role: "Data Analyst"
      task: "Analyze the research findings"
      waits_for: [researcher]
    writer:
      role: "Technical Writer"
      task: "Write a comprehensive report"
      waits_for: [analyst]
```

Run with: `/swarm run pipeline.yaml`
Modes:
- pipeline — Implicit sequential ordering by declaration; explicit `waits_for` overrides
- parallel — All agents run concurrently unless constrained by `waits_for`
- sequential — Strict declaration-order execution
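All three modes reduce to one dependency-resolution rule per agent. The helper below is a hypothetical sketch of that rule, not the extension's internal resolver, and it relies on object-key declaration order:

```typescript
// Hypothetical resolver sketch: maps each agent to its effective dependencies.
type AgentDef = { waits_for?: string[] };
type Mode = "pipeline" | "parallel" | "sequential";

function resolveDeps(agents: Record<string, AgentDef>, mode: Mode): Record<string, string[]> {
  const names = Object.keys(agents); // declaration order
  const deps: Record<string, string[]> = {};
  names.forEach((name, i) => {
    const explicit = agents[name].waits_for;
    if (mode === "sequential") {
      deps[name] = i > 0 ? [names[i - 1]] : []; // strict declaration order
    } else if (mode === "parallel") {
      deps[name] = explicit ?? []; // concurrent unless constrained
    } else {
      // pipeline: previous agent by default, explicit waits_for overrides
      deps[name] = explicit ?? (i > 0 ? [names[i - 1]] : []);
    }
  });
  return deps;
}
```

Under this reading, the YAML example above behaves identically in `pipeline` and `sequential` modes, since its explicit `waits_for` entries match declaration order anyway.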
```
┌─────────────────────────────────────────┐
│ pi coding agent (ExtensionAPI)          │
│  ┌───────────────────────────────────┐  │
│  │ dorkestrator extension            │  │
│  │  ┌──────────┐   ┌──────────────┐  │  │
│  │  │Lifecycle │   │  Conductor   │  │  │
│  │  │ Engine   │   │ (DAG waves)  │  │  │
│  │  └────┬─────┘   └──────┬───────┘  │  │
│  │       │                │          │  │
│  │  ┌────┴─────┐   ┌──────┴───────┐  │  │
│  │  │ Shared   │   │ Subagent     │  │  │
│  │  │ Context  │   │ Executor     │  │  │
│  │  └──────────┘   └──────────────┘  │  │
│  │                                   │  │
│  │  Commands: /interview /plan       │  │
│  │  /review /orchestrate /swarm      │  │
│  └───────────────────────────────────┘  │
└─────────────────────────────────────────┘
```
- Conductor — Builds execution waves from the task DAG via topological sort. Tasks in the same wave run in parallel; waves execute sequentially.
- Shared Context — Key-value store with write-on-complete semantics. Step results are namespaced as `step.<id>.output`.
- Lifecycle Engine — Event-sourced state machine: init → interview → planning → review → executing → completed.
- Swarm Definitions — YAML-driven multi-agent pipelines following oh-my-pi's schema pattern.
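Wave construction is a plain topological sort over the task DAG. A self-contained sketch of the idea behind `buildExecutionWaves` (Kahn-style; the real implementation's types and error handling may differ):

```typescript
// Sketch of wave construction: each wave is the set of tasks whose
// dependencies have all completed in earlier waves.
type TaskNode = { id: string; dependsOn: string[] };

function wavesFromDag(tasks: TaskNode[]): string[][] {
  const remaining = new Map(tasks.map((t) => [t.id, new Set(t.dependsOn)]));
  const waves: string[][] = [];
  while (remaining.size > 0) {
    const ready = [...remaining]
      .filter(([, deps]) => deps.size === 0)
      .map(([id]) => id);
    if (ready.length === 0) throw new Error("Cycle detected in task graph");
    waves.push(ready);
    for (const id of ready) remaining.delete(id);
    // Mark the finished wave as satisfied for everyone still waiting.
    for (const [, deps] of remaining) for (const id of ready) deps.delete(id);
  }
  return waves;
}

// Diamond graph: a fans out to b and c, which join at d.
const waves = wavesFromDag([
  { id: "a", dependsOn: [] },
  { id: "b", dependsOn: ["a"] },
  { id: "c", dependsOn: ["a"] },
  { id: "d", dependsOn: ["b", "c"] },
]);
// waves: [["a"], ["b", "c"], ["d"]]
```

Grouping by readiness rather than emitting one task at a time is what lets the Conductor run each wave's tasks in parallel while keeping waves strictly ordered.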
| dorkestrator concept | pi primitive |
|---|---|
| Slash commands | `pi.registerCommand()` |
| Task execution | `pi.exec()` spawning subagent processes |
| User interaction | `ctx.ui.input()`, `ctx.ui.select()`, `ctx.ui.confirm()` |
| Status updates | `ctx.ui.notify()`, `ctx.ui.setWorkingMessage()` |
| Session state | `pi.sendMessage()` for conversation-visible output |
| Session reset | `pi.on("session_start")` |
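As a rough illustration of how these primitives compose, here is what wiring `/dork-reset` might look like. The callback signature and `ctx` shape below are assumptions for this sketch, not pi's documented ExtensionAPI:

```typescript
// HYPOTHETICAL types: pi's real ExtensionAPI signatures may differ.
type Ctx = { ui: { confirm(message: string): Promise<boolean> } };
type PiApi = {
  registerCommand(name: string, handler: (ctx: Ctx, args: string) => Promise<void>): void;
  sendMessage(text: string): void;
};

// Confirm with the user, then report the reset in-conversation.
export function registerDorkReset(pi: PiApi): void {
  pi.registerCommand("dork-reset", async (ctx) => {
    if (await ctx.ui.confirm("Reset dorkestrator state for a new workflow?")) {
      pi.sendMessage("dorkestrator: state reset");
    }
  });
}
```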
The core modules can also be used as a library, independent of pi:
```ts
import {
  Conductor,
  SharedContext,
  buildExecutionWaves,
  LifecycleEngine,
  DEFAULT_PHASE_GRAPH,
  parseSwarmYaml,
  swarmToTaskDefinitions,
} from "@sandalsoft/dorkestrator";
```
```ts
// Build execution waves from tasks
const waves = buildExecutionWaves([
  { id: "a", dependsOn: [] },
  { id: "b", dependsOn: ["a"] },
  { id: "c", dependsOn: ["a"] },
  { id: "d", dependsOn: ["b", "c"] },
]);
// waves: [["a"], ["b", "c"], ["d"]]

// Execute with a custom task runner
const conductor = new Conductor({
  executeTask: async (task, ctx, signal) => ({
    id: task.id,
    status: "completed",
    output: `Done: ${task.task}`,
  }),
  maxConcurrency: 4,
});
await conductor.execute(tasks);
```

```sh
npm install
npm test          # 61 tests
npm run build     # tsup → dist/
npm run typecheck
```

MIT