Connect data sources · Build workflows · Create dashboards · Automate operations
🚀 Quick Start • 🏗️ Architecture • ✨ Features • 🔧 Development Guide • 🤝 Contributing
Jet Admin is a modular, open-source internal tools platform that allows engineering and operations teams to connect data sources, build automation workflows, and create powerful dashboards — all from a single extensible system.
Unlike traditional BI tools, Jet Admin is designed around a workflow execution engine with a plugin-based datasource architecture. This makes it suitable not just for querying data, but for building automated pipelines, internal operations tools, and multi-step data processing flows.
The system evolved from a PostgreSQL admin tool into a full internal automation platform.
| Capability | Jet Admin | Traditional BI |
|---|---|---|
| Workflow Automation | ✅ Node-based engine | ❌ |
| Plugin Datasources | ✅ 25+ connectors | ❌ |
| Internal Tool Builder | ✅ Full widget system | ❌ |
| Self-Hostable | ✅ Docker ready | ❌ |
| AI Query Support | ✅ Built-in | ❌ |
| Multi-Tenancy | ✅ RBAC + Isolation | ❌ |
Jet Admin supports 25+ datasource connectors through its package-driven plugin system.
Databases
- PostgreSQL · MySQL · MongoDB · SQLite · SQL Server · Redis · Neo4j
Cloud & SaaS
- BigQuery · Supabase · Firestore · Elasticsearch · AWS S3
APIs & Services
- REST · GraphQL · Slack · Stripe · Twilio · SendGrid
A visual node-based execution system for building multi-step automation pipelines.
Start → Query → Transform → Condition → Loop → Notify → End
Capabilities:
- ✅ Visual node-based builder
- ✅ Conditional branching logic
- ✅ Loop and iterator nodes
- ✅ Context-driven data passing
- ✅ Worker-based execution model
- ✅ Scheduled and triggered workflows
- ✅ Template variable resolution
Charts: Bar · Line · Pie · Radar · Scatter · Bubble
Data Widgets: Tables · Text · Markdown · HTML · Custom widgets
Features: Real-time updates · Export support · Responsive layouts · Drag-and-drop builder
Security: Multi-tenancy · RBAC · API authentication · Audit logs
User Management: Team invites · Permissions · Activity tracking
```mermaid
graph TD
    A[Frontend - React] -->|REST / WebSocket| B[API Gateway]
    B --> C[Module System]
    C --> D[Workflow Engine]
    D --> E[Execution Workers]
    E --> F[Datasource Connectors]
    F --> G[(PostgreSQL / External DBs)]
    C --> H[Widget System]
    C --> I[Dashboard Engine]
```
```
jet-admin/
├── apps/
│   ├── backend/                # Node.js API + Workflow Engine
│   │   ├── config/
│   │   ├── modules/
│   │   │   ├── datasource/
│   │   │   ├── dataQuery/
│   │   │   ├── workflow/
│   │   │   ├── widget/
│   │   │   └── dashboard/
│   │   ├── prisma/
│   │   └── utils/
│   └── frontend/               # React Application
│       └── src/
│           ├── data/           # API layer
│           ├── logic/          # Hooks, state, contexts
│           └── presentation/   # UI components only
│
├── packages/
│   ├── datasource-types/       # Shared connector contracts
│   ├── datasources-logic/      # Connector implementations
│   ├── widgets/                # Widget definitions
│   └── workflow-nodes/         # Node type definitions
│
└── docker-compose.cloud.yml
```
The backend uses a feature-module architecture instead of a traditional layered monolith. Each feature is fully isolated with its own controller, service, repository, and execution logic.
```
modules/
├── datasource/
│   ├── controller.js
│   ├── service.js
│   ├── repository.js
│   └── validation.js
├── workflow/
│   ├── controller.js
│   ├── service.js
│   ├── engine.js        ← Workflow execution
│   ├── workers/         ← Node workers
│   └── repository.js
└── dataQuery/
    ├── controller.js
    ├── service.js
    └── repository.js
```
Design principles enforced:
- ✅ Feature isolation — modules do not directly call each other
- ✅ Low coupling — changes in one module don't cascade
- ✅ High cohesion — all logic for a feature lives in its module
- ✅ Clear ownership — every file has a single responsibility
```mermaid
sequenceDiagram
    participant U as User
    participant E as Workflow Engine
    participant R as Input Resolver
    participant W as Worker
    participant C as Context Store
    U->>E: Trigger Workflow
    E->>E: Load Node Definitions
    E->>R: Resolve Node Inputs
    R->>C: Read Context Variables
    C-->>R: Return Values
    R-->>E: Resolved Inputs
    E->>W: Execute Worker(node, inputs)
    W-->>E: Return Result
    E->>C: Write Output to Context
    E->>E: Determine Next Nodes
    E-->>U: Execution Complete
```
The workflow engine enforces a strict contract between nodes, workers, and the engine.
| A Node MUST define | A Node MUST NOT do |
|---|---|
| Input schema | Execute business logic |
| Output schema | Mutate workflow context |
| UI configuration | Access the database directly |
| Worker reference | Call other modules |
| Validation rules | Resolve template variables |
Execution happens exclusively in workers. Nodes are metadata only.
Workers are the execution units of the workflow engine. This separation is intentional — it enables testability, strategy replacement, and future distributed execution.
```
Node Definition
      ↓
Input Resolution (Engine)
      ↓
Worker Execution
      ↓
Result Returned
      ↓
Context Updated (Engine)
      ↓
Next Nodes Triggered
```
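The lifecycle above can be sketched as a single engine step. All function and field names below (`runNode`, `resolveInputs`, `nextNodes`) are illustrative assumptions, not the real Jet Admin API:

```javascript
// Illustrative sketch of one engine step (names are assumptions, not the real API).
async function runNode(engine, node, context) {
  // Input Resolution (Engine): literals and templates are resolved before execution
  const inputs = engine.resolveInputs(node, context);
  // Worker Execution: the worker receives only pre-resolved inputs
  const output = await node.worker(node, inputs, context);
  // Context Updated (Engine): only the engine writes results back
  context[node.id] = output;
  // Next Nodes Triggered
  return engine.nextNodes(node, output);
}
```

Note that the worker never touches `context` itself; the engine performs the single write after the worker returns.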
Worker contract:
```javascript
// ✅ Correct worker pattern
export async function executeQueryWorker(node, resolvedInputs, context) {
  const result = await queryService.execute(
    resolvedInputs.queryId,
    resolvedInputs.params
  );
  return { result }; // Engine writes this to context
}
```

Rules workers must follow:
- ✅ Return deterministic output
- ✅ Accept only pre-resolved inputs
- ❌ Never mutate context directly
- ❌ Never call the workflow engine
- ❌ Never resolve template variables internally
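Because workers accept only pre-resolved inputs and never call the engine, they can be unit-tested without any workflow running. A sketch, assuming a small factory and a stubbed `queryService` (both illustrative, not the real modules):

```javascript
// Sketch: injecting the service makes the worker trivially testable
// (makeQueryWorker and the stub are illustrative, not real Jet Admin modules).
function makeQueryWorker(queryService) {
  return async function executeQueryWorker(node, resolvedInputs) {
    const result = await queryService.execute(
      resolvedInputs.queryId,
      resolvedInputs.params
    );
    return { result }; // deterministic output; the engine writes it to context
  };
}

// In a unit test, a stub replaces any real datasource:
const stubService = { execute: async (queryId, params) => [{ queryId, ...params }] };
const worker = makeQueryWorker(stubService);
```

The same pattern is what makes strategy replacement possible: swapping the service swaps the behavior without touching the worker's contract.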
Workflow execution is driven by a context-propagation model. Each node reads from context and writes results back through the engine.
Context structure:
```javascript
context = {
  input: { customerId: 15 },
  node_query_1: { result: [...] },
  node_filter_2: { filtered: [...] }
}
```

Input types supported:
| Type | Example |
|---|---|
| Literal | `"value": 25` |
| Template | `"value": "{{node_query_1.result}}"` |
Resolution flow:
```
Detect Input Type
      ↓
If Literal → Return Value Immediately
      ↓
If Template → Parse Reference
      ↓
Resolve from Context
      ↓
Return Evaluated Value to Worker
```
Resolution always happens before worker execution — never inside workers.
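A minimal resolver following this flow might look like the sketch below; the regex and function name are illustrative assumptions, not the engine's actual implementation:

```javascript
// Hypothetical resolver for the flow above (regex and names are illustrative).
const TEMPLATE_RE = /^\{\{(.+)\}\}$/;

function resolveInput(value, context) {
  // Non-strings are literals: return immediately.
  if (typeof value !== "string") return value;
  const match = value.match(TEMPLATE_RE);
  // Strings without a {{...}} wrapper are literals too.
  if (!match) return value;
  // Template: walk the dotted reference (e.g. "node_query_1.result") through context.
  return match[1].trim().split(".").reduce((obj, key) => obj?.[key], context);
}
```

Keeping this in the engine means every worker sees fully evaluated values and stays deterministic.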
Extensibility is achieved through package-driven plugins. The core system never depends on plugin implementations. Plugins depend on core contracts.
```
packages/
├── datasources-logic/
│   ├── postgres/
│   ├── mysql/
│   ├── mongodb/
│   └── ...
├── workflow-nodes/
│   ├── queryNode/
│   ├── transformNode/
│   └── conditionNode/
└── widgets/
    ├── barChart/
    ├── table/
    └── ...
```
Every connector must implement the standard contract:
```javascript
export class PostgresConnector {
  async connect(config) { /* ... */ }
  async disconnect() { /* ... */ }
  async execute(query, params) { /* normalized result */ }
  async validate(config) { /* boolean */ }
  async healthCheck() { /* status */ }
}
```

Rules:
- ✅ Return normalized results only
- ✅ Handle connection errors internally
- ❌ No workflow logic inside connectors
- ❌ No cross-connector dependencies
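A toy in-memory connector shows what honoring the contract looks like end to end. The normalized result shape (`{ rows, rowCount }`) and the class itself are assumptions for illustration, not the real connector API:

```javascript
// Hypothetical in-memory connector implementing the contract above.
// The normalized result shape ({ rows, rowCount }) is an assumption.
class MemoryConnector {
  async connect(config) {
    this.tables = config.tables ?? {};
    this.connected = true;
  }
  async disconnect() {
    this.connected = false;
  }
  // "query" is just a table name in this toy example.
  async execute(query) {
    try {
      const rows = this.tables[query] ?? [];
      return { rows, rowCount: rows.length }; // normalized result only
    } catch (error) {
      // Connection/query errors are handled internally, never thrown to the engine.
      return { rows: [], rowCount: 0, error: String(error) };
    }
  }
  async validate(config) {
    return typeof config === "object" && config !== null;
  }
  async healthCheck() {
    return this.connected ? "ok" : "down";
  }
}
```

Because every connector returns the same shape, the query module can stay connector-agnostic.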
```javascript
export const QueryNode = {
  type: "dataQuery",
  name: "Execute Query",
  inputs: {
    queryId: { type: "string", required: true },
    params: { type: "object" }
  },
  outputs: {
    result: "array"
  },
  worker: executeQueryWorker,
  uiConfig: { /* form schema */ }
};
```

```jsx
export const BarChartWidget = {
  type: "barChart",
  configSchema: { /* JSON Schema */ },
  render(data, config) {
    return <BarChart data={data} options={config} />;
  }
};
```

```json
{
  "id": "sales-report-workflow",
  "name": "Monthly Sales Report",
  "nodes": [
    { "id": "start", "type": "start" },
    { "id": "query_1", "type": "dataQuery", "inputs": { "queryId": "fetchOrders" } },
    { "id": "filter_2", "type": "transform", "inputs": { "data": "{{query_1.result}}", "filter": "last30days" } },
    { "id": "agg_3", "type": "aggregate", "inputs": { "data": "{{filter_2.filtered}}", "field": "revenue" } },
    { "id": "chart_4", "type": "generateChart", "inputs": { "data": "{{agg_3.aggregated}}", "type": "bar" } },
    { "id": "end", "type": "end" }
  ],
  "edges": [
    { "from": "start", "to": "query_1" },
    { "from": "query_1", "to": "filter_2" },
    { "from": "filter_2", "to": "agg_3" },
    { "from": "agg_3", "to": "chart_4" },
    { "from": "chart_4", "to": "end" }
  ]
}
```

| Category | Convention | Example |
|---|---|---|
| Functions | camelCase | `executeWorkflow()` |
| Classes | PascalCase | `WorkflowEngine` |
| Constants | UPPER_SNAKE | `MAX_RETRY_COUNT` |
| Files | kebab-case | `workflow-engine.js` |
| DB columns | snake_case (Prisma mapped) | `created_at` |
```javascript
// ✅ Good — single responsibility, deterministic
async function resolveNodeInputs(node, context) {
  return node.inputs.map(input => resolveInput(input, context));
}

// ❌ Bad — mixed responsibilities
async function resolveAndExecuteAndStore(node, context) {
  const inputs = resolve(node, context); // resolution
  const result = await execute(inputs);  // execution
  await db.save(result);                 // storage
}
```

```javascript
// ✅ Correct — errors propagate with context
try {
  await executeWorker(node, resolvedInputs);
} catch (error) {
  logger.error({ nodeId: node.id, executionId, error });
  throw new WorkflowExecutionError(node.id, executionId, error.message);
}

// ❌ Wrong — swallowed error, silent failure
try {
  await executeWorker(node, resolvedInputs);
} catch (e) {
  return null;
}
```

- **Rule:** Controller → Service → Repository → DB
- **Never:** Controller → DB directly
- **Never:** Worker → DB directly
- **Always:** Use Prisma models only
- **Always:** Wrap multi-step operations in transactions
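The transaction rule can be sketched with Prisma's interactive `$transaction` API. The model names (`workflow`, `node`) are assumptions for illustration; the callback-style transaction itself is standard Prisma:

```javascript
// Sketch: multi-step create wrapped in one transaction.
// Model names are assumptions; prisma.$transaction with a callback
// is the standard Prisma interactive-transaction API.
async function createWorkflowWithNodes(prisma, data) {
  return prisma.$transaction(async (tx) => {
    const workflow = await tx.workflow.create({ data: { name: data.name } });
    await tx.node.createMany({
      data: data.nodes.map((n) => ({ ...n, workflowId: workflow.id })),
    });
    return workflow; // both writes commit together, or neither does
  });
}
```

If the `createMany` fails, the workflow row is rolled back too, so the database never holds a half-created workflow.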
Frontend is organized into three strict layers that must never be mixed:
```
src/
├── data/          # API calls, models, transformers
├── logic/         # Hooks, state, contexts
└── presentation/  # UI components only (no logic)
```
```jsx
// ✅ Correct — component uses hook, hook uses service
const WorkflowList = () => {
  const { workflows } = useWorkflows(); // logic layer
  return workflows.map(w => <WorkflowCard key={w.id} w={w} />);
};

// ❌ Wrong — component calls API directly
const WorkflowList = () => {
  const [workflows, setWorkflows] = useState([]);
  useEffect(() => { axios.get('/api/workflows').then(r => setWorkflows(r.data)); }, []);
  return workflows.map(w => <WorkflowCard key={w.id} w={w} />);
};
```

| Anti-Pattern | Why it's wrong |
|---|---|
| Business logic in controllers | Breaks testability and reuse |
| Workers mutating context | Breaks execution determinism |
| Nodes executing DB logic | Breaks separation of concerns |
| Template resolution inside workers | Engine responsibility, not worker |
| Cross-module direct calls | Creates hidden coupling |
| Hardcoded datasource logic | Breaks plugin isolation |
| Swallowing errors silently | Hides failures during debugging |
Jet Admin enforces these design decisions deliberately:
| Decision | Reason |
|---|---|
| Workers execute, nodes define | Enables strategy replacement and testability |
| Context is read-only for workers | Prevents hidden mutations and execution chaos |
| Inputs resolved before execution | Ensures predictable, deterministic worker behavior |
| Plugins are packages, not inline code | Enables independent versioning and release |
| Module isolation enforced | Prevents cascading failures and coupling |
These are not opinions — they are guarantees the system depends on.
The architecture is designed to support distributed execution in future iterations:
Current: Synchronous Worker Execution
Next: Queue-based worker dispatch (RabbitMQ ready)
Future: Distributed workflow runners + execution snapshots
Why worker purity matters: stateless workers can be picked up by any runner — local, queued, or distributed — without code changes.
- Multi-tenancy: All queries scoped to tenant context
- RBAC: Role-based permissions enforced at service layer
- Credential storage: Datasource secrets encrypted at rest
- Query validation: All user-supplied queries validated before execution
- Execution sandboxing: Workers run in isolated execution contexts
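Tenant scoping can be enforced mechanically at the repository layer so callers cannot forget the filter. A sketch, where the model and field names (`workflow`, `tenantId`) are assumptions but `findMany`/`findFirst` mirror standard Prisma client calls:

```javascript
// Illustrative tenant scoping at the repository layer
// (model and field names are assumptions).
function createWorkflowRepository(prisma, tenantId) {
  return {
    // Every query carries the tenant filter; callers cannot omit it.
    findAll: () => prisma.workflow.findMany({ where: { tenantId } }),
    findById: (id) => prisma.workflow.findFirst({ where: { id, tenantId } }),
  };
}
```

Constructing the repository per-request from the authenticated tenant context makes cross-tenant reads structurally impossible at this layer.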
- Node.js 18+
- PostgreSQL 14+
- Docker (recommended)
- Firebase project (for auth)
```shell
git clone https://github.com/Jet-labs/jet-admin.git
cd jet-admin
cp .env.docker.example .env.docker
docker-compose -f docker-compose.cloud.yml up -d
```

Access: http://localhost:3000
Backend:

```shell
cd apps/backend
npm install
cp .env.example .env
npx prisma migrate dev
npm run dev
```

Frontend:

```shell
cd apps/frontend
npm install
npm run dev
```

Backend `.env`:

```
DATABASE_URL=postgresql://user:pass@localhost:5432/jetadmin
JWT_SECRET=your_jwt_secret
FIREBASE_PROJECT_ID=your_project_id
RABBITMQ_URL=amqp://localhost
API_PORT=4000
```

Frontend `.env`:

```
VITE_API_URL=http://localhost:4000
VITE_FIREBASE_CONFIG={"apiKey":"..."}
```

Branch naming:

```
feature/workflow-loop-node
fix/context-resolution-bug
docs/plugin-development-guide
refactor/worker-execution-pattern
```
Commit message format:

```
feat: add loop execution node
fix: resolve context variable mutation issue
docs: add datasource connector guide
refactor: extract worker strategy pattern
test: add workflow engine unit tests
```
- No business logic in controllers
- Workers don't mutate context
- Modules remain isolated
- Error handling follows established pattern
- No console.log statements
- No unused imports
- Code follows naming conventions
- Tests added for new features
1. Fork the repository
2. Create a feature branch
3. Implement changes following coding standards
4. Write/update tests
5. Open a PR with description of changes
6. Address review feedback
| Document | Description |
|---|---|
| Workflow Engine | Execution lifecycle, context model, design decisions |
| Node Spec | Node definition contract, worker patterns |
| Plugin Development | Building datasource connectors and widget plugins |
| Execution Context | Context model, template resolution, immutability rules |
| Worker Design | Worker architecture, strategy patterns, scaling |
| Contribution Rules | Coding standards, PR process, anti-patterns |
- Workflow versioning and rollback
- Visual execution debugger
- Distributed worker execution
- AI-powered workflow builder
- Plugin marketplace
- Real-time execution tracing
- Partial workflow resume
- Execution snapshots
| Layer | Technology |
|---|---|
| Frontend | React 18, Tailwind CSS, React Flow, React Query |
| Backend | Node.js, Express, Prisma ORM |
| Database | PostgreSQL |
| Auth | Firebase |
| Messaging | RabbitMQ |
| Charts | Chart.js |
| Realtime | Socket.IO |
| Infra | Docker, Linux |
MIT License — see LICENSE for details.
Built with: React · Node.js · Prisma · React Flow · Chart.js · Socket.IO · PostgreSQL
⭐ Star this repo if it helps you