Connect data sources · Build workflows · Create dashboards · Automate operations

🚀 Quick Start • 🏗️ Architecture • ✨ Features • 🔧 Development Guide • 🤝 Contributing
Jet Admin is a modular, open-source internal tools platform that allows engineering and operations teams to connect data sources, build automation workflows, and create powerful dashboards, all from a single extensible system.
Unlike traditional BI tools, Jet Admin is designed around a workflow execution engine with a plugin-based datasource architecture. This makes it suitable not just for querying data, but for building automated pipelines, internal operations tools, and multi-step data processing flows.
The system evolved from a PostgreSQL admin tool into a full internal automation platform.
| Capability | Jet Admin | Traditional BI |
|---|---|---|
| Workflow Automation | ✅ Node-based engine | ❌ |
| Plugin Datasources | ✅ 25+ connectors | |
| Internal Tool Builder | ✅ Full widget system | ❌ |
| Self-Hostable | ✅ Docker ready | |
| AI Query Support | ✅ Built-in | ❌ |
| Multi-Tenancy | ✅ RBAC + Isolation | |
Jet Admin supports 25+ datasource connectors through its package-driven plugin system.
Databases
- PostgreSQL · MySQL · MongoDB · SQLite · SQL Server · Redis · Neo4j
Cloud & SaaS
- BigQuery · Supabase · Firestore · Elasticsearch · AWS S3
APIs & Services
- REST · GraphQL · Slack · Stripe · Twilio · SendGrid
A visual node-based execution system for building multi-step automation pipelines.
Start → Query → Transform → Condition → Loop → Notify → End
Capabilities:
- ✅ Visual node-based builder
- ✅ Conditional branching logic
- ✅ Loop and iterator nodes
- ✅ Context-driven data passing
- ✅ Worker-based execution model
- ✅ Scheduled and triggered workflows
- ✅ Template variable resolution
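To make the conditional-branching capability concrete, here is a minimal sketch of what a condition node's worker could look like. The `operator` set and the `branch` output shape are illustrative assumptions, not the actual node spec:

```javascript
// Hypothetical condition-node worker: receives pre-resolved inputs
// and returns a branch decision the engine uses to pick the next edge.
async function executeConditionWorker(node, resolvedInputs) {
  const { left, operator, right } = resolvedInputs;
  const ops = {
    eq: (a, b) => a === b,
    gt: (a, b) => a > b,
    lt: (a, b) => a < b,
    contains: (a, b) => Array.isArray(a) && a.includes(b),
  };
  const compare = ops[operator];
  if (!compare) throw new Error(`Unknown operator: ${operator}`);
  // The engine reads `branch` to decide which outgoing edge to follow.
  return { branch: compare(left, right) ? "true" : "false" };
}
```

The worker itself stays pure: it never inspects the workflow graph, only its own resolved inputs.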
Charts: Bar · Line · Pie · Radar · Scatter · Bubble
Data Widgets: Tables · Text · Markdown · HTML · Custom widgets
Features: Real-time updates · Export support · Responsive layouts · Drag-and-drop builder
Security: Multi-tenancy · RBAC · API authentication · Audit logs
User Management: Team invites · Permissions · Activity tracking
```mermaid
graph TD
    A[Frontend - React] -->|REST / WebSocket| B[API Gateway]
    B --> C[Module System]
    C --> D[Workflow Engine]
    D --> E[Execution Workers]
    E --> F[Datasource Connectors]
    F --> G[(PostgreSQL / External DBs)]
    C --> H[Widget System]
    C --> I[Dashboard Engine]
```
```
jet-admin/
├── apps/
│   ├── backend/                 # Node.js API + Workflow Engine
│   │   ├── config/
│   │   ├── modules/
│   │   │   ├── datasource/
│   │   │   ├── dataQuery/
│   │   │   ├── workflow/
│   │   │   ├── widget/
│   │   │   └── dashboard/
│   │   ├── prisma/
│   │   └── utils/
│   └── frontend/                # React Application
│       └── src/
│           ├── data/            # API layer
│           ├── logic/           # Hooks, state, contexts
│           └── presentation/    # UI components only
│
├── packages/
│   ├── datasource-types/        # Shared connector contracts
│   ├── datasources-logic/       # Connector implementations
│   ├── widgets/                 # Widget definitions
│   └── workflow-nodes/          # Node type definitions
│
└── docker-compose.cloud.yml
```
The backend uses a feature-module architecture instead of a traditional layered monolith. Each feature is fully isolated with its own controller, service, repository, and execution logic.
```
modules/
├── datasource/
│   ├── controller.js
│   ├── service.js
│   ├── repository.js
│   └── validation.js
├── workflow/
│   ├── controller.js
│   ├── service.js
│   ├── engine.js        # Workflow execution
│   ├── workers/         # Node workers
│   └── repository.js
└── dataQuery/
    ├── controller.js
    ├── service.js
    └── repository.js
```
Design principles enforced:
- ✅ Feature isolation: modules do not directly call each other
- ✅ Low coupling: changes in one module don't cascade
- ✅ High cohesion: all logic for a feature lives in its module
- ✅ Clear ownership: every file has a single responsibility
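The document doesn't prescribe how isolated modules communicate, but one common way to honor the no-direct-calls rule is an in-process event bus. The sketch below is purely illustrative; `moduleBus` and the event names are hypothetical, not Jet Admin APIs:

```javascript
// Hypothetical event bus keeping feature modules decoupled: the workflow
// module announces events; other modules subscribe without importing
// workflow internals.
function createBus() {
  const handlers = {};
  return {
    on(event, fn) {
      (handlers[event] ||= []).push(fn);
    },
    emit(event, payload) {
      for (const fn of handlers[event] || []) fn(payload);
    },
  };
}

const moduleBus = createBus();

// An audit module subscribes; it never imports the workflow module.
const auditLog = [];
moduleBus.on("workflow.completed", ({ workflowId }) =>
  auditLog.push(`completed:${workflowId}`)
);

// The workflow module emits without knowing who listens.
moduleBus.emit("workflow.completed", { workflowId: "sales-report-workflow" });
```

With this shape, removing the audit module never touches workflow code, which is exactly the low-coupling guarantee the list above describes.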
```mermaid
sequenceDiagram
    participant U as User
    participant E as Workflow Engine
    participant R as Input Resolver
    participant W as Worker
    participant C as Context Store
    U->>E: Trigger Workflow
    E->>E: Load Node Definitions
    E->>R: Resolve Node Inputs
    R->>C: Read Context Variables
    C-->>R: Return Values
    R-->>E: Resolved Inputs
    E->>W: Execute Worker(node, inputs)
    W-->>E: Return Result
    E->>C: Write Output to Context
    E->>E: Determine Next Nodes
    E-->>U: Execution Complete
```
The workflow engine enforces a strict contract between nodes, workers, and the engine.
| A Node MUST define | A Node MUST NOT do |
|---|---|
| Input schema | Execute business logic |
| Output schema | Mutate workflow context |
| UI configuration | Access the database directly |
| Worker reference | Call other modules |
| Validation rules | Resolve template variables |
Execution happens exclusively in workers. Nodes are metadata only.
Workers are the execution units of the workflow engine. This separation is intentional: it enables testability, strategy replacement, and future distributed execution.
```
Node Definition
      ↓
Input Resolution (Engine)
      ↓
Worker Execution
      ↓
Result Returned
      ↓
Context Updated (Engine)
      ↓
Next Nodes Triggered
```
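The lifecycle above can be sketched as a minimal engine loop. This is illustrative only: the workflow shape and function names are assumptions based on the diagrams in this document, not the real implementation:

```javascript
// Minimal engine sketch: look up the current node, run its worker with
// the engine-resolved inputs, write the result into context under the
// node's id, then follow the outgoing edge. No branching or loops here.
async function runWorkflow(workflow, workers, initialInput) {
  const context = { input: initialInput };
  let nodeId = "start";
  while (nodeId && nodeId !== "end") {
    const node = workflow.nodes.find((n) => n.id === nodeId);
    const worker = workers[node.type];
    if (worker) {
      // In the real engine, inputs are resolved from context first.
      const inputs = node.inputs || {};
      context[node.id] = await worker(node, inputs, context);
    }
    const edge = workflow.edges.find((e) => e.from === nodeId);
    nodeId = edge ? edge.to : null;
  }
  return context;
}
```

Note how the engine, not the worker, owns every context write, matching the contract described below.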
Worker contract:

```javascript
// ✅ Correct worker pattern
export async function executeQueryWorker(node, resolvedInputs, context) {
  const result = await queryService.execute(
    resolvedInputs.queryId,
    resolvedInputs.params
  );
  return { result }; // Engine writes this to context
}
```

Rules workers must follow:
- ✅ Return deterministic output
- ✅ Accept only pre-resolved inputs
- ❌ Never mutate context directly
- ❌ Never call the workflow engine
- ❌ Never resolve template variables internally
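Because workers accept only pre-resolved inputs and return plain data, they can be exercised in complete isolation, with no engine, context store, or database. A hypothetical transform worker (the name and input shape are assumptions for illustration):

```javascript
// Hypothetical transform worker: a pure function of its resolved inputs.
// It returns new data and never touches context or the engine.
async function executeTransformWorker(node, resolvedInputs) {
  const { data, field, min } = resolvedInputs;
  return { filtered: data.filter((row) => row[field] >= min) };
}

// Direct invocation in a unit test, no engine required:
// executeTransformWorker({}, { data: rows, field: "revenue", min: 50 })
```

This is the testability payoff of the contract: any worker can be called like an ordinary async function.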
Workflow execution is driven by a context-propagation model. Each node reads from context and writes results back through the engine.
Context structure:

```javascript
context = {
  input: { customerId: 15 },
  node_query_1: { result: [...] },
  node_filter_2: { filtered: [...] }
}
```

Input types supported:
| Type | Example |
|---|---|
| Literal | `"value": 25` |
| Template | `"value": "{{node_query_1.result}}"` |
Resolution flow:

```
Detect Input Type
      ↓
If Literal → Return Value Immediately
      ↓
If Template → Parse Reference
      ↓
Resolve from Context
      ↓
Return Evaluated Value to Worker
```
Resolution always happens before worker execution, never inside workers.
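A minimal resolver matching this flow might look like the following. The `{{path.to.value}}` syntax is taken from the examples in this document; the function name and exact matching rules are assumptions:

```javascript
// Resolve a single input: literals pass through unchanged, while
// "{{a.b.c}}" templates are looked up in the execution context
// before the worker ever runs.
function resolveInput(value, context) {
  if (typeof value !== "string") return value; // non-string literal
  const match = value.match(/^\{\{\s*([\w.]+)\s*\}\}$/);
  if (!match) return value; // plain string literal
  // Walk the dotted path through the context object.
  return match[1]
    .split(".")
    .reduce((obj, key) => (obj == null ? undefined : obj[key]), context);
}
```

The engine would apply this to every entry of a node's `inputs` before handing the resolved values to the worker.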
Extensibility is achieved through package-driven plugins. The core system never depends on plugin implementations. Plugins depend on core contracts.
```
packages/
├── datasources-logic/
│   ├── postgres/
│   ├── mysql/
│   ├── mongodb/
│   └── ...
├── workflow-nodes/
│   ├── queryNode/
│   ├── transformNode/
│   └── conditionNode/
└── widgets/
    ├── barChart/
    ├── table/
    └── ...
```
Every connector must implement the standard contract:
```javascript
export class PostgresConnector {
  async connect(config) { /* ... */ }
  async disconnect() { /* ... */ }
  async execute(query, params) { /* normalized result */ }
  async validate(config) { /* boolean */ }
  async healthCheck() { /* status */ }
}
```

Rules:
- ✅ Return normalized results only
- ✅ Handle connection errors internally
- ❌ No workflow logic inside connectors
- ❌ No cross-connector dependencies
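To show the contract end to end, here is a toy in-memory connector satisfying the same interface. It is purely illustrative (real connectors live in `packages/datasources-logic/` and the normalized `{ rows }` shape here is an assumption):

```javascript
// Toy connector implementing the standard contract against an
// in-memory array instead of a real database.
class MemoryConnector {
  constructor() {
    this.connected = false;
    this.rows = [];
  }
  async connect(config) {
    this.rows = config.seed || [];
    this.connected = true;
  }
  async disconnect() {
    this.connected = false;
  }
  async execute(query) {
    // Connection errors are the connector's problem, not the caller's.
    if (!this.connected) throw new Error("not connected");
    // Always return the same normalized result shape.
    return { rows: this.rows.filter(query) };
  }
  async validate(config) {
    return Array.isArray(config.seed);
  }
  async healthCheck() {
    return { status: this.connected ? "healthy" : "disconnected" };
  }
}
```

Because the engine only ever sees the contract, swapping this for `PostgresConnector` requires no workflow changes.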
Node definition example:

```javascript
export const QueryNode = {
  type: "dataQuery",
  name: "Execute Query",
  inputs: {
    queryId: { type: "string", required: true },
    params: { type: "object" }
  },
  outputs: {
    result: "array"
  },
  worker: executeQueryWorker,
  uiConfig: { /* form schema */ }
};
```

Widget definition example:

```jsx
export const BarChartWidget = {
  type: "barChart",
  configSchema: { /* JSON Schema */ },
  render(data, config) {
    return <BarChart data={data} options={config} />;
  }
};
```

Example workflow definition:

```json
{
  "id": "sales-report-workflow",
  "name": "Monthly Sales Report",
  "nodes": [
    { "id": "start", "type": "start" },
    { "id": "query_1", "type": "dataQuery", "inputs": { "queryId": "fetchOrders" } },
    { "id": "filter_2", "type": "transform", "inputs": { "data": "{{query_1.result}}", "filter": "last30days" } },
    { "id": "agg_3", "type": "aggregate", "inputs": { "data": "{{filter_2.filtered}}", "field": "revenue" } },
    { "id": "chart_4", "type": "generateChart", "inputs": { "data": "{{agg_3.aggregated}}", "type": "bar" } },
    { "id": "end", "type": "end" }
  ],
  "edges": [
    { "from": "start", "to": "query_1" },
    { "from": "query_1", "to": "filter_2" },
    { "from": "filter_2", "to": "agg_3" },
    { "from": "agg_3", "to": "chart_4" },
    { "from": "chart_4", "to": "end" }
  ]
}
```

| Category | Convention | Example |
|---|---|---|
| Functions | camelCase | executeWorkflow() |
| Classes | PascalCase | WorkflowEngine |
| Constants | UPPER_SNAKE | MAX_RETRY_COUNT |
| Files | kebab-case | workflow-engine.js |
| DB columns | snake_case (Prisma mapped) | created_at |
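For the snake_case database convention, Prisma's `@map` attribute keeps fields camelCase in code while columns stay snake_case. A minimal schema fragment (the model name and fields are illustrative, not Jet Admin's actual schema):

```prisma
model Workflow {
  id        Int      @id @default(autoincrement())
  createdAt DateTime @default(now()) @map("created_at")

  @@map("workflows")
}
```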
```javascript
// ✅ Good: single responsibility, deterministic
async function resolveNodeInputs(node, context) {
  return node.inputs.map(input => resolveInput(input, context));
}

// ❌ Bad: mixed responsibilities
async function resolveAndExecuteAndStore(node, context) {
  const inputs = resolve(node, context); // resolution
  const result = await execute(inputs);  // execution
  await db.save(result);                 // storage
}
```

```javascript
// ✅ Correct: errors propagate with context
try {
  await executeWorker(node, resolvedInputs);
} catch (error) {
  logger.error({ nodeId: node.id, executionId, error });
  throw new WorkflowExecutionError(node.id, executionId, error.message);
}

// ❌ Wrong: swallowed error, silent failure
try {
  await executeWorker(node, resolvedInputs);
} catch (e) {
  return null;
}
```

Rule: Controller → Service → Repository → DB
Never: Controller → DB directly
Never: Worker → DB directly
Always: Use Prisma models only
Always: Wrap multi-step operations in transactions
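The layering rule can be sketched with plain classes, where each layer only talks to the one below it. This is illustrative only; in the real code the repository layer would use Prisma models, and these class names are assumptions:

```javascript
// Repository: the only layer allowed to touch storage.
class WorkflowRepository {
  constructor(db) {
    this.db = db; // stands in for Prisma models
  }
  async findById(id) {
    return this.db.get(id) || null;
  }
}

// Service: business rules; talks only to the repository.
class WorkflowService {
  constructor(repository) {
    this.repository = repository;
  }
  async getWorkflow(id) {
    const workflow = await this.repository.findById(id);
    if (!workflow) throw new Error(`Workflow ${id} not found`);
    return workflow;
  }
}

// Controller: HTTP shape only; no business logic, no DB access.
class WorkflowController {
  constructor(service) {
    this.service = service;
  }
  async get(req) {
    return this.service.getWorkflow(req.params.id);
  }
}
```

Because each layer receives the next one via its constructor, any layer can be swapped for a fake in tests, the same isolation argument made for workers above.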
Frontend is organized into three strict layers that must never be mixed:
```
src/
├── data/          # API calls, models, transformers
├── logic/         # Hooks, state, contexts
└── presentation/  # UI components only (no logic)
```
```jsx
// ✅ Correct: component uses hook, hook uses service
const WorkflowList = () => {
  const { workflows } = useWorkflows(); // logic layer
  return workflows.map(w => <WorkflowCard key={w.id} w={w} />);
};

// ❌ Wrong: component calls API directly
const WorkflowList = () => {
  const [workflows, setWorkflows] = useState([]);
  useEffect(() => { axios.get('/api/workflows').then(setWorkflows); }, []);
};
```

| Anti-Pattern | Why it's wrong |
|---|---|
| Business logic in controllers | Breaks testability and reuse |
| Workers mutating context | Breaks execution determinism |
| Nodes executing DB logic | Breaks separation of concerns |
| Template resolution inside workers | Engine responsibility, not worker |
| Cross-module direct calls | Creates hidden coupling |
| Hardcoded datasource logic | Breaks plugin isolation |
| Swallowing errors silently | Hides failures during debugging |
Jet Admin enforces these design decisions deliberately:
| Decision | Reason |
|---|---|
| Workers execute, nodes define | Enables strategy replacement and testability |
| Context is read-only for workers | Prevents hidden mutations and execution chaos |
| Inputs resolved before execution | Ensures predictable, deterministic worker behavior |
| Plugins are packages, not inline code | Enables independent versioning and release |
| Module isolation enforced | Prevents cascading failures and coupling |
These are not opinions; they are guarantees the system depends on.
The architecture is designed to support distributed execution in future iterations:
Current: Synchronous Worker Execution
Next: Queue-based worker dispatch (RabbitMQ ready)
Future: Distributed workflow runners + execution snapshots
Why worker purity matters: stateless workers can be picked up by any runner (local, queued, or distributed) without code changes.
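Statelessness is what makes transport-agnostic dispatch possible: the same worker runs whether it is called in-process or reconstructed from a queue message. The sketch below simulates a queued runner by round-tripping the job through JSON; the message shape is an assumption, not Jet Admin's actual queue protocol:

```javascript
// A pure worker: depends only on its arguments, so any runner can host it.
async function sumWorker(node, resolvedInputs) {
  return { total: resolvedInputs.values.reduce((a, b) => a + b, 0) };
}

// A queued runner would receive exactly this serialized payload
// (e.g. from RabbitMQ) and dispatch it to the registered worker.
async function dispatchFromMessage(rawMessage, workers) {
  const { nodeType, node, inputs } = JSON.parse(rawMessage);
  return workers[nodeType](node, inputs);
}
```

Since the worker never reads ambient state, the JSON round-trip loses nothing, which is precisely why the engine can later move from synchronous execution to queue-based dispatch without rewriting workers.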
- Multi-tenancy: All queries scoped to tenant context
- RBAC: Role-based permissions enforced at service layer
- Credential storage: Datasource secrets encrypted at rest
- Query validation: All user-supplied queries validated before execution
- Execution sandboxing: Workers run in isolated execution contexts
- Node.js 18+
- PostgreSQL 14+
- Docker (recommended)
- Firebase project (for auth)
```shell
git clone https://github.com/Jet-labs/jet-admin.git
cd jet-admin
cp .env.docker.example .env.docker
docker-compose -f docker-compose.cloud.yml up -d
```

Access: http://localhost:3000
Backend:

```shell
cd apps/backend
npm install
cp .env.example .env
npx prisma migrate dev
npm run dev
```

Frontend:

```shell
cd apps/frontend
npm install
npm run dev
```

Backend .env:

```shell
DATABASE_URL=postgresql://user:pass@localhost:5432/jetadmin
JWT_SECRET=your_jwt_secret
FIREBASE_PROJECT_ID=your_project_id
RABBITMQ_URL=amqp://localhost
API_PORT=4000
```

Frontend .env:

```shell
VITE_API_URL=http://localhost:4000
VITE_FIREBASE_CONFIG={"apiKey":"..."}
```

Branch naming:

```
feature/workflow-loop-node
fix/context-resolution-bug
docs/plugin-development-guide
refactor/worker-execution-pattern
```
Commit messages:

```
feat: add loop execution node
fix: resolve context variable mutation issue
docs: add datasource connector guide
refactor: extract worker strategy pattern
test: add workflow engine unit tests
```
- No business logic in controllers
- Workers don't mutate context
- Modules remain isolated
- Error handling follows established pattern
- No console.log statements
- No unused imports
- Code follows naming conventions
- Tests added for new features
1. Fork the repository
2. Create a feature branch
3. Implement changes following coding standards
4. Write/update tests
5. Open a PR with description of changes
6. Address review feedback
| Document | Description |
|---|---|
| Workflow Engine | Execution lifecycle, context model, design decisions |
| Node Spec | Node definition contract, worker patterns |
| Plugin Development | Building datasource connectors and widget plugins |
| Execution Context | Context model, template resolution, immutability rules |
| Worker Design | Worker architecture, strategy patterns, scaling |
| Contribution Rules | Coding standards, PR process, anti-patterns |
- Workflow versioning and rollback
- Visual execution debugger
- Distributed worker execution
- AI-powered workflow builder
- Plugin marketplace
- Real-time execution tracing
- Partial workflow resume
- Execution snapshots
| Layer | Technology |
|---|---|
| Frontend | React 18, Tailwind CSS, React Flow, React Query |
| Backend | Node.js, Express, Prisma ORM |
| Database | PostgreSQL |
| Auth | Firebase |
| Messaging | RabbitMQ |
| Charts | Chart.js |
| Realtime | Socket.IO |
| Infra | Docker, Linux |
MIT License. See LICENSE for details.
Built with: React · Node.js · Prisma · React Flow · Chart.js · Socket.IO · PostgreSQL
⭐ Star this repo if it helps you