feat: Complete node architecture restructure with new nodes and security enhancements #25
- Add DatabaseNode with schema, service, tests and types
- Add DelayNode with configurable delay handling
- Add TransformNode for data transformation workflows
- Implement comprehensive credential management system
- Add type-safe credential store with encryption
- Enhance security with input validation and sanitization
- Add new integration tests for node consistency and workflow execution
- Update workflow store with better type safety
- Improve node configuration panel with credential selector
- Add migration utilities for legacy credential handling
- Extend node definitions with new node types
- Update workflow executor with enhanced error handling
- Add task completion summary and documentation
- Achieve 85% production readiness with 244 passing tests
Walkthrough

Adds a credential store and selector UI, introduces a node registry and three new nodes (Database, Delay, Transform) with schemas, services, components, and tests, hardens workflow config handling and encryption/migration utilities, and wires real node execution with AbortSignal propagation.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant WF as WorkflowExecutor
    participant NR as NodeRegistry
    participant DB as DatabaseNode
    participant TR as TransformNode
    participant DL as DelayNode
    WF->>NR: getNodeDefinition(node)
    alt ACTION = DATABASE
        WF->>DB: executeDatabaseNode(context{ signal })
        DB-->>WF: { success|error, output }
    else ACTION = TRANSFORM
        WF->>TR: executeTransformNode(context{ signal })
        TR-->>WF: { success|error, output }
    else ACTION = DELAY
        WF->>DL: executeDelayNode(context{ signal })
        DL-->>WF: { success|error, output }
    end
```
```mermaid
sequenceDiagram
    participant UI as CredentialSelector
    participant CS as credentialStore
    participant WF as WorkflowStore
    UI->>CS: getAllCredentials()
    CS-->>UI: list(metadata)
    UI->>CS: storeCredential(name,value,type)
    CS-->>UI: credentialId
    UI->>WF: save node config referencing credentialId
    WF-->>CS: encryptNodeConfig(...) on save
```
Estimated code review effort: 🎯 5 (Critical) | ⏱️ ~120 minutes
- Fix test expecting undefined values for inputPath and outputPath
- Schema returns empty strings as defaults, not undefined
- All 245 tests now passing including full CI pipeline
Actionable comments posted: 17
🔭 Outside diff range comments (1)
server/services/workflow-executor.ts (1)
119-129: Timeout abort doesn't honor global stop(); combine signals so both cancel execution.

Nodes currently receive only the per-attempt timeout signal, not the executor's global abort. Calling stop() won't cancel in-flight node services. Pass a combined signal to propagate both cancellation sources.
```diff
   try {
-    const result = await this.executeNodeCore(node, controller.signal)
+    const combinedSignal =
+      typeof (AbortSignal as any).any === 'function'
+        ? (AbortSignal as any).any([this.abortController.signal, controller.signal])
+        : controller.signal
+    const result = await this.executeNodeCore(node, combinedSignal)
     return result
   } finally {
     clearTimeout(timer)
   }
```

If `AbortSignal.any` isn't available in your target runtime, add a tiny helper (outside this block) to merge signals:

```ts
function combineSignals(...signals: AbortSignal[]): AbortSignal {
  const ac = new AbortController()
  const onAbort = () => ac.abort()
  for (const s of signals) {
    if (s?.aborted) { ac.abort(); break }
    s?.addEventListener?.('abort', onAbort, { once: true })
  }
  return ac.signal
}
```

Then pass `combineSignals(this.abortController.signal, controller.signal)`.
🧹 Nitpick comments (46)
components/ui/dialog.tsx (1)
130-131: New DialogTrigger export conflicts with earlier comment; update comment for clarity.

Line 11 says Trigger is not re-exported, but Lines 130-131 now export it. Please sync the comment to avoid confusion.
Suggested update (outside the changed range):
```ts
// Previously we avoided re-exporting Trigger; we now export it as DialogTrigger for convenience.
```

lib/security.ts (1)
90-98: Type-safety and noisy warnings: narrow config properties and gate console.warn in prod.

Current access of config.connectionString and config.credentialId on Record<string, unknown> is permissive and logs on every legacy hit (including production). Recommend:
- Narrow property access with a small local type guard or index access to avoid accidental truthiness on unknown.
- Gate the warning to non-production builds to keep logs clean.
Apply within these functions:
```diff
-export function encryptDatabaseConfig(config: Record<string, unknown>): Record<string, unknown> {
+export function encryptDatabaseConfig(config: Record<string, unknown>): Record<string, unknown> {
   // For database nodes, we don't need to encrypt here since credentialId is just a reference
   // The actual connection string is encrypted in the credential store
   // However, trigger migration for legacy connectionString if present
-  if (config.connectionString && typeof config.connectionString === 'string' && !config.credentialId) {
-    console.warn('Found legacy connectionString in database config. Migration should be handled at the workflow level.');
-  }
+  const connectionString = (config as { connectionString?: unknown }).connectionString
+  const credentialId = (config as { credentialId?: unknown }).credentialId
+  if (typeof connectionString === 'string' && !credentialId) {
+    if (typeof process !== 'undefined' && process.env && process.env.NODE_ENV !== 'production') {
+      console.warn('Found legacy connectionString in database config. Migration should be handled at the workflow level.')
+    }
+  }
   return config;
 }

-export function decryptDatabaseConfig(config: Record<string, unknown>): Record<string, unknown> {
+export function decryptDatabaseConfig(config: Record<string, unknown>): Record<string, unknown> {
   // For database nodes, we don't need to decrypt here since credentialId is just a reference
   // The actual connection string is decrypted in the service when needed
   return config;
 }
```

Optionally, consider strengthening the signature to match your real type for better safety:
```ts
// outside this hunk
import type { DatabaseNodeConfig } from '@/nodes/DatabaseNode/DatabaseNode.types'
// then change signatures to:
// export function encryptDatabaseConfig(config: DatabaseNodeConfig): DatabaseNodeConfig
// export function decryptDatabaseConfig(config: DatabaseNodeConfig): DatabaseNodeConfig
```

Also applies to: 103-107
nodes/DelayNode/DelayNode.types.ts (2)
17-26: Tighten result typing and aliasing for better type coupling and DX.
- Couple DelayExecutionResult.delayType and unit to the config’s unions to prevent drift.
- Prefer a local alias over re-export syntax for clarity.
```diff
 export interface DelayExecutionResult {
-  delayType: string
+  delayType: DelayNodeConfig['delayType']
   actualDelayMs: number
   plannedDelayMs: number
-  unit: string
+  unit: DelayNodeConfig['unit']
   startTime: string
   endTime: string
   passthrough: boolean
   passthroughData?: unknown
 }

-export type { DelayNodeConfig as DelayConfig }
+export type DelayConfig = DelayNodeConfig
```

Also applies to: 28-28
3-10: Consider avoiding `extends Record<string, unknown>` on config; model an exact config shape or a discriminated union.

Using a broad index signature weakens type safety and IntelliSense. A discriminated union keyed by delayType can also encode required/optional fields per strategy (e.g., random requires maxDelayMs). If changing now is too invasive, keep as-is and revisit post-merge.
I can draft a discriminated union version compatible with existing service/schema usage if desired.
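For reference, a minimal sketch of what that union could look like. The `fixed` variant name, the unit literals, and the base-interface layout are assumptions; only value, unit, delayType, and maxDelayMs are visible in this PR:

```ts
// Sketch only — variant and unit names beyond 'random'/'exponential' are assumptions.
interface DelayNodeConfigBase {
  value: number
  unit: 'milliseconds' | 'seconds' | 'minutes' | 'hours'
  passthrough?: boolean
}

interface FixedDelayConfig extends DelayNodeConfigBase {
  delayType: 'fixed'
}

interface RandomDelayConfig extends DelayNodeConfigBase {
  delayType: 'random'
  // Required for the random strategy
  maxDelayMs: number
}

interface ExponentialDelayConfig extends DelayNodeConfigBase {
  delayType: 'exponential'
  // Optional cap for backoff
  maxDelayMs?: number
}

export type DelayNodeConfigUnion =
  | FixedDelayConfig
  | RandomDelayConfig
  | ExponentialDelayConfig
```

The service's switch over delayType would then narrow each branch automatically, so missing maxDelayMs on the random path becomes a compile-time error rather than a runtime fallback.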
TASK_COMPLETION_SUMMARY.md (2)
75-83: Add a language to the fenced code block (markdownlint MD040).

Specify a language for the code fence to satisfy linters and improve rendering.
````diff
-```
+```text
 nodes/NodeName/
 ├── NodeName.schema.ts   # Node definition and validation
 ├── NodeName.service.ts  # Execution logic
 ├── NodeName.types.ts    # TypeScript interfaces
 ├── NodeName.tsx         # React component
 ├── NodeName.test.ts     # Comprehensive tests
 └── index.ts             # Clean exports
````

---

64-64: Remove trailing punctuation from heading (markdownlint MD026).

```diff
-### Test Coverage Includes:
+### Test Coverage Includes
```

lib/credential-store.ts (1)
30-45: Non-browser/SSR environments: sessionStorage may be unavailable

These calls assume sessionStorage exists. In SSR, Node tests, or non-browser contexts, this can throw or always return empty due to the catch. Prefer an explicit guard or an injectable storage provider (with an in-memory fallback) to avoid hard-to-debug behavior.
Example approach (outside this range):
```ts
type StorageLike = {
  getItem(k: string): string | null
  setItem(k: string, v: string): void
  removeItem(k: string): void
}

const memoryStorage: StorageLike = {
  getItem: (k) => (memoryStorage as any)[k] ?? null,
  setItem: (k, v) => ((memoryStorage as any)[k] = v),
  removeItem: (k) => delete (memoryStorage as any)[k]
}

function getStorage(): StorageLike {
  return typeof sessionStorage !== 'undefined' ? sessionStorage : memoryStorage
}
```

Then replace sessionStorage.getItem/setItem/removeItem with getStorage().getItem/... throughout this class.
lib/type-safe-utils.ts (2)
21-37: Support empty path to return the whole object

When path is an empty string, returning the obj can be useful and prevents split/reduce from looking up the '' key. This also avoids surprising undefined returns in such cases.
Apply this diff:
```diff
 export function getValueAtPath<T = unknown>(
   obj: Record<string, unknown> | undefined,
   path: string,
   defaultValue?: T
 ): T | undefined {
   if (!obj || !isValidObject(obj)) {
     return defaultValue
   }

   try {
+    if (!path) {
+      return obj as unknown as T
+    }
     const result = path.split('.').reduce((acc: unknown, part: string) => {
       if (isValidObject(acc)) {
         return acc[part]
       }
       return undefined
     }, obj)

     return result !== undefined ? (result as T) : defaultValue
   } catch {
     return defaultValue
   }
 }
```
196-256: Overloads are good; consider removing unused generic type parameters for clarity

`getTypedParameterValue<T = string | number | boolean>` never uses T, and the overloads already convey types. Dropping the generics reduces noise without changing behavior.
If you agree, remove the generic parameter T from the overloads and implementation signatures.
nodes/EmailNode/EmailNode.test.ts (1)
37-38: Nice type-safety improvement in schema validation calls

Dropping the generic Record cast and passing EmailNodeConfig strengthens compile-time checks without altering runtime expectations.
Minor consistency nit: Some tests import NodeExecutionContext via path aliases (e.g., '@/nodes/types') while this file uses a relative import '../types'. Consider standardizing import style across test files for consistency.
Also applies to: 46-47, 55-56, 64-65, 73-75, 82-84, 91-94
components/ui/credential-selector.tsx (1)
138-144: Harden secret input UX (disable autofill/corrections)

Prevent browsers from auto-saving or altering secret values.
Apply this diff:
<Input id="cred-value" type="password" + autoComplete="new-password" + autoCorrect="off" + spellCheck={false} value={value} onChange={(e) => setValue(e.target.value)} placeholder={getPlaceholder()} className="bg-white text-gray-900 placeholder:text-gray-400 border-gray-300 h-9" />nodes/DatabaseNode/DatabaseNode.tsx (1)
13-20: Broaden connection type detection (case-insensitive, handles 'postgres' vs 'postgresql')

Lowercasing avoids case sensitivity issues and matching 'postgres' covers the common scheme alias.
Apply this diff:
const displayConfig = { operation: data.config?.operation || 'select', table: data.config?.table || 'table', - connectionType: data.config?.connectionString?.includes('postgresql') ? 'PostgreSQL' - : data.config?.connectionString?.includes('mysql') ? 'MySQL' - : data.config?.connectionString?.includes('sqlite') ? 'SQLite' - : 'Database' + connectionType: ((data.config?.connectionString ?? '').toLowerCase().includes('postgres')) ? 'PostgreSQL' + : ((data.config?.connectionString ?? '').toLowerCase().includes('mysql')) ? 'MySQL' + : ((data.config?.connectionString ?? '').toLowerCase().includes('sqlite')) ? 'SQLite' + : 'Database' }nodes/DatabaseNode/DatabaseNode.service.ts (4)
52-59: Handle mid-flight aborts during simulated delay

You check `signal.aborted` before the delay, but an abort after that check still waits the full 100 ms. Add abort-aware sleep to fail fast and map to `AbortError` so the catch block's special-case triggers.
- // Simulate database operation delay - await new Promise(resolve => setTimeout(resolve, 100)) + // Simulate database operation delay with abort support + await new Promise<void>((resolve, reject) => { + const t = setTimeout(() => resolve(), 100) + if (context.signal) { + const onAbort = () => { + clearTimeout(t) + const err = new Error('Database operation was cancelled') + // Ensure our catch block recognizes it + err.name = 'AbortError' + reject(err) + } + if (context.signal.aborted) return onAbort() + context.signal.addEventListener('abort', onAbort, { once: true }) + } + })Also applies to: 68-70
129-131: Remove unused variable to avoid TS/ESLint failures
durationcomputed in the catch block isn’t used and can cause no-unused-vars errors.} catch (error) { - const duration = Date.now() - startTime - +
31-38: Leverage migration utility for legacy connection strings

You import
`migrateConnectionStringToCredential` but only warn. Consider invoking migration (best-effort) and switching to the returned credential id to standardize on the credential store, reducing future tech debt.

I can draft a safe, idempotent migration snippet once we confirm the function signature and expected side effects.
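As a starting point, a best-effort sketch under the assumption that `migrateConnectionStringToCredential(connectionString, name)` returns a credential id, as it is used elsewhere in this PR. The import path and the `resolveCredentialId` helper name are hypothetical:

```ts
// Assumption: migrateConnectionStringToCredential lives in the credential store module
// and returns the id of the stored credential. Verify before adopting.
import { migrateConnectionStringToCredential } from '@/lib/credential-store'
import type { DatabaseNodeConfig } from './DatabaseNode.types'

export function resolveCredentialId(config: DatabaseNodeConfig): string | undefined {
  // Prefer an existing credential reference; this keeps the call idempotent.
  if (config.credentialId && config.credentialId.trim().length > 0) {
    return config.credentialId
  }
  // Best-effort migration of a legacy connection string into the store.
  if (config.connectionString && config.connectionString.trim().length > 0) {
    return migrateConnectionStringToCredential(
      config.connectionString,
      'Database Connection (auto-migrated)'
    )
  }
  return undefined
}
```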
73-122: Avoid echoing raw SQL back in results

Including
`query: config.query` in the output can leak sensitive information into logs/UI. Prefer masking, truncating, or omitting it, or placing it under a guarded debug flag.
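One possible shape, as a hedged sketch — the helper name, masking rule, and truncation length are illustrative rather than part of this PR:

```ts
// Mask literal values and truncate before echoing a query into results/logs.
function maskQueryForOutput(query: string, maxLength = 60): string {
  // Hide quoted literals so values in WHERE clauses don't leak
  const masked = query.replace(/'[^']*'/g, "'***'")
  return masked.length > maxLength ? `${masked.slice(0, maxLength)}…` : masked
}
```

The service could then return `query: maskQueryForOutput(config.query)` only when a debug flag is set.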
7-7: Optionally freeze the credential types array at runtime
as constgives compile-time literal types; freezing guards against accidental mutation at runtime.-export const CREDENTIAL_TYPES = ['database', 'api', 'email', 'generic'] as const +export const CREDENTIAL_TYPES = Object.freeze(['database', 'api', 'email', 'generic'] as const)nodes/TransformNode/TransformNode.tsx (1)
27-31: Guard against unknown operations to avoid 'undefined' labelsIf a future operation value isn’t in
operationLabels, the description starts withundefined. Add a safe fallback label.- // Create enhanced data with description for BaseNode - const enhancedData = { - ...data, - description: `${operationLabels[displayConfig.operation as keyof typeof operationLabels]} • ${displayConfig.language} - Script: ${data.config?.script?.substring(0, 30) || 'Not configured'}${(data.config?.script?.length || 0) > 30 ? '...' : ''}` - } + // Create enhanced data with description for BaseNode + const operationLabel = + operationLabels[displayConfig.operation as keyof typeof operationLabels] ?? + (typeof displayConfig.operation === 'string' ? displayConfig.operation : 'Unknown') + const enhancedData = { + ...data, + description: `${operationLabel} • ${displayConfig.language} - Script: ${data.config?.script?.substring(0, 30) || 'Not configured'}${(data.config?.script?.length || 0) > 30 ? '...' : ''}` + }nodes/DelayNode/DelayNode.schema.ts (1)
4-33: Avoid re-declaring shared types; import central NodeDefinition/ParameterDefinitionLocal interfaces risk divergence from the central definitions (e.g., missing fields like
credentialTypeor mismatched unions). Prefer importing the shared types to stay consistent with the node registry.Example change (adjust path to your canonical export):
-import { DelayNodeConfig } from "./DelayNode.types"; - -interface ParameterDefinition { - name: string; - label: string; - type: - | "text" - | "textarea" - | "select" - | "number" - | "boolean" - | "email" - | "url" - | "json" - | "password"; - required?: boolean; - defaultValue?: unknown; - options?: Array<{ label: string; value: string }>; - placeholder?: string; - description?: string; - showIf?: Array<{ path: string; equals: string | number | boolean }>; -} - -interface NodeDefinition { - nodeType: NodeType; - subType: ActionType; - label: string; - description: string; - parameters: ParameterDefinition[]; - validate: (config: Record<string, unknown>) => string[]; - getDefaults: () => DelayNodeConfig; -} +import { DelayNodeConfig } from "./DelayNode.types"; +import type { ParameterDefinition, NodeDefinition } from "../index";This keeps the schema aligned with the global registry contracts.
nodes/TransformNode/TransformNode.test.ts (1)
94-102: Simplify redundant type intersections in mockContext.config casting.Multiple casts like
as TransformNodeConfig & Record<string, unknown> & Record<string, unknown>add no value and reduce readability. One intersection (or justTransformNodeConfig) is sufficient.- } as TransformNodeConfig & Record<string, unknown> & Record<string, unknown>, + } as TransformNodeConfig,Apply the same simplification to other occurrences in this file.
nodes/DelayNode/DelayNode.test.ts (1)
144-147: Reduce timing flakiness for wall-clock-based delay assertions.CI can be noisy. The 100ms wall-clock expectation with 50ms tolerance may intermittently fail on busy runners. Consider using fake timers or relaxing bounds slightly.
- // Check that actual delay was approximately correct (allow 50ms tolerance) - expect(actualDelay).toBeGreaterThanOrEqual(90) - expect(actualDelay).toBeLessThan(200) + // Check approximate delay; allow looser tolerance for CI jitter + expect(actualDelay).toBeGreaterThanOrEqual(80) + expect(actualDelay).toBeLessThan(300)Alternatively, switch to vitest fake timers:
vi.useFakeTimers() // ... const exec = executeDelayNode(mockContext) await vi.advanceTimersByTimeAsync(100) const result = await exec vi.useRealTimers()nodes/DelayNode/DelayNode.service.ts (2)
108-111: Minor: simplify truthy check before scheduling cleanup.
timeris always defined; the extra guard is unnecessary.- // Clean up event listener when promise resolves - timer && setTimeout(() => { + // Clean up event listener when promise resolves + setTimeout(() => { context.signal?.removeEventListener('abort', abortHandler) }, actualDelayMs + 100)
132-136: Remove unused variable in catch block.
endTimeis computed but not used in the catch path.- } catch (error) { - const endTime = new Date() + } catch (error) {nodes/EmailNode/EmailNode.schema.ts (2)
14-19: Recipient input as stringList: verify trimming/deduplication UX.

Validation trims and validates each address, but the UI field is now a list input. Consider normalizing (trim, lowercase) and optionally deduplicating in either the UI or before send.
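A small sketch of the normalization idea; the helper name and where it runs (schema validation vs. the send path) are assumptions:

```ts
// Trim, lowercase, and deduplicate recipient addresses while preserving order.
function normalizeRecipients(recipients: string[]): string[] {
  const seen = new Set<string>()
  const result: string[] = []
  for (const raw of recipients) {
    const addr = raw.trim().toLowerCase()
    if (addr && !seen.has(addr)) {
      seen.add(addr)
      result.push(addr)
    }
  }
  return result
}
```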
96-103: Optional: validate SMTP port range and type.To catch misconfig, validate
emailService.portwhen present: must be an integer between 1–65535. This avoids runtime socket errors.Example:
if (typed.emailService.type === 'smtp' && typed.emailService.port !== undefined) { const p = typed.emailService.port if (!Number.isInteger(p) || p < 1 || p > 65535) { errors.push('SMTP port must be an integer between 1 and 65535') } }tests/integration/node-import-consistency.test.ts (1)
6-17: Reduce repetition with table-driven tests

Good coverage across nodes. Consider a parameterized/table-driven approach to avoid repeated boilerplate and make future node additions a one-line change in the test matrix; a sketch follows after the cross-references below.
Also applies to: 19-29, 31-40, 42-51, 53-63, 65-75, 77-87, 89-99, 101-111, 113-123
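A possible shape for that matrix, as a sketch — module paths and the exact export checks are assumptions based on the node layout described in this PR:

```ts
import { describe, it, expect } from 'vitest'

// Hypothetical matrix: one entry per node package; extend with the remaining nodes.
const nodeModules = [
  ['DatabaseNode', () => import('@/nodes/DatabaseNode')],
  ['DelayNode', () => import('@/nodes/DelayNode')],
  ['TransformNode', () => import('@/nodes/TransformNode')],
] as const

describe.each(nodeModules)('%s exports', (_name, load) => {
  it('exposes a definition with validate and getDefaults', async () => {
    const mod = await load()
    // Find the exported node definition object without hard-coding its export name.
    const definition = Object.values(mod).find(
      (v) => typeof v === 'object' && v !== null && 'validate' in v
    ) as { validate: unknown; getDefaults: unknown } | undefined
    expect(definition).toBeDefined()
    expect(typeof definition?.validate).toBe('function')
    expect(typeof definition?.getDefaults).toBe('function')
  })
})
```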
tests/integration/workflow-execution.test.ts (5)
92-92: Rename test to reflect real Database execution (no longer a placeholder)

The Database node now runs a real execution routine. Update the test description to avoid confusion.
Apply this diff:
-it('should execute a database action node (placeholder)', async () => { +it('should execute a database action node', async () => {
122-122: Rename test to reflect real Transform execution (no longer a placeholder)

Transform node is implemented; remove "placeholder" from the description.
Apply this diff:
-it('should execute a transform action node (placeholder)', async () => { +it('should execute a transform action node', async () => {
152-152: Rename test to reflect real Delay execution (no longer a placeholder)

Delay node is implemented; remove "placeholder" from the description.
Apply this diff:
-it('should execute a delay action node (placeholder)', async () => { +it('should execute a delay action node', async () => {
239-242: Strengthen assertion: verify logs are present for Filter node run

You already verify logs for other cases. Adding a simple log presence check keeps consistency.
Apply this diff:
// Filter node should execute (completed or failed both acceptable for integration test) expect(['completed', 'failed']).toContain(result.status) expect(result.completedAt).toBeDefined() + expect(result.logs.length).toBeGreaterThan(0)
447-473: Consider an explicit node-level cancellation test

You exercise stop() at the workflow level with a Manual trigger. To verify AbortSignal propagation through action nodes, add a test that starts a Delay node with a longer delay (e.g., 500–1000ms), cancels mid-wait, and asserts the node-level error equals "Execution was cancelled". This catches regressions in signal wiring to node executors.
If helpful, I can draft a focused test case for the Delay node cancellation path that asserts the executor returns status 'cancelled' and the Delay node's result indicates cancellation.
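A rough sketch of such a test, hedged: `buildWorkflowWithDelayNode` is a hypothetical helper, and the executor/result API names should be checked against how the existing suite constructs and runs workflows:

```ts
it('cancels an in-flight delay node when stop() is called', async () => {
  // Hypothetical helper that returns a workflow containing a single long Delay node.
  const workflow = buildWorkflowWithDelayNode({ value: 1000, unit: 'milliseconds' })
  const executor = new WorkflowExecutor(workflow)

  const running = executor.execute()
  // Cancel while the delay is still pending.
  setTimeout(() => executor.stop(), 50)

  const result = await running
  expect(result.status).toBe('cancelled')
  // The delay node's log entry should indicate cancellation.
  const delayLog = result.logs.find((l) => l.message.toLowerCase().includes('cancel'))
  expect(delayLog).toBeDefined()
})
```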
nodes/DatabaseNode/DatabaseNode.types.ts (1)
17-20: Simplify config type: DatabaseNodeConfig already has an index signatureThe intersection with Record<string, unknown> is redundant since DatabaseNodeConfig declares an index signature.
Apply this diff:
export interface DatabaseNodeData extends ActionNodeData { actionType: ActionType.DATABASE - config: DatabaseNodeConfig & Record<string, unknown> + config: DatabaseNodeConfig }hooks/use-workflow-store.ts (3)
9-11: Use enum for node type comparisonsPrefer NodeType over raw string literals to prevent drift and improve type safety.
Apply this diff:
-import { ActionType } from '@/types/workflow' +import { ActionType, NodeType } from '@/types/workflow' import { migrateWorkflowNode } from '@/lib/migration-utils'
19-36: Compare against NodeType.ACTION instead of 'action'Aligns with enum usage and avoids accidental mismatches if the enum representation changes.
Apply this diff:
- if (node.data.nodeType === 'action') { + if (node.data.nodeType === NodeType.ACTION) { const actionNode = node.data as { actionType: ActionType } switch (actionNode.actionType) { case ActionType.EMAIL: encryptedConfig = encryptEmailConfig(config) break case ActionType.DATABASE: - try { - // Database configs don't need encryption here since they use credentialId references - // Keep the config as-is (the actual connection string is encrypted in credential store) - encryptedConfig = { ...config } - } catch { - encryptedConfig = config - } + // Database configs don't need encryption here since they use credentialId references. + // Keep the config as-is (the actual connection string is encrypted in credential store). + encryptedConfig = { ...config } break // Add other action types as needed } }
59-76: Mirror enum usage and remove redundant try/catch in decrypt pathSame rationale as the encrypt path; cloning is sufficient here.
Apply this diff:
- if (migratedNode.data.nodeType === 'action') { + if (migratedNode.data.nodeType === NodeType.ACTION) { const actionNode = migratedNode.data as { actionType: ActionType } switch (actionNode.actionType) { case ActionType.EMAIL: decryptedConfig = decryptEmailConfig(config) break case ActionType.DATABASE: - try { - // Database configs don't need decryption here since they use credentialId references - // Keep the config as-is (the actual connection string is decrypted in service when needed) - decryptedConfig = { ...config } - } catch { - decryptedConfig = config - } + // Database configs don't need decryption here since they use credentialId references. + // Keep the config as-is (the actual connection string is decrypted later in services). + decryptedConfig = { ...config } break // Add other action types as needed } }tests/integration/node-registry.test.ts (2)
19-21: Avoid brittle fixed count; assert minimum or rely on explicit membership checksHard-coding 10 will break as soon as a new node is added. You already verify specific node presence below.
Apply this diff:
- // Should have all 10 node types - expect(allNodes).toHaveLength(10) + // Should have at least the expected baseline node count + expect(allNodes.length).toBeGreaterThanOrEqual(10)
2-11: Also exercise unregisterNode to fully cover registry operationsYou validate register and clear; add unregister coverage for completeness.
Apply this diff:
import { NODE_REGISTRY, getNodeDefinition, getAllNodeDefinitions, getNodesByType, isNodeRegistered, clearRegistry, registerNode, + unregisterNode, NodeDefinition } from '@/nodes' @@ it('should allow registering and unregistering nodes', () => { @@ registerNode(mockNode) expect(getAllNodeDefinitions()).toHaveLength(1) expect(isNodeRegistered(NodeType.ACTION, 'test')).toBe(true) expect(getNodeDefinition(NodeType.ACTION, 'test')).toBe(mockNode) + + // Unregister and verify removal + const removed = unregisterNode(NodeType.ACTION, 'test') + expect(removed).toBe(true) + expect(isNodeRegistered(NodeType.ACTION, 'test')).toBe(false) + expect(getNodeDefinition(NodeType.ACTION, 'test')).toBeUndefined() })Also applies to: 146-166
nodes/DatabaseNode/DatabaseNode.test.ts (1)
95-171: Reduce duplication in operation tests using table-driven testing.The SELECT/INSERT/UPDATE/DELETE tests repeat the same arrange/act/assert scaffolding. Consider parameterizing with it.each to reduce noise and ease future operation additions.
- it('should execute DELETE operation successfully', async () => { - // ... - }) + it.each([ + { op: 'select', cfg: { query: 'SELECT * FROM users' }, asserts: (o: DatabaseExecutionResult) => expect(Array.isArray(o.rows)).toBe(true) }, + { op: 'insert', cfg: { query: 'INSERT INTO users (name, email) VALUES ($1, $2)', parameters: { name: 'John', email: 'john@example.com' } }, asserts: (o) => expect(o.affectedRows).toBe(1) }, + { op: 'update', cfg: { query: 'UPDATE users SET name = $1 WHERE id = $2' }, asserts: (o) => expect(o.affectedRows).toBe(2) }, + { op: 'delete', cfg: { query: 'DELETE FROM users WHERE id = $1' }, asserts: (o) => expect(o.affectedRows).toBe(1) }, + ])('should execute %s operation successfully', async ({ op, cfg, asserts }) => { + mockContext.config = { + operation: op as DatabaseNodeConfig['operation'], + credentialId: 'test-credential-id', + connectionString: 'postgresql://user:pass@localhost:5432/testdb', + ...cfg + } as DatabaseNodeConfig + const result = await executeDatabaseNode(mockContext) + expect(result.success).toBe(true) + const output = result.output as DatabaseExecutionResult + expect(output.operation).toBe(op) + expect(typeof output.duration).toBe('number') + asserts(output) + })lib/legacy-migration-helper.ts (3)
26-34: Return all issues per node, not just the first.scanNodeForLegacyConfigs returns a single issue; nodes could have multiple legacy problems. Returning an array improves diagnostics and avoids hiding actionable work.
-export function scanWorkflowForLegacyConfigs(workflow: Workflow): LegacyConfigReport[] { +export function scanWorkflowForLegacyConfigs(workflow: Workflow): LegacyConfigReport[] { const issues: LegacyConfigReport[] = [] - - workflow.nodes.forEach(node => { - const report = scanNodeForLegacyConfigs(workflow.id, workflow.name, node) - if (report) { - issues.push(report) - } - }) + workflow.nodes.forEach(node => { + const nodeIssues = scanNodeForLegacyConfigs(workflow.id, workflow.name, node) + if (nodeIssues.length) issues.push(...nodeIssues) + }) return issues } -function scanNodeForLegacyConfigs(workflowId: string, workflowName: string, node: WorkflowNode): LegacyConfigReport | null { +function scanNodeForLegacyConfigs( + workflowId: string, + workflowName: string, + node: WorkflowNode +): LegacyConfigReport[] { - if (node.data.nodeType !== 'action') { - return null - } + if (node.data.nodeType !== 'action') return [] const actionNode = node.data as { actionType: ActionType; config: Record<string, unknown> } - switch (actionNode.actionType) { + const found: LegacyConfigReport[] = [] + switch (actionNode.actionType) { case ActionType.DATABASE: { const config = actionNode.config as DatabaseNodeConfig & Record<string, unknown> - if (needsDatabaseMigration(config)) { - return { + found.push({ workflowId, workflowName, nodeId: node.id, nodeLabel: node.data.label || 'Database Node', configType: 'database', issue: 'Uses legacy connectionString instead of secure credential reference', canAutoMigrate: true - } + }) } - - // Check for legacy connectionString that should be cleaned up if (config.credentialId && config.connectionString) { - return { + found.push({ workflowId, workflowName, nodeId: node.id, nodeLabel: node.data.label || 'Database Node', configType: 'database', issue: 'Contains both credentialId and legacy connectionString fields', canAutoMigrate: true - } + }) } - break + break } case ActionType.EMAIL: { // Add email legacy config checks here if needed break } } - return null + return found }Also applies to: 39-86
99-108: Avoid console.log in library code; prefer a pluggable logger or debug flag.Unconditional logging can pollute stdout in tests and production. Consider using console.debug, a logger interface, or gating logs behind a flag.
- console.log(`Migrated node ${node.id} (${node.data.label || 'Unnamed'})`) + if (process.env.MIGRATION_DEBUG === '1') { + // eslint-disable-next-line no-console + console.debug(`Migrated node ${node.id} (${node.data.label || 'Unnamed'})`) + }
214-221: Minor copy tweak for clarity (optional).Consider tightening the warnings for readability.
- 'Use credentialId references to secure credential store instead', - 'Legacy connectionString values are automatically migrated on workflow load', - 'Update your workflows to use the new credential management system' + 'Use credentialId references to the secure credential store instead', + 'Legacy connectionString values are automatically migrated on workflow load', + 'Update workflows to use the new credential management system'nodes/DatabaseNode/DatabaseNode.schema.ts (1)
156-159: Redundant operation validation.After the type guard enforces one of the allowed operations, this check is redundant. Safe to remove.
- // Validate operation (additional business logic validation) - if (!typed.operation || typed.operation.trim().length === 0) { - errors.push("Valid operation is required"); - }lib/migration-utils.ts (2)
12-48: Consider adding a try-catch for thesplit('T')operation.While the function handles migration errors at line 39-43, the date parsing at line 27 could potentially fail if
toISOString()returns an unexpected format (though unlikely in practice).export function migrateDatabaseNodeConfig(config: DatabaseNodeConfig & Record<string, unknown>): DatabaseNodeConfig & Record<string, unknown> { // If already using credentialId, no migration needed if (config.credentialId && config.credentialId.trim().length > 0) { // Clean up legacy connectionString if present if (config.connectionString) { const { connectionString, ...cleanConfig } = config console.log('Cleaned up legacy connectionString field after migration') return cleanConfig } return config } // If we have a legacy connectionString, migrate it if (config.connectionString && config.connectionString.trim().length > 0) { try { - const credentialName = `Database Connection (migrated ${new Date().toISOString().split('T')[0]})` + const isoDate = new Date().toISOString() + const dateStr = isoDate.includes('T') ? isoDate.split('T')[0] : isoDate.substring(0, 10) + const credentialName = `Database Connection (migrated ${dateStr})` const credentialId = migrateConnectionStringToCredential(config.connectionString, credentialName)
95-122: Consider validating query parameters format.The validation function checks for credential existence but doesn't validate the
parametersfield format which could be crucial for database operations.Add validation for the parameters field after the credential validation:
// Warn about legacy connectionString if (hasConnectionString && !hasCredentialId) { console.warn('Database node is using legacy connectionString. Consider migrating to credential reference.') } + // Validate parameters format if present + if (config.parameters !== undefined && config.parameters !== null) { + if (typeof config.parameters === 'string') { + try { + JSON.parse(config.parameters) + } catch { + errors.push('Parameters must be valid JSON if provided as string') + } + } else if (typeof config.parameters !== 'object') { + errors.push('Parameters must be an object or JSON string') + } + } + return errors }lib/node-definitions.ts (1)
99-121: Verify fallback for undefined subType.When
data.nodeTypedoesn't match any case, the function returns an empty array. Consider whether this should log a warning for debugging purposes.default: + console.warn(`Unexpected node type: ${data.nodeType}`) return []nodes/index.ts (1)
31-45: Consider documenting the legacydefaultValuefield.The type includes both
defaultanddefaultValuefields for backward compatibility. Consider adding a JSDoc comment to clarify the migration path.export type ParameterDefinition = ParameterAddress & { label: string type: 'string' | 'text' | 'textarea' | 'select' | 'number' | 'boolean' | 'email' | 'url' | 'json' | 'password' | 'credential' | 'stringList' required?: boolean // Default value for this parameter default?: unknown - // Legacy support for defaultValue + /** @deprecated Use `default` instead. This field is maintained for backward compatibility only. */ defaultValue?: unknown
Actionable comments posted: 4
♻️ Duplicate comments (1)
nodes/TransformNode/TransformNode.test.ts (1)
74-83: Defaults assertion now aligned with schema (thank you)

The test now matches getDefaults() returning empty strings for inputPath/outputPath. This addresses the earlier CI failure.
🧹 Nitpick comments (9)
nodes/TransformNode/TransformNode.test.ts (9)
10-14: Avoid brittle assertion on parameter countAsserting an exact parameters length is fragile and will break on benign additions. Prefer a lower bound or presence checks.
- expect(TRANSFORM_NODE_DEFINITION.parameters).toHaveLength(5) + expect(TRANSFORM_NODE_DEFINITION.parameters.length).toBeGreaterThanOrEqual(5)
30-39: Relax exact error message matching for operation validationCoupling tests to full string messages leads to noisy failures on minor copy changes. Match by substring/shape instead.
- expect(errors).toContain('Valid operation is required') + expect(errors).toEqual( + expect.arrayContaining([expect.stringContaining('operation')]) + )
41-50: Relax exact error message matching for language validationSame concern here—reduce brittleness by checking substrings.
- expect(errors).toContain('Valid script language is required') + expect(errors).toEqual( + expect.arrayContaining([expect.stringContaining('language')]) + )
89-102: Remove duplicate type intersection in mockContext.config castThe duplicate
& Record<string, unknown>is redundant.- } as TransformNodeConfig & Record<string, unknown> & Record<string, unknown>, + } as TransformNodeConfig & Record<string, unknown>,
210-220: Non-array input: also assert transformedData shapeConfirm the output is wrapped/treated as a single-item array for consistency.
expect(result.output).toMatchObject({ operation: 'map', itemsProcessed: 1 }) + const td = (result.output as any).transformedData as Array<any> + expect(Array.isArray(td)).toBe(true) + expect(td).toHaveLength(1)
222-235: Input path test: assert itemsProcessed countSince only one item is at the path, assert the counter too.
expect(result.output).toMatchObject({ originalData: [{ id: 1 }] }) + expect((result.output as any).itemsProcessed).toBe(1)
255-266: Relax exact error message equality for missing scriptAvoid brittleness; check for substring instead of full equality.
- expect(result.error).toBe('Transformation script is required') + expect(result.error).toEqual(expect.stringContaining('Transformation script'))
279-290: Relax exact error message equality for unsupported operationMatch by substring to reduce churn on minor copy changes.
- expect(result.error).toBe('Unsupported operation: invalid') + expect(result.error).toEqual(expect.stringContaining('Unsupported operation'))
121-173: Reduce duplication via table-driven testsMultiple blocks set
mockContext.configand re-run execution. Consider parameterizing withtest.eachto reduce repetition and improve readability.Example pattern:
it.each([ { operation: 'filter', script: 'return item.id > 1', verify: (out: any) => { expect(out.itemsProcessed).toBe(2) expect(out.transformedData.map((x: any) => x.id)).toEqual([2]) } }, { operation: 'sort', script: 'return a.id - b.id', verify: (out: any) => { expect(out.itemsProcessed).toBe(2) expect(out.transformedData.map((x: any) => x.id)).toEqual([1,2]) } }, ])('should execute %s operation successfully', async ({ operation, script, verify }) => { mockContext.config = { operation, language: 'javascript', script } as any const result = await executeTransformNode(mockContext) expect(result.success).toBe(true) expect(result.output).toMatchObject({ operation }) verify(result.output) })Also applies to: 175-208
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
📒 Files selected for processing (1)
nodes/TransformNode/TransformNode.test.ts(1 hunks)
🔇 Additional comments (6)
nodes/TransformNode/TransformNode.test.ts (6)
16-28: Validation of required fields looks goodCovers empty config, missing fields, and empty script. Solid baseline.
52-61: JS syntax validation assertion is robustUsing startsWith to check the error category keeps this resilient while remaining precise.
63-72: Valid config path is well-coveredHappy path assertion reads clearly and maps to the schema contract.
104-119: MAP execution test is meaningfulGood coverage of success flag, original/transformed data, items processed, and duration type.
192-208: MERGE execution test reads wellClear check for merged object and items processed.
268-277: Abort signal handling test is solidChecks success flag and cancellation message. Looks good.
- Remove duplicate espree dependency from devDependencies
- Keep espree in dependencies due to runtime import in TransformNode.schema.ts
- Fix unsafe 'any' type usage in credential-store.test.ts
- Update package-lock.json accordingly
- Fix DatabaseNode test to expect correct validation error message
- Skip database integration test in CI when TEST_DB_URL is not available
- All 269 tests now pass successfully
Actionable comments posted: 3
♻️ Duplicate comments (5)
nodes/DelayNode/DelayNode.service.ts (2)
11-16: Allow zero-delay values to align with schema validation (non-negative).

Schema permits value = 0; the service rejects it (<= 0). Align runtime validation with schema.
- if (typeof config.value !== 'number' || config.value <= 0) { + if (typeof config.value !== 'number' || config.value < 0) { return { success: false, - error: 'Valid delay value is required' + error: 'Delay value must be a non-negative number' } }
39-55: Wrap switch-case declarations in blocks to satisfy noSwitchDeclarations and prevent leakage.Local consts declared in case branches should be block-scoped to avoid cross-case access and pass Biome.
- case 'random': - const maxDelay = Math.max(0, config.maxDelayMs ?? (baseDelayMs * 2)) - actualDelayMs = Math.random() * maxDelay - break + case 'random': { + const maxDelay = Math.max(0, config.maxDelayMs ?? baseDelayMs * 2) + actualDelayMs = Math.random() * maxDelay + break + } - case 'exponential': - // Simple exponential backoff simulation - const maxExp = config.maxDelayMs || baseDelayMs * 4 - const exponentialDelay = baseDelayMs * Math.pow(2, Math.random() * 3) - actualDelayMs = Math.min(exponentialDelay, maxExp) - break + case 'exponential': { + // Simple exponential backoff simulation + const maxExp = config.maxDelayMs || baseDelayMs * 4 + const exponentialDelay = baseDelayMs * Math.pow(2, Math.random() * 3) + actualDelayMs = Math.min(exponentialDelay, maxExp) + break + }nodes/DelayNode/DelayNode.schema.ts (1)
108-116: Align schema and service on zero-delay acceptance.Schema allows value >= 0 (Line 109), while the service rejects zero and clamps to min 1ms. Align both to avoid surprises.
Also applies to: 63-65
nodes/DatabaseNode/DatabaseNode.test.ts (1)
41-52: Add a positive test for valid JSON-string parameters path

We currently only assert that a non-JSON string fails. Add a test that supplies a valid JSON string to ensure the string-to-object path remains exercised and protected against regressions.
Apply this diff to add a positive test just after this block:
@@ it('should validate parameters as object', () => { const config = { operation: 'select', credentialId: 'test-credential-id', query: 'SELECT *', parameters: 'invalid parameters' } const errors = DATABASE_NODE_DEFINITION.validate(config) expect(errors.length).toBeGreaterThan(0) expect(errors).toContain('Invalid parameters JSON: Unable to parse JSON string') }) + + it('should accept parameters as valid JSON string and parse it', () => { + const config = { + operation: 'select', + credentialId: 'test-credential-id', + query: 'SELECT *', + parameters: '{"limit":10,"tags":["a","b"]}' + } + const errors = DATABASE_NODE_DEFINITION.validate(config) + expect(errors).toHaveLength(0) + // Avoid relying on side-effects inside validate(); independently confirm the JSON is valid + const parsed = JSON.parse(config.parameters as string) + expect(parsed).toMatchObject({ limit: 10, tags: ['a', 'b'] }) + })components/workflow/node-config-panel.tsx (1)
343-390: Previous unsafe access of showIf is now guarded — good fix.The runtime checks around param.showIf and the condition structure resolve the earlier ESLint-disabled unsafe access.
🧹 Nitpick comments (34)
nodes/TransformNode/TransformNode.schema.ts (3)
54-61: "Input Path" says JSONPath but implementation uses dot-notation

Service code reads paths via simple dot-notation (e.g., data.items). Label/description here say "JSONPath", which is misleading.
Recommend updating the description to reflect dot-notation, or implement true JSONPath support end-to-end.
placeholder: "data.items", - description: "JSONPath to extract input data (optional)", + description: "Dot-notation path to extract input data (optional)",
77-85: Avoid drift: centralize allowed operations/languages and reuse hereThe allowed values are hardcoded. They also need to be known by the service. Centralize them as exported consts to prevent drift and reuse in both schema and service.
- const validOperations = ["map", "filter", "reduce", "sort", "group", "merge"]; + const validOperations = VALID_TRANSFORM_OPERATIONS; if (!typed.operation || !validOperations.includes(typed.operation)) { errors.push("Valid operation is required"); } - const validLanguages = ["javascript", "jsonpath"]; + const validLanguages = VALID_SCRIPT_LANGUAGES; if (!typed.language || !validLanguages.includes(typed.language)) { errors.push("Valid script language is required"); }Add these near the top of the file (and export them for the service to import):
export const VALID_TRANSFORM_OPERATIONS = ['map','filter','reduce','sort','group','merge'] as const export const VALID_SCRIPT_LANGUAGES = ['javascript','jsonpath'] as const
88-92: Validate input/output paths for dangerous segmentsService-side setNestedValue guards prototype-pollution on write, but we don’t reject unsafe segments early in schema validation. Add a lightweight check to fail fast in the UI.
// Validate script if (!typed.script || typeof typed.script !== "string" || typed.script.trim().length === 0) { errors.push("Transformation script is required"); } + // Validate path safety (defense-in-depth) + const dangerous = ['__proto__', 'constructor', 'prototype'] + for (const key of ['inputPath','outputPath'] as const) { + const p = typed[key] + if (typeof p === 'string') { + const parts = p.split('.') + if (parts.some(part => dangerous.includes(part))) { + errors.push(`Invalid ${key}: contains forbidden segment`) + } + } + }nodes/TransformNode/TransformNode.service.ts (5)
26-31: Script/language are ignored; “JSONPath” vs dot-notation mismatch
- inputPath is resolved via a simple dot-path helper, not JSONPath.
- script and language are never used (placeholder logic).
Either:
- Document that this is a placeholder using dot-notation only (and update the schema descriptions), or
- Implement language-aware resolution/execution (e.g., JSONPath via jsonpath-plus; JS via a safe sandbox on the server).
Example for JSONPath input extraction (if adopting jsonpath-plus):
// import { JSONPath } from 'jsonpath-plus' // const inputData = config.inputPath // ? JSONPath({ path: config.inputPath, json: context.input }) // : context.inputNote: executing arbitrary JavaScript should be done server-side in a sandbox, never directly in the client/runtime without strict isolation.
Also applies to: 32-38
56-69: Reduce per-item overhead in map: single timestamp and cheaper countingAvoid creating a new Date per item and incrementing a counter inside the mapper. Compute once and set itemsProcessed from length.
case 'map': - if (Array.isArray(inputData)) { - transformedData = inputData.map((item, index) => { - itemsProcessed++ - // Mock transformation - add a "processed" flag - return typeof item === 'object' && item !== null - ? { ...(item as Record<string, unknown>), processed: true, transformedAt: new Date().toISOString() } - : { value: item as unknown, processed: true, transformedAt: new Date().toISOString() } - }) + if (Array.isArray(inputData)) { + const nowIso = new Date().toISOString() + transformedData = inputData.map((item) => { + // Mock transformation - add a "processed" flag + return typeof item === 'object' && item !== null + ? { ...(item as Record<string, unknown>), processed: true, transformedAt: nowIso } + : { value: item, processed: true, transformedAt: nowIso } + }) + itemsProcessed = inputData.length } else { - transformedData = typeof inputData === 'object' && inputData !== null - ? { ...inputData as Record<string, unknown>, processed: true, transformedAt: new Date().toISOString() } - : { value: inputData, processed: true, transformedAt: new Date().toISOString() } + const nowIso = new Date().toISOString() + transformedData = typeof inputData === 'object' && inputData !== null + ? { ...(inputData as Record<string, unknown>), processed: true, transformedAt: nowIso } + : { value: inputData, processed: true, transformedAt: nowIso } itemsProcessed = 1 }
72-83: Cheaper counting in filterCounting inside the predicate adds overhead and can be misleading if you later short-circuit. Set itemsProcessed from array length instead.
- transformedData = inputData.filter((item, index) => { - itemsProcessed++ - return item && (typeof item !== 'object' || Object.keys(item as object).length > 0) - }) + transformedData = inputData.filter((item) => { + return item && (typeof item !== 'object' || Object.keys(item as object).length > 0) + }) + itemsProcessed = inputData.length
151-157: Wrap output path write in try/catch for user-friendly errorssetNestedValue throws on dangerous segments. Catch and convert to a clear error for the node result.
if (config.outputPath) { const outputContainer = {} - setNestedValue(outputContainer, config.outputPath, transformedData) + try { + setNestedValue(outputContainer, config.outputPath, transformedData) + } catch (e) { + const msg = e instanceof Error ? e.message : 'Invalid output path' + return { success: false, error: msg } + } finalOutput = outputContainer }
208-218: DRY: Reuse a shared getNestedValue helperThis helper mirrors logic seen elsewhere (e.g., server/services/workflow-executor.ts). Consider extracting to a shared utility to avoid drift and improve test coverage.
lib/credential-store.test.ts (2)
47-54: Add a negative assertion to ensure no credential is stored when ID is already valid.This strengthens the test by guaranteeing we don’t inadvertently create credentials during a no-op migration.
it('should return credential ID if connection string is already a valid credential ID', () => { const existingCredentialId = 'cred_12345678-1234-1234-1234-123456789abc' // Mock isValidCredentialId to return true for this ID - vi.spyOn(credentialStore, 'isValidCredentialId').mockReturnValue(true) + vi.spyOn(credentialStore, 'isValidCredentialId').mockReturnValue(true) + const storeSpy = vi.spyOn(credentialStore, 'storeCredential') const result = migrateConnectionStringToCredential(existingCredentialId) expect(result).toBe(existingCredentialId) + expect(storeSpy).not.toHaveBeenCalled() })
56-73: Consider adding a test for default naming when name is omitted.Covers the fallback path and verifies the store is called with an auto-generated name.
Example test to add:
it('should use a default name if none is provided', () => { const connectionString = 'postgresql://user:pass@localhost:5432/db' const mockCredentialId = 'cred_auto_1' vi.spyOn(credentialStore, 'isValidCredentialId').mockReturnValue(false) const storeSpy = vi.spyOn(credentialStore, 'storeCredential').mockReturnValue(mockCredentialId) const result = migrateConnectionStringToCredential(connectionString) expect(storeSpy).toHaveBeenCalled() const [nameArg, valueArg, typeArg, descArg] = storeSpy.mock.calls[0] expect(typeof nameArg).toBe('string') expect(nameArg).toMatch(/^Database Connection /) expect(valueArg).toBe(connectionString) expect(typeArg).toBe('database') expect(descArg).toBe('Migrated from plain connection string') expect(result).toBe(mockCredentialId) })nodes/DelayNode/DelayNode.service.ts (2)
63-65: Consider allowing 0ms delays to remain zero instead of clamping to 1ms.If zero is valid per schema, clamping to 1ms changes semantics. Optional: adjust min bound to 0.
- actualDelayMs = Math.max(1, Math.min(actualDelayMs, 24 * 60 * 60 * 1000)) // 1ms to 24 hours + actualDelayMs = Math.max(0, Math.min(actualDelayMs, 24 * 60 * 60 * 1000)) // 0ms to 24 hours
73-99: Tighten abort listener cleanup and use an AbortError for clearer error typing.Avoid extra timer for cleanup; remove the listener on both success and abort, and set error.name to 'AbortError' for reliable detection.
- await new Promise<void>((resolve, reject) => { - const timer = setTimeout(() => { - resolve() - }, actualDelayMs) - - // Handle abort signal - if (context.signal) { - const abortHandler = () => { - clearTimeout(timer) - reject(new Error('Delay was cancelled')) - } - - if (context.signal.aborted) { - clearTimeout(timer) - reject(new Error('Delay was cancelled')) - return - } - - context.signal.addEventListener('abort', abortHandler, { once: true }) - - // Clean up event listener when promise resolves - timer && setTimeout(() => { - context.signal?.removeEventListener('abort', abortHandler) - }, actualDelayMs + 100) - } - }) + await new Promise<void>((resolve, reject) => { + let abortHandler: (() => void) | null = null + const timer = setTimeout(() => { + if (context.signal && abortHandler) { + context.signal.removeEventListener('abort', abortHandler) + } + resolve() + }, actualDelayMs) + + if (context.signal) { + abortHandler = () => { + clearTimeout(timer) + const err = new Error('Delay was cancelled') + err.name = 'AbortError' + reject(err) + } + + if (context.signal.aborted) { + clearTimeout(timer) + const err = new Error('Delay was cancelled') + err.name = 'AbortError' + reject(err) + return + } + + context.signal.addEventListener('abort', abortHandler, { once: true }) + } + })nodes/DelayNode/DelayNode.schema.ts (3)
4-33: Don’t duplicate core types; import shared NodeDefinition/ParameterDefinition.Local copies can drift from the canonical definitions in nodes/index.ts (e.g., missing credential/stringList types). Import instead for consistency and type safety.
-import { DelayNodeConfig, getDelayMs } from "./DelayNode.types"; - -interface ParameterDefinition { - name: string; - label: string; - type: - | "text" - | "textarea" - | "select" - | "number" - | "boolean" - | "email" - | "url" - | "json" - | "password"; - required?: boolean; - defaultValue?: unknown; - options?: Array<{ label: string; value: string }>; - placeholder?: string; - description?: string; - showIf?: Array<{ path: string; equals: string | number | boolean }>; -} - -interface NodeDefinition { - nodeType: NodeType; - subType: ActionType; - label: string; - description: string; - parameters: ParameterDefinition[]; - validate: (config: Record<string, unknown>) => string[]; - getDefaults: () => DelayNodeConfig; -} +import { DelayNodeConfig, getDelayMs } from "./DelayNode.types"; +import type { ParameterDefinition, NodeDefinition } from "../index";
39-39: Update user-facing description (remove “placeholder implementation”).Reads better in the UI and avoids implying incomplete behavior.
- description: "Add a delay/wait period in workflow execution (placeholder implementation)", + description: "Add a delay/wait period in workflow execution",
84-87: Confirm showIf semantics (OR vs AND) for maxDelayMs visibility.Two showIf entries often imply AND, which would never match both values simultaneously. If the UI expects OR, consolidate or use the supported OR mechanism.
hooks/use-workflow-store.ts (3)
12-50: Centralized encryption helper looks good; consider addressing HTTP secrets.HTTP configs may carry Authorization/API keys. Consider migrating those to the credential store instead of persisting as-is.
Would you like a follow-up PR to add an HTTP credential abstraction and migrate headers/tokens to the store?
52-126: Decryption helper correctly applies migration first; add unit tests.The order (migrate -> decrypt) is solid. Add focused tests to ensure idempotency and safe fallbacks across EMAIL/DATABASE/HTTP/TRANSFORM/DELAY.
I can scaffold vitest suites for encryptNodeConfig/decryptNodeConfig round-trips and migration edge cases.
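A sketch of what those suites could assert. It assumes encryptNodeConfig/decryptNodeConfig are exported from the store module (they may currently be module-private), and the node shapes are simplified:

```ts
import { describe, it, expect } from 'vitest'
// Assumed exports; adjust the import site if these helpers stay internal.
import { encryptNodeConfig, decryptNodeConfig } from '@/hooks/use-workflow-store'

describe('node config encryption round-trip', () => {
  it('returns an equivalent config after encrypt + decrypt', () => {
    const node = {
      id: 'n1',
      data: {
        nodeType: 'action',
        actionType: 'DATABASE',
        label: 'DB',
        config: { credentialId: 'cred_123', operation: 'select', query: 'SELECT 1' },
      },
    } as any

    const roundTripped = decryptNodeConfig(encryptNodeConfig(node))
    expect(roundTripped.data.config).toMatchObject(node.data.config)
  })

  it('is idempotent when applied twice', () => {
    const node = {
      id: 'n2',
      data: { nodeType: 'action', actionType: 'DATABASE', config: { credentialId: 'cred_123' } },
    } as any
    const once = decryptNodeConfig(encryptNodeConfig(node))
    const twice = decryptNodeConfig(encryptNodeConfig(once))
    expect(twice.data.config).toEqual(once.data.config)
  })
})
```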
259-261: Initialize the new-workflow draft with the encrypted node shape for consistency.Empty nodes today make this benign, but aligning the shape avoids future surprises.
- sessionStorage.setItem('lastOpenedWorkflowId', newWorkflow.id) - sessionStorage.setItem('workflowDraft', JSON.stringify(newWorkflow)) + sessionStorage.setItem('lastOpenedWorkflowId', newWorkflow.id) + const encryptedDraft = { ...newWorkflow, nodes: newWorkflow.nodes.map(encryptNodeConfig) } + sessionStorage.setItem('workflowDraft', JSON.stringify(encryptedDraft))lib/legacy-migration-helper.ts (4)
9-9: Remove unused import (needsDelayMigration).Avoids linter errors for unused symbols.
-import { needsDatabaseMigration, needsDelayMigration, migrateWorkflowNode } from './migration-utils' +import { needsDatabaseMigration, migrateWorkflowNode } from './migration-utils'
24-35: scanWorkflowForLegacyConfigs returns at most one issue per node—by design?If you want to accumulate multiple findings per node (e.g., both “needs migration” and “has both fields”), consider returning an array from the node scanner and concatenating here.
127-172: Migration runner is straightforward; consider removing console.log or gating it.Logging per migrated node may be noisy in production. Use a debug logger or a verbose flag.
255-262: Deprecation warnings: good surface. Consider exposing a single multi-line message for logs.

Optionally provide a helper that returns a single string for logging without joining externally (see the sketch below).
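For example, a small formatter alongside the existing array-returning API (the function name is a suggestion):

```typescript
// Sketch: one log-friendly string so callers never need to join() themselves.
export function formatDeprecationWarnings(warnings: string[]): string {
  if (warnings.length === 0) {
    return 'No deprecation warnings.'
  }
  return ['Deprecation warnings:', ...warnings.map((warning) => `  - ${warning}`)].join('\n')
}
```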
nodes/DatabaseNode/DatabaseNode.schema.ts (3)
7-28: Deduplicate schema types: import NodeDefinition/ParameterDefinition instead of redefining.

Re-declaring these interfaces risks drift from the shared types in nodes/index.ts (and breaks global consistency, e.g., allowed parameter "type" values). Prefer importing the shared types.
Apply the following diffs:
- Remove the local ParameterDefinition and import the shared types:
```diff
@@
-import { CredentialType } from "@/types/credentials";
+import { CredentialType } from "@/types/credentials";
+import type { NodeDefinition, ParameterDefinition } from "../index";
@@
-interface ParameterDefinition {
-  name: string;
-  label: string;
-  type:
-    | "text"
-    | "textarea"
-    | "select"
-    | "number"
-    | "boolean"
-    | "email"
-    | "url"
-    | "json"
-    | "password"
-    | "credential";
-  required?: boolean;
-  defaultValue?: unknown;
-  options?: Array<{ label: string; value: string }> | (() => Array<{ label: string; value: string }>);
-  placeholder?: string;
-  description?: string;
-  showIf?: Array<{ path: string; equals: string | number | boolean }>;
-  credentialType?: CredentialType;
-}
```
- Remove the local NodeDefinition interface and rely on the shared one:
```diff
@@
-interface NodeDefinition<T = Record<string, unknown>> {
-  nodeType: NodeType;
-  subType: ActionType;
-  label: string;
-  description: string;
-  parameters: ParameterDefinition[];
-  validate: (config: T) => string[];
-  getDefaults: () => DatabaseNodeConfig;
-}
```
- Tighten the type of the exported definition to the shared generic:
```diff
-export const DATABASE_NODE_DEFINITION: NodeDefinition = {
+export const DATABASE_NODE_DEFINITION: NodeDefinition<DatabaseNodeConfig> = {
```

Also applies to: 79-87, 89-90
45-52: Relax type guard: let validate() emit human-friendly credential errors.

Requiring credentialId/connectionString in the type guard forces a generic "Invalid configuration structure" and short-circuits the more actionable "Database credential is required" from validateDatabaseNodeConfig. Allow missing credentials in the guard and defer that check to validate().
Apply this diff to remove the hard requirement from the guard:
```diff
@@
-  // Check credentialId field (new) or connectionString field (legacy)
-  const hasCredentialId = typeof obj.credentialId === 'string' && obj.credentialId.trim().length > 0;
-  const hasConnectionString = typeof obj.connectionString === 'string' && obj.connectionString.trim().length > 0;
-
-  if (!hasCredentialId && !hasConnectionString) {
-    return false;
-  }
+  // Note: Do not enforce presence of credentialId/connectionString here.
+  // Credential presence and correctness are validated in validate() via validateDatabaseNodeConfig().
```
170-199: Avoid mutating config inside validate().

validate() currently replaces typed.parameters when it is a string. Validation should be pure: parse to validate shape, but avoid writing back.
Apply this diff to remove the mutation:
```diff
@@
-  } else {
-    // If we parsed from string, replace the original value for subsequent validation
-    if (typeof typed.parameters === 'string') {
-      typed.parameters = parsedParameters as Record<string, unknown>;
-    }
-  }
+  }
```
nodes/DelayNode/DelayNode.types.ts (1)
14-33: Tighten typing and validate non-negative delays in getDelayMs.

Use the unit union for stronger type-safety and reject negative (or NaN) values explicitly.
Apply this diff:
```diff
-export function getDelayMs(config: { value: number; unit: string }): number {
+export function getDelayMs(config: { value: number; unit: DelayNodeConfig['unit'] }): number {
@@
-  const delayMs = config.value * unitMultiplier
-  if (!Number.isFinite(delayMs)) {
+  if (typeof config.value !== 'number' || Number.isNaN(config.value)) {
+    throw new Error('Invalid delay value')
+  }
+  if (config.value < 0) {
+    throw new Error('Delay cannot be negative')
+  }
+  const delayMs = config.value * unitMultiplier
+  if (!Number.isFinite(delayMs)) {
     throw new Error('Invalid computed delay value')
   }
```
lib/migration-utils.ts (1)
5-9: Use NodeType constant instead of string literal for nodeType check.

This keeps comparisons consistent with the rest of the codebase (e.g., DatabaseNode.schema.ts uses NodeType.ACTION).
Apply these diffs:
```diff
@@
-import { WorkflowNode, ActionType } from '@/types/workflow'
+import { WorkflowNode, ActionType, NodeType } from '@/types/workflow'
@@
-  if (node.data.nodeType !== 'action') {
+  if (node.data.nodeType !== NodeType.ACTION) {
     return node
   }
```

Also applies to: 54-59
lib/credential-store.ts (2)
115-115: Provide a robust UUID fallback for broader runtime support.

crypto.randomUUID() may be unavailable in some environments. Add a minimal fallback to avoid runtime errors.
Apply this diff:
```diff
-  const id = `cred_${crypto.randomUUID()}`
+  const uuid = (globalThis.crypto && 'randomUUID' in globalThis.crypto)
+    ? (globalThis.crypto as Crypto).randomUUID()
+    : Math.random().toString(36).slice(2)
+  const id = `cred_${uuid}`
```
24-26: Consider pluggable storage and environment detection for sessionStorage.

Relying on sessionStorage ties the store to a browser context and can fail under SSR, workers, or restricted environments. Provide a storage adapter (sessionStorage | localStorage | in-memory), selected at runtime, and guard accesses to avoid ReferenceErrors. This will also simplify testing.
Would you like me to sketch a lightweight StorageAdapter interface and a default in-memory fallback?
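Roughly, the adapter could look like this; a sketch assuming the store only needs getItem/setItem/removeItem, with an in-memory fallback for non-browser environments:

```typescript
// Minimal storage abstraction so the credential store is not hard-wired to sessionStorage.
export interface StorageAdapter {
  getItem(key: string): string | null
  setItem(key: string, value: string): void
  removeItem(key: string): void
}

class InMemoryStorage implements StorageAdapter {
  private store = new Map<string, string>()
  getItem(key: string): string | null { return this.store.get(key) ?? null }
  setItem(key: string, value: string): void { this.store.set(key, value) }
  removeItem(key: string): void { this.store.delete(key) }
}

export function resolveStorage(): StorageAdapter {
  try {
    if (typeof window !== 'undefined' && window.sessionStorage) {
      // Accessing storage can itself throw when it is blocked; probe before using it.
      window.sessionStorage.getItem('__storage_probe__')
      return window.sessionStorage
    }
  } catch {
    // Fall through to the in-memory fallback.
  }
  return new InMemoryStorage()
}
```

The store (or a module-level factory) would then take a StorageAdapter, which also makes the unit tests independent of a browser-like environment.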
tests/workflow-id-validation.test.ts (2)
111-124: Add boundary tests for min/max length.

These will protect against regressions of the 3–64 rule and very long inputs.
```diff
 describe('Edge cases', () => {
   it('should handle trimming correctly', () => {
     expect(isValidWorkflowId(' valid ')).toBe(true)
     expect(isValidWorkflowId(' valid123 ')).toBe(true)
     expect(isValidWorkflowId(' _invalid ')).toBe(false)
   })

   it('should handle mixed case correctly', () => {
     expect(isValidWorkflowId('MyWorkFlow123')).toBe(true)
     expect(isValidWorkflowId('MY_WORK_FLOW')).toBe(true)
     expect(isValidWorkflowId('my-Work-Flow')).toBe(true)
   })
+
+  it('should enforce 3–64 character length', () => {
+    expect(isValidWorkflowId('ab')).toBe(false)
+    expect(isValidWorkflowId('a'.repeat(65))).toBe(false)
+    expect(isValidWorkflowId('abc')).toBe(true)
+    expect(isValidWorkflowId('a'.repeat(64))).toBe(true)
+  })
 })
```
18-27: Reserved names set is out-of-sync with the UI.

This test includes dev/prod, while the UI includes demo/settings (and other overlaps). Centralizing the list (previous comment) will eliminate drift; a sketch of a shared constant follows below. If you keep it here, update it to match the UI's set exactly or add an assertion that it matches the shared list.
Also applies to: 99-108
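A shared constant is probably the simplest fix; the module path and the exact entries below are placeholders to illustrate the shape:

```typescript
// lib/reserved-workflow-ids.ts (hypothetical location; entries are examples only)
export const RESERVED_WORKFLOW_IDS = ['new', 'demo', 'settings', 'dev', 'prod'] as const

export type ReservedWorkflowId = (typeof RESERVED_WORKFLOW_IDS)[number]

// Both the UI validator and this test would then import the same list, e.g.:
// expect(RESERVED_WORKFLOW_IDS.every((id) => !isValidWorkflowId(id))).toBe(true)
```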
components/workflow/node-config-panel.tsx (3)
420-433: Use placeholders for placeholders (not descriptions).

Several inputs use description text as the placeholder. Swap to getSafePlaceholder with sensible fallbacks to keep semantics consistent across field types.
```diff
 case 'string':
 case 'text': {
   // Allow both 'string' and 'text' parameter types for compatibility
   const paramPath = param.path
   const value = getParamValue(paramPath, 'string', param.default)
   const description = getSafeDescription(param.description)
+  const placeholder = getSafePlaceholder(param.placeholder)
   return (
     <div key={paramPath} className="space-y-1.5 sm:space-y-2">
       <FieldLabel text={param.label} description={description} htmlFor={paramPath} />
       <Input
         value={String(value)}
         onChange={(e) => handleConfigChange(paramPath, e.target.value)}
-        placeholder={description}
+        placeholder={placeholder || description}
         className="bg-white text-gray-900 placeholder:text-gray-400 border-gray-300"
       />
     </div>
   )
 }
 case 'textarea': {
   const value = getParamValue(param.path, 'string', param.default)
-  const description = getSafeDescription(param.description)
+  const description = getSafeDescription(param.description)
+  const placeholder = getSafePlaceholder(param.placeholder)
   return (
     <div key={param.path} className="space-y-1.5 sm:space-y-2">
       <FieldLabel text={param.label} description={description} htmlFor={param.path} />
       <textarea
         className="w-full p-2 border rounded-md bg-white text-gray-900 border-gray-300"
         rows={6}
         value={String(value)}
         onChange={(e) => handleConfigChange(param.path, e.target.value)}
-        placeholder={description}
+        placeholder={placeholder || description}
       />
     </div>
   )
 }
 // ...
-        placeholder={getSafeDescription(param.description) || '{}'}
+        placeholder={getSafePlaceholder(param.placeholder) || '{}'}
 // ...
 case 'stringList': {
   const arrayValue = getArrayValue<string>(selectedNode.data.config as Record<string, unknown>, param.path, [])
-  const description = getSafeDescription(param.description)
+  const description = getSafeDescription(param.description)
+  const placeholder = getSafePlaceholder(param.placeholder) || 'first@email.com, next@email.com'
   return (
     <div key={param.path} className="space-y-1.5 sm:space-y-2">
       <FieldLabel text={param.label} description={description} htmlFor={param.path} />
       <Input
         value={arrayValue.map(String).join(', ')}
         onChange={(e) => handleConfigChange(param.path, e.target.value.split(',').map((s) => s.trim()).filter(Boolean))}
-        placeholder={description || 'first@email.com, next@email.com'}
+        placeholder={placeholder}
         className="bg-white text-gray-900 placeholder:text-gray-400 border-gray-300"
       />
     </div>
   )
 }
 case 'number': {
   const numberValue = getParamValue(param.path, 'number', param.default)
-  const description = getSafeDescription(param.description)
+  const description = getSafeDescription(param.description)
+  const placeholder = getSafePlaceholder(param.placeholder)
   return (
     <div key={param.path} className="space-y-2">
       <FieldLabel text={param.label} description={description} htmlFor={param.path} />
       <Input
         type="number"
         value={numberValue === 0 ? '' : String(numberValue)}
         onChange={(e) => handleConfigChange(param.path, Number(e.target.value || 0))}
-        placeholder={description}
+        placeholder={placeholder || description}
         className="bg-white text-gray-900 placeholder:text-gray-400 border-gray-300"
       />
     </div>
   )
 }
```

Also applies to: 438-452, 616-617, 622-631, 637-648
411-414: Guard against option-provider exceptions.

If param.options is a function and throws, the whole panel crashes. Wrap it in a try/catch and default to an empty list.
```diff
-  {(typeof param.options === 'function' ? param.options() : param.options || []).map((opt: { label: string; value: string }) => (
+  {(() => {
+    try {
+      return typeof param.options === 'function' ? param.options() : (param.options || [])
+    } catch {
+      return []
+    }
+  })().map((opt: { label: string; value: string }) => (
     <SelectItem key={opt.value} value={opt.value}>{opt.label}</SelectItem>
   ))}
```
71-85: Nit: function naming: "getSafeWorkflowIdFromUrl" returns encoded output.

Since the function returns an already URI-encoded value, a name like getEncodedWorkflowIdFromUrl would be more explicit. Alternatively, return raw validated text and encode at the call site.
Also applies to: 1056-1056
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
💡 Knowledge Base configuration:
- MCP integration is disabled by default for public repositories
- Jira integration is disabled by default for public repositories
- Linear integration is disabled by default for public repositories
You can enable these sources in your CodeRabbit configuration.
⛔ Files ignored due to path filters (1)
- package-lock.json is excluded by !**/package-lock.json
📒 Files selected for processing (20)
- components/workflow/node-config-panel.tsx (9 hunks)
- hooks/use-workflow-store.ts (6 hunks)
- lib/credential-store.test.ts (1 hunks)
- lib/credential-store.ts (1 hunks)
- lib/legacy-migration-helper.ts (1 hunks)
- lib/migration-utils.ts (1 hunks)
- lib/node-definitions.ts (4 hunks)
- nodes/DatabaseNode/DatabaseNode.schema.ts (1 hunks)
- nodes/DatabaseNode/DatabaseNode.service.ts (1 hunks)
- nodes/DatabaseNode/DatabaseNode.test.ts (1 hunks)
- nodes/DelayNode/DelayNode.schema.ts (1 hunks)
- nodes/DelayNode/DelayNode.service.ts (1 hunks)
- nodes/DelayNode/DelayNode.test.ts (1 hunks)
- nodes/DelayNode/DelayNode.types.ts (1 hunks)
- nodes/TransformNode/TransformNode.schema.ts (1 hunks)
- nodes/TransformNode/TransformNode.service.ts (1 hunks)
- nodes/TransformNode/TransformNode.test.ts (1 hunks)
- package.json (2 hunks)
- tests/integration/workflow-execution.test.ts (1 hunks)
- tests/workflow-id-validation.test.ts (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (6)
- nodes/TransformNode/TransformNode.test.ts
- tests/integration/workflow-execution.test.ts
- package.json
- lib/node-definitions.ts
- nodes/DatabaseNode/DatabaseNode.service.ts
- nodes/DelayNode/DelayNode.test.ts
🧰 Additional context used
🧬 Code Graph Analysis (13)
lib/credential-store.test.ts (1)
lib/credential-store.ts (3)
- credentialStore (275-275)
- migrateConnectionStringToCredential (280-303)
- resolveConnectionString (308-320)
lib/migration-utils.ts (5)
nodes/DatabaseNode/DatabaseNode.types.ts (2)
- DatabaseNodeConfig (3-15)
- DatabaseNodeConfig (31-31)
lib/credential-store.ts (2)
- migrateConnectionStringToCredential (280-303)
- credentialStore (275-275)
types/workflow.ts (1)
- WorkflowNode (73-73)
nodes/DelayNode/DelayNode.types.ts (2)
- DelayNodeConfig (3-9)
- DelayNodeConfig (51-51)
nodes/DelayNode/index.ts (1)
- DelayNodeConfig (4-4)
lib/legacy-migration-helper.ts (4)
types/workflow.ts (2)
- Workflow (77-87)
- WorkflowNode (73-73)
nodes/DatabaseNode/DatabaseNode.types.ts (2)
- DatabaseNodeConfig (3-15)
- DatabaseNodeConfig (31-31)
lib/migration-utils.ts (2)
- needsDatabaseMigration (92-98)
- migrateWorkflowNode (54-87)
nodes/DelayNode/DelayNode.types.ts (2)
- DelayNodeConfig (3-9)
- DelayNodeConfig (51-51)
nodes/TransformNode/TransformNode.service.ts (4)
nodes/TransformNode/index.ts (3)
- executeTransformNode (2-2)
- TransformNodeConfig (4-4)
- TransformExecutionResult (4-4)
nodes/index.ts (2)
- NodeExecutionContext (19-19)
- NodeExecutionResult (19-19)
nodes/TransformNode/TransformNode.types.ts (3)
- TransformNodeConfig (3-10)
- TransformNodeConfig (25-25)
- TransformExecutionResult (17-23)
server/services/workflow-executor.ts (1)
- getNestedValue (386-393)
nodes/DelayNode/DelayNode.schema.ts (3)
nodes/index.ts (3)
- ParameterDefinition (31-45)
- NodeDefinition (49-71)
- NodeType (182-182)
nodes/DelayNode/DelayNode.types.ts (3)
- DelayNodeConfig (3-9)
- DelayNodeConfig (51-51)
- getDelayMs (14-33)
nodes/DelayNode/index.ts (2)
- DelayNodeConfig (4-4)
- DELAY_NODE_DEFINITION (3-3)
nodes/DatabaseNode/DatabaseNode.test.ts (5)
nodes/DatabaseNode/DatabaseNode.schema.ts (1)
- DATABASE_NODE_DEFINITION (89-209)
nodes/DatabaseNode/index.ts (4)
- DATABASE_NODE_DEFINITION (3-3)
- DatabaseNodeConfig (4-4)
- executeDatabaseNode (2-2)
- DatabaseExecutionResult (4-4)
nodes/index.ts (1)
- NodeExecutionContext (19-19)
nodes/DatabaseNode/DatabaseNode.types.ts (3)
- DatabaseNodeConfig (3-15)
- DatabaseNodeConfig (31-31)
- DatabaseExecutionResult (22-29)
nodes/DatabaseNode/DatabaseNode.service.ts (1)
- executeDatabaseNode (30-185)
components/workflow/node-config-panel.tsx (4)
lib/type-safe-utils.ts (8)
- getTypedParameterValue (214-256)
- pathValueEquals (125-132)
- getValueAtPath (16-37)
- getSafeDefaultValue (157-177)
- getSafeDescription (182-184)
- getObjectValue (90-97)
- getArrayValue (78-85)
- getSafePlaceholder (189-191)
lib/node-definitions.ts (1)
- findNodeDefinition (72-95)
types/credentials.ts (2)
- CredentialType (10-10)
- toCredentialType (27-29)
components/ui/credential-selector.tsx (1)
- CredentialSelector (217-320)
hooks/use-workflow-store.ts (3)
types/workflow.ts (1)
- WorkflowNode (73-73)
lib/security.ts (3)
- encryptEmailConfig (112-138)
- decryptEmailConfig (143-169)
- decryptDatabaseConfig (103-107)
lib/migration-utils.ts (1)
- migrateWorkflowNode (54-87)
nodes/DelayNode/DelayNode.types.ts (1)
types/workflow.ts (1)
- ActionNodeData (61-64)
nodes/DatabaseNode/DatabaseNode.schema.ts (5)
nodes/index.ts (3)
- ParameterDefinition (31-45)
- NodeDefinition (49-71)
- NodeType (182-182)
types/credentials.ts (1)
- CredentialType (10-10)
nodes/DatabaseNode/DatabaseNode.types.ts (2)
- DatabaseNodeConfig (3-15)
- DatabaseNodeConfig (31-31)
lib/credential-store.ts (1)
- credentialStore (275-275)
lib/migration-utils.ts (1)
- validateDatabaseNodeConfig (103-130)
lib/credential-store.ts (2)
types/credentials.ts (1)
- CredentialType (10-10)
lib/security.ts (3)
- decryptCredential (59-85)
- isEncrypted (205-211)
- encryptCredential (41-54)
nodes/TransformNode/TransformNode.schema.ts (2)
nodes/index.ts (2)
- NodeDefinition (49-71)
- NodeType (182-182)
nodes/TransformNode/TransformNode.types.ts (2)
- TransformNodeConfig (3-10)
- TransformNodeConfig (25-25)
nodes/DelayNode/DelayNode.service.ts (3)
nodes/DelayNode/index.ts (3)
- executeDelayNode (2-2)
- DelayNodeConfig (4-4)
- DelayExecutionResult (4-4)
nodes/index.ts (2)
- NodeExecutionContext (19-19)
- NodeExecutionResult (19-19)
nodes/DelayNode/DelayNode.types.ts (4)
- DelayNodeConfig (3-9)
- DelayNodeConfig (51-51)
- getDelayMs (14-33)
- DelayExecutionResult (40-49)
🪛 Biome (2.1.2)
nodes/DelayNode/DelayNode.service.ts
[error] 45-45: Other switch clauses can erroneously access this declaration.
Wrap the declaration in a block to restrict its access to the switch clause.
The declaration is defined in this switch clause:
Safe fix: Wrap the declaration in a block.
(lint/correctness/noSwitchDeclarations)
[error] 51-51: Other switch clauses can erroneously access this declaration.
Wrap the declaration in a block to restrict its access to the switch clause.
The declaration is defined in this switch clause:
Safe fix: Wrap the declaration in a block.
(lint/correctness/noSwitchDeclarations)
[error] 52-52: Other switch clauses can erroneously access this declaration.
Wrap the declaration in a block to restrict its access to the switch clause.
The declaration is defined in this switch clause:
Safe fix: Wrap the declaration in a block.
(lint/correctness/noSwitchDeclarations)
🔇 Additional comments (11)
nodes/TransformNode/TransformNode.schema.ts (1)
93-107: LGTM: espree-based static JS syntax validation.

Good security-minded approach: parsing for syntax only without executing user code. Wrapping as a function body is a sensible strategy.
nodes/TransformNode/TransformNode.service.ts (3)
159-166: LGTM: abort re-check before returning.

Good follow-up abort check after transformation work; complements the earlier post-delay check.
224-230: LGTM: prototype pollution guard.

Explicitly blocking __proto__/constructor/prototype is a solid defensive measure.
2-2: Import path for NodeExecution types is correct
- The interfaces are defined in nodes/types.ts and exported there. nodes/index.ts re-exports them, but importing directly from ../types in TransformNode.service.ts correctly resolves to nodes/types.ts.
- No changes are required for this import statement.
hooks/use-workflow-store.ts (4)
187-187: LGTM: encrypting nodes before draft save.
221-221: LGTM: decrypting nodes on workflow load.
235-235: LGTM: draft initialization uses encrypted nodes.
271-271: LGTM: encrypting nodes before persistence and syncing to server.
nodes/DatabaseNode/DatabaseNode.test.ts (1)
188-200: Confirm intended error message for whitespace-only credentialId.

This expects "Connection string is required" when credentialId is whitespace and connectionString is empty. The schema-level validator yields "Database credential is required" in similar scenarios, whereas the service hits resolveConnectionString and surfaces "Connection string is required". Confirm that this discrepancy is intentional or consider harmonizing the message.
lib/credential-store.ts (1)
74-93: Fail-closed decryption behavior looks solid.

getCredentialValue correctly refuses to return ciphertext when decryption fails by using isEncrypted() as a heuristic and returning null. Good defensive posture.
components/workflow/node-config-panel.tsx (1)
200-209: Be cautious using null-prototype objects in app state.

Object.create(null) is great for prototype-pollution hardening, but some libs rely on Object.prototype (e.g., deep-equal utils using instanceof/Object.getPrototypeOf). Since this object ends up in React state and flows through utilities, verify nothing downstream breaks on null-prototype objects.
You can quickly validate by grepping for instanceof Object / getPrototypeOf / hasOwnProperty calls on configs across the repo and ensuring they’re guarded against null-proto objects. I can provide a script if helpful.
Also applies to: 218-229, 235-236
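If anything downstream does choke on them, a cheap normalization (or an own-property check that does not rely on the prototype) avoids touching the hardening itself; a sketch:

```typescript
// Sketch: rebuild a null-prototype config onto a plain object right before it
// reaches utilities that assume Object.prototype is present.
export function withObjectPrototype<T extends Record<string, unknown>>(config: T): T {
  return Object.getPrototypeOf(config) === null ? ({ ...config } as T) : config
}

// Prototype-agnostic membership check, safe for both plain and null-prototype objects.
export const hasOwn = (obj: object, key: string): boolean =>
  Object.prototype.hasOwnProperty.call(obj, key)
```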
Co-authored-by: coderabbitai[bot] <136622811+coderabbitai[bot]@users.noreply.github.com>
…ity enhancements
Summary
Describe the changes and the problem they solve.
Screenshots / Videos (if UI changes)
Checklist
- npm run typecheck
- npm run lint
- npm test
- npm run build
- README.md / /docs/ updated as needed

Related Issues
Closes #
Summary by CodeRabbit
New Features
Bug Fixes
Chores