diff --git a/.site/docs/core-concepts/canvas.md b/.site/docs/core-concepts/canvas.md index e4114661..18573185 100644 --- a/.site/docs/core-concepts/canvas.md +++ b/.site/docs/core-concepts/canvas.md @@ -9,12 +9,15 @@ This guide walks through Stem's task composition primitives—chains, groups, an chords—using in-memory brokers and backends. Each snippet references a runnable file under `packages/stem/example/docs_snippets/` so you can experiment locally with `dart run`. If you bootstrap with `StemApp`, use `app.canvas` to reuse the -same broker, backend, task handlers, and encoder registry. +same broker, backend, task handlers, and encoder registry. `StemApp` lazy-starts +its managed worker for canvas dispatch too, so the common path does not need an +explicit `await app.start()`. ## Chains Chains execute tasks serially. Each step receives the previous result via -`context.meta['chainPrevResult']`. +`context.meta`, so prefer typed reads like +`context.meta.valueOr('chainPrevResult', 'fallback')` over raw casts. ```dart file=/../packages/stem/example/docs_snippets/lib/canvas_chain.dart#canvas-chain @@ -46,16 +49,19 @@ state: ## Chords -Chords combine a group with a callback. Once all body tasks succeed, the callback -runs with `context.meta['chordResults']` populated. +Chords combine a group with a callback. Once all body tasks succeed, the +callback runs with `context.meta['chordResults']` populated. Prefer +`context.meta.valueListOr('chordResults', const [])` over manual list casts +when reading those results. ```dart file=/../packages/stem/example/docs_snippets/lib/canvas_chord.dart#canvas-chord ``` If any branch fails, the callback is skipped and the chord group is marked as -failed. Inspect `backend.getGroup(chordId)` to see which branch failed before -retrying. +failed. Inspect the latest group status via `StemApp.getGroupStatus(...)` or +`StemClient.getGroupStatus(...)` before retrying. 
If you are operating below +the runtime layer, read the raw backend directly. ## Dependency semantics @@ -71,7 +77,10 @@ retrying. - `Canvas.group` returns a `GroupDispatch` with a result stream for each child. - `Canvas.chord` preserves the original signature order when building `chordResults`, so you can map results back to inputs deterministically. -- `backend.getGroup(groupId)` returns the latest status for each child task. +- `StemApp.getGroupStatus(...)` and `StemClient.getGroupStatus(...)` return the + latest status for each child task. Use `status.resultValues()` for scalar + child results or `status.resultJson(...)` / `status.resultAs(codec: ...)` for + DTO payloads before dropping down to raw backend reads. ## Removal semantics @@ -89,8 +98,8 @@ dart run lib/canvas_group.dart dart run lib/canvas_chord.dart ``` -Each script bootstraps a `StemApp` in-memory runtime, starts a worker, and then -uses `app.canvas` for composition. +Each script bootstraps a `StemApp` in-memory runtime and then uses `app.canvas` +for composition. ## Best practices diff --git a/.site/docs/core-concepts/cli-control.md b/.site/docs/core-concepts/cli-control.md index e1fb99bb..4084d11b 100644 --- a/.site/docs/core-concepts/cli-control.md +++ b/.site/docs/core-concepts/cli-control.md @@ -119,7 +119,7 @@ ensure the CLI and workers share the same task-definition entrypoint so task names, encoders, and routing rules stay consistent. A common pattern is to build that CLI registry from the same shared task list -or generated `stemModule.tasks` your app uses, so task metadata stays consistent +or generated `stemModule` your app uses, so task metadata stays consistent without teaching registry-first bootstrap for normal services. 
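One way to keep them in sync, sketched below, is to define the merged bundle once and import it from both the service bootstrap and the CLI entrypoint. `authModule` and `billingModule` are illustrative generated bundles; `StemModule.merge` is the merge helper covered in the builder guide:

```dart
// Sketch: one shared bundle, imported by both the service and the CLI
// entrypoint so task names, encoders, and routing rules stay consistent.
final cliModule = StemModule.merge([authModule, billingModule, stemModule]);
```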
If a command needs a registry and none is available, it will exit with an error diff --git a/.site/docs/core-concepts/observability.md b/.site/docs/core-concepts/observability.md index 308002c5..2a590e95 100644 --- a/.site/docs/core-concepts/observability.md +++ b/.site/docs/core-concepts/observability.md @@ -58,6 +58,19 @@ control-plane commands. ``` +When you inspect `TaskPostrunPayload` or `TaskSuccessPayload` directly, prefer +`payload.resultJson(...)`, `payload.resultVersionedJson(...)`, or +`payload.resultAs(codec: ...)` over manual +`payload.result as Map` casts. +For workflow lifecycle signals, prefer +`payload.metadataJson('key', ...)`, +`payload.metadataVersionedJson('key', ...)`, or +`payload.metadataAs('key', codec: ...)` over manual +`payload.metadata['key'] as Map` casts. If the entire +metadata map is one DTO, use `payload.metadataPayloadJson(...)`, +`payload.metadataPayloadVersionedJson(...)`, or +`payload.metadataPayloadAs(codec: ...)` instead. + ## Workflow Introspection Workflow runtimes can emit execution events (started/completed/failed/retrying) @@ -80,6 +93,25 @@ class LoggingWorkflowIntrospectionSink implements WorkflowIntrospectionSink { } ``` +When a completed step or checkpoint carries a DTO payload, prefer +`event.resultJson(...)`, `event.resultVersionedJson(...)`, or +`event.resultAs(codec: ...)` over manual +`event.result as Map` casts. +Step and runtime introspection events also expose typed metadata helpers via +`event.metadataJson('key', ...)`, `event.metadataVersionedJson('key', ...)`, +`event.metadataAs('key', codec: ...)`, `event.metadataPayloadJson(...)`, and +`event.metadataPayloadVersionedJson(...)`. +When worker events carry structured `data`, prefer `event.dataJson(...)`, +`event.dataVersionedJson(...)`, or `event.dataAs(codec: ...)` over manual +`event.data!['key']` casts. 
For completed control commands, use +`payload.responseJson(...)`, `payload.responseVersionedJson(...)`, +`payload.responseAs(codec: ...)`, `payload.errorJson(...)`, +`payload.errorVersionedJson(...)`, or `payload.errorAs(codec: ...)` instead of +walking raw `response` / `error` maps. +Persisted worker heartbeats expose the same typed decode path on `extras` via +`heartbeat.extrasJson(...)`, `heartbeat.extrasVersionedJson(...)`, and +`heartbeat.extrasAs(codec: ...)`. + ## Logging Use `stemLogger` (Contextual logger) for structured logs. @@ -88,6 +120,11 @@ Use `stemLogger` (Contextual logger) for structured logs. ``` +The shared `stemLogger` starts silent by default, so opt in explicitly with +`configureStemLogging(level: Level.info, format: StemLogFormat.pretty)`. +When you want machine-oriented output for production log shipping, switch to +`configureStemLogging(format: StemLogFormat.plain)`. + Workers automatically include attempt, queue, and worker id in log contexts when `StemSignals` are enabled. diff --git a/.site/docs/core-concepts/persistence.md b/.site/docs/core-concepts/persistence.md index 905fdd9e..63956347 100644 --- a/.site/docs/core-concepts/persistence.md +++ b/.site/docs/core-concepts/persistence.md @@ -9,6 +9,11 @@ Use persistence when you need durable task state, workflow state, shared schedules, or revocation storage. Stem ships with Redis, Postgres, and SQLite adapters plus in-memory variants for local development. +For the normal path, prefer `StemClient.inMemory(...)`, +`StemClient.fromUrl(...)`, or a reusable `StemStack.fromUrl(...).createClient(...)`. +Drop to `StemClient.create(...)` only when you really need custom broker or +backend factories that the adapter stack cannot express. 
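As a sketch of that preference order (the Redis URL and adapter list mirror examples elsewhere in these docs; imports are omitted):

```dart
// Local development and tests: everything in memory.
final devClient = await StemClient.inMemory();

// Adapter-backed deployment: the URL selects the broker/backend pair.
final prodClient = await StemClient.fromUrl(
  'redis://localhost:6379',
  adapters: const [StemRedisAdapter()],
);

// Reach for StemClient.create(...) only when neither path can express
// your custom broker or backend factories.
```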
+ import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; diff --git a/.site/docs/core-concepts/producer.md b/.site/docs/core-concepts/producer.md index 6b5b647c..168b318c 100644 --- a/.site/docs/core-concepts/producer.md +++ b/.site/docs/core-concepts/producer.md @@ -5,13 +5,18 @@ sidebar_position: 2 slug: /core-concepts/producer --- -Enqueue tasks from your Dart services using `Stem.enqueue`. Start with the -in-memory broker, then opt into Redis/Postgres as needed. +Enqueue tasks from your Dart services through a `TaskEnqueuer` surface such as +`StemClient`, `StemApp`, or `StemWorkflowApp`. Start with the in-memory broker, +then opt into Redis/Postgres as needed. + +For adapter-backed deployments, prefer `StemClient.fromUrl(...)` or +`StemStack.fromUrl(...).createClient(...)`. Keep `StemClient.create(...)` for +the rarer case where you must provide custom factories directly. import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; -## Enqueue tasks +## Raw Enqueue @@ -39,19 +44,30 @@ import TabItem from '@theme/TabItem'; ## Typed Enqueue Helpers -When you need compile-time guarantees for task arguments and result types, wrap -your handler in a `TaskDefinition`. The definition knows how to encode args and -decode results, and exposes a fluent builder for overrides (headers, meta, -options, scheduling): +For the common producer path, prefer `TaskDefinition`. The +definition owns argument encoding, result decoding, and default publish +metadata, while exposing direct helpers and a fluent builder for overrides +(headers, meta, options, scheduling): ```dart title="bin/producer_typed.dart" file=/../packages/stem/example/docs_snippets/lib/producer.dart#producer-typed ``` Typed helpers are also available on `Canvas` (`definition.toSignature`) so -group/chain/chord APIs produce strongly typed `TaskResult` streams. -Need to tweak headers/meta/queue at call sites? 
Wrap the definition in a -`TaskEnqueueBuilder` and invoke `await builder.enqueueWith(stem);`. +group/chain/chord APIs produce strongly typed `TaskResult` streams. Need to +tweak headers/meta/queue at call sites? Start from +`definition.buildCall(args, ...)` when you need the explicit advanced +transport path. + +Raw task-name strings still work, but they are the lower-level interop path. +Reach for them when the task name is truly dynamic or you are crossing a +boundary that does not have the generated/manual `TaskDefinition`. When those +calls already have typed DTO args, prefer +`enqueuer.enqueueValue(name, dto, codec: ...)` over hand-building an `args` +map. +If you later inspect the raw `Envelope`, prefer `envelope.argsJson(...)`, +`envelope.argsVersionedJson(...)`, `envelope.metaJson(...)`, or +`envelope.metaVersionedJson(...)` over manual map casts. ## Enqueue options @@ -68,7 +84,7 @@ unsupported. Example: ```dart -await stem.enqueue( +await enqueuer.enqueue( 'tasks.email', args: {'to': 'ops@example.com'}, enqueueOptions: TaskEnqueueOptions( @@ -86,7 +102,8 @@ await stem.enqueue( ## Tips -- Reuse a single `Stem` instance; create it during application bootstrap. +- Reuse a single `TaskEnqueuer` implementation; in most apps that means + `StemClient`, `StemApp`, or `StemWorkflowApp`. - Capture the returned task id when you need to poll status from the result backend. - Use `TaskOptions` to set queue, retries, priority, isolation, and visibility timeouts. - `meta` is stored with result backend entries—great for audit trails. 
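A minimal sketch of the tips above, assuming `enqueue` returns the task id as described; the task name and metadata are illustrative:

```dart
// Bootstrap once during application startup, reuse everywhere.
final client = await StemClient.inMemory();

// Capture the task id so you can poll the result backend later.
final taskId = await client.enqueue(
  'tasks.email',
  args: {'to': 'ops@example.com'},
  meta: {'requestId': 'req-123'}, // stored with the result backend entry
);
```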
diff --git a/.site/docs/core-concepts/queue-events.md b/.site/docs/core-concepts/queue-events.md index 6fa6b2cd..86d2d3b7 100644 --- a/.site/docs/core-concepts/queue-events.md +++ b/.site/docs/core-concepts/queue-events.md @@ -14,11 +14,22 @@ Use this when you need lightweight event streams for domain notifications ## API Surface - `QueueEventsProducer.emit(queue, eventName, payload, headers, meta)` +- `QueueEventsProducer.emitValue(queue, eventName, value, codec, headers, meta)` +- `QueueEventsProducer.emitJson(queue, eventName, dto, headers, meta)` +- `QueueEventsProducer.emitVersionedJson(queue, eventName, dto, version, headers, meta)` - `QueueEvents.start()` / `QueueEvents.close()` - `QueueEvents.events` stream (all events for that queue) - `QueueEvents.on(eventName)` stream (filtered by name) All events are delivered as `QueueCustomEvent`, which implements `StemEvent`. +Use `event.payloadValue(...)` / `event.requiredPayloadValue(...)` to read typed +payload fields instead of repeating raw `payload['key']` casts. +If one queue event maps to one DTO, use `event.payloadJson(...)`, +`event.payloadVersionedJson(...)`, or `event.payloadAs(codec: ...)` to decode +the whole payload in one step. +If the whole queue-event metadata map is one DTO, use `event.metaJson(...)`, +`event.metaVersionedJson(...)`, or `event.metaAs(codec: ...)` instead of +manual `event.meta[...]` casts. ## Producer + Listener @@ -38,6 +49,13 @@ Multiple listeners on the same queue receive each emitted event. - Events are queue-scoped: listeners receive only events for their configured queue. +- `emitValue(...)` is the codec-backed path when the payload should be + authored as a typed object but still use a custom map encoder or explicit + `PayloadCodec`. +- `emitJson(...)` is the DTO convenience path when the payload already exposes + `toJson()`. +- `emitVersionedJson(...)` is the same convenience path when the payload + schema should persist an explicit `__stemPayloadVersion`. 
- `on(eventName)` matches exact event names. - `headers` and `meta` round-trip to listeners. - Event names and queue names must be non-empty. diff --git a/.site/docs/core-concepts/signing.md b/.site/docs/core-concepts/signing.md index 219ce109..aaa22716 100644 --- a/.site/docs/core-concepts/signing.md +++ b/.site/docs/core-concepts/signing.md @@ -23,7 +23,8 @@ reason. ## How signing works in Stem - Producers create a `PayloadSigner` from environment-derived config and pass it - into `Stem` to sign new envelopes. + into the producer runtime (`StemClient` or raw `Stem`) to sign new + envelopes. - Workers create the same signer (or verification-only config) and pass it into `Worker` to verify each delivery. - Schedulers/Beat that enqueue tasks should also sign. @@ -46,8 +47,10 @@ export STEM_SIGNING_ACTIVE_KEY=v1 2) Wire the signer into producers, workers, and schedulers. -These snippets come from the `packages/stem/example/microservice` project so you can see the -full context. +These snippets come from the `packages/stem/example/microservice` project so you +can see the full context. They intentionally show the lower-level signing +plumbing; for the normal happy path, still prefer `StemClient`/`StemApp` +bootstrap. diff --git a/.site/docs/core-concepts/stem-builder.md b/.site/docs/core-concepts/stem-builder.md index 069fdb23..f745e275 100644 --- a/.site/docs/core-concepts/stem-builder.md +++ b/.site/docs/core-concepts/stem-builder.md @@ -53,7 +53,12 @@ class UserSignupWorkflow { } @TaskDefn(name: 'commerce.audit.log', runInIsolate: false) -Future logAudit(TaskInvocationContext ctx, String event, String id) async { +Future logAudit( + String event, + String id, { + TaskExecutionContext? 
context, +}) async { + final ctx = context!; ctx.progress(1.0, data: {'event': event, 'id': id}); } ``` @@ -68,69 +73,166 @@ Generated output (`workflow_defs.stem.g.dart`) includes: - `stemModule` - typed workflow refs like `StemWorkflowDefinitions.userSignup` -- typed task definitions, enqueue helpers, and typed result wait helpers +- typed task definitions whose advanced explicit transport path uses + `TaskCall` ## Wire Into StemWorkflowApp -Use the generated definitions/helpers directly with `StemWorkflowApp`: +Use the generated definitions/helpers directly through `StemClient`: ```dart -final workflowApp = await StemWorkflowApp.fromUrl( +final client = await StemClient.fromUrl( 'memory://', module: stemModule, ); +final workflowApp = await client.createWorkflowApp(); await workflowApp.start(); -final result = await StemWorkflowDefinitions.userSignup - .call((email: 'user@example.com')) - .startAndWaitWithApp(workflowApp); +final result = await StemWorkflowDefinitions.userSignup.startAndWait( + workflowApp, + params: 'user@example.com', +); +``` + +When you pass `module: stemModule`, the workflow app infers the worker +subscription from the workflow queue plus the default queues declared on the +bundled task handlers. Explicit subscriptions are still available for advanced +routing. + +If your service needs more than one generated or hand-written bundle, merge +them before bootstrap: + +```dart +final module = StemModule.merge([authModule, billingModule, stemModule]); +final client = await StemClient.inMemory(module: module); +final workflowApp = await client.createWorkflowApp(); +``` + +`StemModule.merge(...)` fails fast when modules declare conflicting task or +workflow names. 
+ +If you do not want to pre-merge them yourself, bootstrap helpers also accept +`modules:` directly: + +```dart +final client = await StemClient.inMemory( + modules: [authModule, billingModule, stemModule], +); +final workflowApp = await client.createWorkflowApp(); +``` + +The same bundle-first path works for plain task apps too: + +```dart +final client = await StemClient.fromUrl( + 'redis://localhost:6379', + adapters: const [StemRedisAdapter()], + module: stemModule, +); +final taskApp = await client.createApp(); +``` + +If you need to attach generated or hand-written task definitions after +bootstrap, use the app helpers: + +- `registerTask(...)` / `registerTasks(...)` +- `registerModule(...)` / `registerModules(...)` + +When debugging bootstrap wiring, inspect the queue set a bundle implies before +you create the app: + +```dart +final queues = stemModule.requiredWorkflowQueues( + continuationQueue: 'workflow-continue', + executionQueue: 'workflow-step', +); +``` + +If you are wiring a worker manually, the module can also give you the exact +subscription directly: + +```dart +final subscription = stemModule.requiredWorkflowSubscription( + continuationQueue: 'workflow-continue', + executionQueue: 'workflow-step', +); ``` If you already manage a `StemApp` for a larger service, reuse it instead of bootstrapping a second app: ```dart -final stemApp = await StemApp.fromUrl( +final client = await StemClient.fromUrl( 'redis://localhost:6379', adapters: const [StemRedisAdapter()], - tasks: stemModule.tasks, + module: stemModule, + workerConfig: StemWorkerConfig( + queue: 'workflow', + subscription: RoutingSubscription( + queues: ['workflow', 'default'], + ), + ), ); +final stemApp = await client.createApp(); -final workflowApp = await StemWorkflowApp.create( - stemApp: stemApp, +final workflowApp = await stemApp.createWorkflowApp(); +``` + +That shared-app path reuses the existing worker, so it only works when the +worker already covers the workflow queue plus the 
task queues your workflows +need. If you want automatic queue inference, prefer `StemClient`. + +For task-only services, the same bundle works directly with `StemApp`: + +```dart +final client = await StemClient.fromUrl( + 'redis://localhost:6379', + adapters: const [StemRedisAdapter()], module: stemModule, ); +final taskApp = await client.createApp(); ``` -If you already centralize broker/backend wiring in a `StemClient`, prefer the +Plain `StemApp` bootstrap infers task queue subscriptions from the bundled or +explicitly supplied task handlers when `workerConfig.subscription` is omitted, +and it lazy-starts on the first enqueue or wait call. + +If you already centralize broker/backend wiring in a `StemClient`, stay on the shared-client path: ```dart final client = await StemClient.fromUrl( 'redis://localhost:6379', adapters: const [StemRedisAdapter()], + module: stemModule, ); -final workflowApp = await client.createWorkflowApp(module: stemModule); +final workflowApp = await client.createWorkflowApp(); ``` +If you reuse an existing `StemApp`, its worker subscription remains your +responsibility. Workflow-side queue inference only applies when the workflow +app is creating the worker itself. + ## Parameter and Signature Rules -- Parameters after context must be required positional serializable values. -- Parameters after context must be required positional values that are either +- Business parameters must be required positional values that are either serializable or codec-backed DTOs. - Script workflow `run(...)` can be plain (no annotation required). -- `@WorkflowRun` is still supported for explicit run entrypoints. -- Step methods use `@WorkflowStep`. -- Plain `run(...)` is best when called step methods only need serializable +- Checkpoint methods use `@WorkflowStep`. +- Plain `run(...)` is best when called checkpoint methods only need + serializable parameters. 
-- Use `@WorkflowRun()` plus `WorkflowScriptContext` when you need to enter a - context-aware script checkpoint that consumes `WorkflowScriptStepContext`. +- When you need runtime metadata, add an optional named injected context + parameter: + - `WorkflowScriptContext? context` on `run(...)` + - `WorkflowExecutionContext? context` on flow steps or checkpoint methods - DTO classes are supported when they provide: - - `Map toJson()` - - `factory Type.fromJson(Map json)` or an equivalent named + - a string-keyed `toJson()` map (typically `Map`) + - `factory Type.fromJson(Map json)` or an equivalent named `fromJson` constructor - Typed task results can use the same DTO convention. - Workflow inputs, checkpoint values, and final workflow results can use the same DTO convention. The generated `PayloadCodec` persists the JSON form while workflow code continues to work with typed objects. +- Runtime detail surfaces flow `steps` and script `checkpoints` separately. diff --git a/.site/docs/core-concepts/tasks.md b/.site/docs/core-concepts/tasks.md index eafaaf01..b97a0332 100644 --- a/.site/docs/core-concepts/tasks.md +++ b/.site/docs/core-concepts/tasks.md @@ -36,9 +36,13 @@ routing, retry behavior, timeouts, and isolation. Stem ships with `TaskDefinition` so producers get compile-time checks for required arguments and result types. A definition bundles the task -name, argument encoder, optional metadata, and default `TaskOptions`. Build a -call with `.call(args)` or `TaskEnqueueBuilder` and hand it to `Stem.enqueueCall` -or `Canvas` helpers: +name, argument encoder, optional metadata, and default `TaskOptions`. For the +common path, use the direct +`definition.enqueue(stem, args)` / `definition.enqueueAndWait(...)` +helpers. When you need a reusable prebuilt request, use +`definition.buildCall(args, ...)` and hand the resulting `TaskCall` to any +`TaskResultCaller` / `TaskEnqueuer` surface. 
Treat `TaskCall` as the +explicit low-level transport object, not the normal happy path: ```dart file=/../packages/stem/example/docs_snippets/lib/tasks.dart#tasks-typed-definition @@ -48,6 +52,75 @@ Typed results flow through `TaskResult` when you call `Stem.waitForTask`, `Canvas.group`, `Canvas.chain`, or `Canvas.chord`. Supplying a custom `decode` callback on the task signature lets you deserialize complex objects before they reach application code. +Use `result.requiredValue()` when a completed task must have a decoded value +and you want a fail-fast read instead of manual nullable handling. +For low-level DTO waits through `Stem.waitForTask`, prefer +`decodeJson:` for plain DTOs or `decodeVersionedJson:` when the stored payload +persists an explicit schema version. +If you already have a raw `TaskStatus`, use `status.payloadJson(...)` or +`status.payloadAs(codec: ...)` to decode the whole payload DTO without a +separate cast/closure. Use `status.payloadVersionedJson(...)` when the stored +payload carries an explicit `__stemPayloadVersion`. If the whole task metadata +map is one DTO, use `status.metaJson(...)` or `status.metaAs(codec: ...)` +instead of manual `status.meta[...]` casts. +If you already have a raw `TaskResult`, use `result.payloadJson(...)` +or `result.payloadAs(codec: ...)` to decode the stored task result DTO +without another cast/closure. Use `result.payloadVersionedJson(...)` for the +same versioned DTO path on persisted task results. +If you are inspecting a low-level `TaskError`, use `error.metaJson(...)`, +`error.metaVersionedJson(...)`, or `error.metaAs(codec: ...)` instead of +manual `error.meta[...]` casts. + +If your manual task args are DTOs, prefer `TaskDefinition.json(...)` +when the type already has `toJson()`. Use `TaskDefinition.versionedJson(...)` +when the payload schema is expected to evolve and the published payload should +persist an explicit `__stemPayloadVersion`. 
Use `TaskDefinition.codec(...)` +when you need a custom `PayloadCodec`. Task args still need to encode to a +string-keyed map (typically `Map`) because they are published +as JSON-shaped data. For low-level name-based enqueue APIs, use +`enqueueVersionedJson(...)` for the same versioned DTO path. +If the args need a custom map encoder and still need an explicit stored schema +version, use `TaskDefinition.versionedMap(...)`. +If the args stay unversioned but the stored result carries an explicit schema +version, `TaskDefinition.json(...)` also accepts +`decodeResultVersionedJson:` plus `defaultDecodeVersion:`. + +For manual handlers, prefer the typed payload readers on the argument map +instead of repeating raw casts: + +```dart +final customerId = args.requiredValue('customerId'); +final tenant = args.valueOr('tenant', 'global'); +``` + +When the whole task arg payload is one DTO, prefer decoding it directly from +the execution context: + +```dart +final request = context.argsJson( + decode: InvoicePayload.fromJson, +); +``` + +Use `buildCall(...)` when you need an explicit low-level transport object and +provide the final headers, metadata, options, or scheduling overrides up +front. For the normal case, prefer direct `enqueue(...)` / +`enqueueAndWait(...)`. + +For tasks with no producer inputs, use `TaskDefinition.noArgs(...)` +instead. That gives you direct `enqueue(...)` / +`enqueueAndWait(...)` helpers without passing a fake empty map and the same +`waitFor(...)` decoding surface as normal typed definitions. + +If a no-arg task returns a DTO, prefer `TaskDefinition.noArgsJson(...)` when +the result already has `toJson()` and `Type.fromJson(...)`. Use +`TaskDefinition.noArgsVersionedJson(...)` when the stored result should carry +an explicit schema version, and `TaskDefinition.noArgsCodec(...)` only when +you need a custom payload codec. 
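The `toJson()` / `fromJson` convention those helpers rely on is plain Dart. A minimal sketch of a conforming DTO (field names are illustrative; `InvoicePayload` echoes the `argsJson` example above):

```dart
class InvoicePayload {
  InvoicePayload({required this.customerId, required this.amount});

  // Named fromJson constructor, as the DTO convention requires.
  factory InvoicePayload.fromJson(Map<String, dynamic> json) => InvoicePayload(
        customerId: json['customerId'] as String,
        amount: json['amount'] as num,
      );

  final String customerId;
  final num amount;

  // String-keyed map, so the payload publishes as JSON-shaped data.
  Map<String, dynamic> toJson() =>
      {'customerId': customerId, 'amount': amount};
}
```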
+ +For argful manual tasks, `TaskDefinition.versionedJson(...)` also accepts +`decodeResultVersionedJson:` when the stored result should carry an explicit +schema version. ## Configuring Retries @@ -79,20 +152,39 @@ every retry signal and shows how the strategy interacts with broker timings. - `context.heartbeat()` – extend the lease to avoid timeouts. - `context.extendLease(Duration by)` – request additional processing time. - `context.progress(percent, data: {...})` – emit progress signals for UI hooks. +- `context.progressJson(percent, dto)` – emit DTO progress payloads without + hand-built maps. +- `context.progressVersionedJson(percent, dto, version: n)` – emit DTO progress + payloads with an explicit persisted schema version. +- `context.retry(...)` – request an immediate retry with optional per-call + retry policy overrides. +- when you inspect a raw `ProgressSignal`, prefer + `signal.dataJson('key', ...)`, `signal.dataVersionedJson('key', ...)`, or + `signal.dataValue('key')` for keyed reads, or + `signal.payloadJson(...)`, `signal.payloadVersionedJson(...)`, and + `signal.payloadAs(codec: ...)` when the whole progress payload is one DTO. Use the context to build idempotent handlers. Re-enqueue work, cancel jobs, or store audit details in `context.meta`. +For handler inputs, prefer the typed arg helpers on the task context when +available: + +```dart +final customerId = context.requiredArg('customerId'); +final tenant = context.argOr('tenant', 'global'); +``` + See the `packages/stem/example/task_context_mixed` demo for a runnable sample that exercises inline + isolate enqueue, TaskRetryPolicy overrides, and enqueue options. -The `packages/stem/example/task_usage_patterns.dart` sample shows in-memory TaskContext and -TaskInvocationContext patterns without external dependencies. +The `packages/stem/example/task_usage_patterns.dart` sample shows in-memory +`TaskExecutionContext` patterns without external dependencies. 
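Putting several of those context helpers together, a hedged handler sketch (the task body and arg names are illustrative; the helper calls follow the snippets above):

```dart
Future<void> syncCustomer(TaskExecutionContext context) async {
  // Typed arg reads instead of raw casts.
  final customerId = context.requiredArg('customerId');
  final tenant = context.argOr('tenant', 'global');

  // Keep the lease alive before slow work.
  context.heartbeat();
  // ... do the idempotent work for customerId/tenant ...

  context.progress(1.0, data: {'customerId': customerId, 'tenant': tenant});
}
```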
### Enqueue from a running task -Use `TaskContext.enqueue`/`spawn` to schedule follow-up work with the same -defaults as `Stem.enqueue`. For isolate entrypoints, `TaskInvocationContext` -exposes the same API plus the fluent builder. +Use `TaskExecutionContext.enqueue`/`spawn` to schedule follow-up work with the +same defaults as `Stem.enqueue`. Concrete runtimes like `TaskContext` and +`TaskInvocationContext` expose the same API. ```dart file=/../packages/stem/example/docs_snippets/lib/tasks.dart#tasks-context-enqueue @@ -104,6 +196,18 @@ Inside isolate entrypoints: ``` +When a task runs inside a workflow-enabled runtime like `StemWorkflowApp`, +`TaskExecutionContext` also implements `WorkflowCaller`, so handlers and +isolate entrypoints can start or wait for +typed child workflows without dropping to raw workflow-name APIs. For manual +flows and scripts, prefer `childFlow.startAndWait(context)` or +`childWorkflowRef.startAndWait(context, params: value)` for the simple case. +Use a builder only when you need advanced overrides. + +That same shared task context also implements `WorkflowEventEmitter`, so tasks +can resume waiting workflows through `emitValue(...)` or typed `WorkflowEventRef` +instances when a workflow runtime is attached. 
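For example, a hedged sketch of that child-workflow path from inside a handler (the workflow ref and params echo the `userSignup` examples elsewhere in these docs):

```dart
Future<void> onSignup(TaskExecutionContext context) async {
  // TaskExecutionContext implements WorkflowCaller when a workflow
  // runtime such as StemWorkflowApp is attached.
  final result = await StemWorkflowDefinitions.userSignup.startAndWait(
    context,
    params: 'user@example.com',
  );
  context.progress(1.0, data: {'signupResult': '$result'});
}
```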
+ ### Retry from a running task Handlers can request a retry directly from the context: diff --git a/.site/docs/getting-started/developer-environment.md b/.site/docs/getting-started/developer-environment.md index 3e6a93d6..5328d30c 100644 --- a/.site/docs/getting-started/developer-environment.md +++ b/.site/docs/getting-started/developer-environment.md @@ -51,7 +51,7 @@ piece is easy to scan and reuse: ``` -### Create the Stem producer +### Create the shared client/producer ```dart title="lib/stem_bootstrap.dart" file=/../packages/stem/example/docs_snippets/lib/developer_environment.dart#dev-env-stem @@ -66,6 +66,11 @@ piece is easy to scan and reuse: Together, these steps give you access to routing, rate limiting, revoke storage, and queue configuration—all backed by Redis. +The recommended pattern here is to resolve a `StemStack` from the environment +once, build a shared `StemClient` from that stack, and then layer workers or +workflow apps on top. Manual broker/backend factory wiring is the fallback +path, not the default. + ## 3. Launch Workers, Beat, and Producers With the environment configured, run Stem components from separate terminals: @@ -115,7 +120,7 @@ pipelines and query progress from any process: ``` -Later, you can monitor status from any machine: +Later, you can monitor status from any machine with a lightweight client: ```dart file=/../packages/stem/example/docs_snippets/lib/developer_environment.dart#dev-env-status diff --git a/.site/docs/getting-started/first-steps.md b/.site/docs/getting-started/first-steps.md index fbcfae25..8749265a 100644 --- a/.site/docs/getting-started/first-steps.md +++ b/.site/docs/getting-started/first-steps.md @@ -6,8 +6,8 @@ slug: /getting-started/first-steps --- This walkthrough stays in-memory so you can learn the pipeline without running -external services. It defines a task, starts a worker, enqueues a message, then -verifies the result inside a single Dart process. +external services. 
It defines a task, bootstraps `StemApp`, enqueues a +message, then verifies the result inside a single Dart process. ## 1. Define a task handler @@ -19,7 +19,9 @@ Create a task handler (StemApp will register it for you): ## 2. Bootstrap the in-memory runtime -Use `StemApp` to create the broker, backend, and worker in memory: +Use `StemApp` to create the broker, backend, and worker in memory. The worker +lazy-starts on the first enqueue or wait call, so the common path does not need +an explicit `await app.start()`: ```dart file=/../packages/stem/example/docs_snippets/lib/first_steps.dart#first-steps-bootstrap diff --git a/.site/docs/getting-started/production-checklist.md b/.site/docs/getting-started/production-checklist.md index 405e2d74..1709d891 100644 --- a/.site/docs/getting-started/production-checklist.md +++ b/.site/docs/getting-started/production-checklist.md @@ -39,16 +39,16 @@ In code, wire the signer into both producers and workers: ``` - + ```dart title="lib/production_checklist.dart" file=/../packages/stem/example/docs_snippets/lib/production_checklist.dart#production-signing-registry ``` - + -```dart title="lib/production_checklist.dart" file=/../packages/stem/example/docs_snippets/lib/production_checklist.dart#production-signing-stem +```dart title="lib/production_checklist.dart" file=/../packages/stem/example/docs_snippets/lib/production_checklist.dart#production-signing-client ``` diff --git a/.site/docs/workers/programmatic-integration.md b/.site/docs/workers/programmatic-integration.md index c02036fd..c87116dd 100644 --- a/.site/docs/workers/programmatic-integration.md +++ b/.site/docs/workers/programmatic-integration.md @@ -9,6 +9,10 @@ Use Stem's Dart APIs to embed task production and processing inside your application services. This guide focuses on the two core roles: **producer** (enqueuer) and **worker**. +This page is intentionally about the lower-level embedding surface. 
If you want +the default happy path, prefer `StemClient`, `StemApp`, or `StemWorkflowApp` +and come here only when you need direct runtime composition. + import Tabs from '@theme/Tabs'; import TabItem from '@theme/TabItem'; @@ -40,7 +44,8 @@ import TabItem from '@theme/TabItem'; ### Tips -- Always reuse a `Stem` instance rather than creating one per request. +- Always reuse the producer runtime (`StemClient`, `StemApp`, or raw `Stem`) + rather than constructing one per request. - Use `TaskOptions` to set queue, retries, timeouts, and isolation. - Add custom metadata via the `meta` argument for observability or downstream processing. diff --git a/.site/docs/workers/worker-control.md b/.site/docs/workers/worker-control.md index 46169b8e..4e413198 100644 --- a/.site/docs/workers/worker-control.md +++ b/.site/docs/workers/worker-control.md @@ -59,6 +59,11 @@ stem worker resume --worker worker-a --queue default For a runnable lab that exercises ping/stats/revoke/shutdown against real workers, see `packages/stem/example/worker_control_lab` in the repository. +If you are working directly with low-level `ControlCommandMessage` and +`ControlReplyMessage` values in custom tooling, prefer +`payloadJson(...)` / `payloadVersionedJson(...)` and +`errorJson(...)` / `errorVersionedJson(...)` over manual map casts. 
+ ## Autoscaling Concurrency Workers can autoscale their isolate pools between configured minimum and diff --git a/.site/docs/workflows/annotated-workflows.md b/.site/docs/workflows/annotated-workflows.md index 64d6b77c..829cc09f 100644 --- a/.site/docs/workflows/annotated-workflows.md +++ b/.site/docs/workflows/annotated-workflows.md @@ -14,30 +14,62 @@ generated file exposes: - `StemWorkflowDefinitions` - `StemTaskDefinitions` - typed workflow refs like `StemWorkflowDefinitions.userSignup` -- typed enqueue helpers like `enqueueSendEmailTyped(...)` -- typed result wait helpers like `waitForSendEmailTyped(...)` +- typed task definitions whose advanced explicit transport path uses + `TaskCall` -Wire the bundle directly into `StemWorkflowApp`: +The generated task definitions are producer-safe: `Stem.enqueueCall(...)` +remains the explicit low-level transport path, and it can publish from the +definition metadata so producer processes do not need to register the worker +handler locally just to enqueue typed task calls. + +Wire the bundle through `StemClient`: ```dart -final workflowApp = await StemWorkflowApp.fromUrl( +final client = await StemClient.fromUrl( 'memory://', module: stemModule, ); +final workflowApp = await client.createWorkflowApp(); +``` + +With `module: stemModule`, the workflow app infers the worker subscription +from the workflow queue plus the default queues declared on the bundled task +handlers. Set `workerConfig.subscription` explicitly only when you need extra +queues beyond those defaults. 
+
+If you centralize broker/backend wiring in a `StemClient`, give the client the
+bundle once and then create workflow apps without repeating it:
+
+```dart
+final client = await StemClient.fromUrl('memory://', module: stemModule);
+final workflowApp = await client.createWorkflowApp();
+```
 
 Use the generated workflow refs when you want a single typed handle for start
 and wait operations:
 
 ```dart
-final result = await StemWorkflowDefinitions.userSignup
-    .call((email: 'user@example.com'))
-    .startAndWaitWithApp(workflowApp);
+final result = await StemWorkflowDefinitions.userSignup.startAndWait(
+  workflowApp,
+  params: 'user@example.com',
+);
 ```
 
-## Two script entry styles
+Annotated tasks use the same shared typed task surface:
 
-### Direct-call style
+```dart
+final result = await StemTaskDefinitions.sendEmailTyped.enqueueAndWait(
+  workflowApp,
+  EmailDispatch(
+    email: 'typed@example.com',
+    subject: 'Welcome',
+    body: 'Codec-backed DTO payloads',
+    tags: ['welcome'],
+  ),
+);
+```
+
+## Script context injection
 
 Use a plain `run(...)` when your annotated checkpoints only need serializable
 values or codec-backed DTO parameters:
 
@@ -64,30 +96,25 @@ class UserSignupWorkflow {
 
 The generator rewrites those calls into durable checkpoint boundaries in the
 generated proxy class.
 
-### Context-aware style
-
-Use `@WorkflowRun()` when you need to enter through `WorkflowScriptContext` so
-the checkpoint body can receive `WorkflowScriptStepContext`:
+When you need runtime metadata, add an optional named injected context
+parameter:
 
 ```dart
 @WorkflowDefn(name: 'annotated.context_script', kind: WorkflowKind.script)
 class AnnotatedContextScriptWorkflow {
-  @WorkflowRun()
   Future<Map<String, Object?>> run(
-    WorkflowScriptContext script,
     String email,
+    {WorkflowScriptContext? context}
   ) async {
-    return script.step<Map<String, Object?>>(
-      'enter-context-step',
-      (ctx) => captureContext(ctx, email),
-    );
+    return captureContext(email);
   }
 
   @WorkflowStep(name: 'capture-context')
   Future<Map<String, Object?>> captureContext(
-    WorkflowScriptStepContext ctx,
     String email,
+    {WorkflowScriptStepContext? context}
  ) async {
+    final ctx = context!;
     return {
       'workflow': ctx.workflow,
       'runId': ctx.runId,
@@ -98,11 +125,23 @@ class AnnotatedContextScriptWorkflow {
 }
 ```
 
-Context-aware checkpoint methods are not meant to be called directly from a
-plain `run(String ...)` signature. If a called step needs
-`WorkflowScriptStepContext`, enter it through `@WorkflowRun()` plus
-`WorkflowScriptContext`; plain direct-call style is for steps that consume only
-serializable business parameters.
+This keeps one authoring model:
+
+- plain direct method calls are still the default
+- context is added only when you need it
+- the injected context is not part of the durable payload shape
+
+When a workflow needs to start another workflow, do it from a durable boundary:
+
+- `WorkflowExecutionContext` implements `WorkflowCaller`, so prefer
+  `ref.startAndWait(context, params: value)` inside flow steps and checkpoint
+  methods
+- pass `ttl:`, `parentRunId:`, or `cancellationPolicy:` directly to
+  `ref.start(...)` / `ref.startAndWait(...)` for the normal override cases
+- for the rarer cases where you need an explicit low-level transport object,
+  use `ref.buildStart(...)`
+
+Avoid starting child workflows from the raw `WorkflowScriptContext` body.
## Runnable example @@ -110,23 +149,28 @@ Use `packages/stem/example/annotated_workflows` when you want a verified example that demonstrates: - `FlowContext` +- `WorkflowExecutionContext` - direct-call script checkpoints - nested annotated checkpoint calls - `WorkflowScriptContext` - `WorkflowScriptStepContext` -- `TaskInvocationContext` +- optional named context injection +- `TaskExecutionContext` - codec-backed DTO workflow checkpoints and final workflow results - typed task DTO input and result decoding +When you inspect run detail, the runtime now exposes `checkpoints` for script +workflows rather than reusing the flow-step view model. + ## DTO rules -Generated workflow/task entrypoints support required positional parameters that -are either: +Generated workflow/task entrypoints support required positional business +parameters that are either: - serializable values (`String`, numbers, bools, `List`, `Map`) - codec-backed DTO classes that provide: - - `Map toJson()` - - `factory Type.fromJson(Map json)` or an equivalent named + - a string-keyed `toJson()` map (typically `Map`) + - `factory Type.fromJson(Map json)` or an equivalent named `fromJson` constructor Typed task results can use the same DTO convention. diff --git a/.site/docs/workflows/context-and-serialization.md b/.site/docs/workflows/context-and-serialization.md index ee508c7d..e812d6c6 100644 --- a/.site/docs/workflows/context-and-serialization.md +++ b/.site/docs/workflows/context-and-serialization.md @@ -7,14 +7,23 @@ Everything else that crosses a durable boundary must be serializable. 
## Supported context injection points -- flow steps: `FlowContext` +- flow steps: `FlowContext` or `WorkflowExecutionContext` - script runs: `WorkflowScriptContext` -- script checkpoints: `WorkflowScriptStepContext` -- tasks: `TaskInvocationContext` +- script checkpoints: `WorkflowScriptStepContext` or + `WorkflowExecutionContext` +- tasks: `TaskExecutionContext` Those context objects are not part of the persisted payload shape. They are injected by the runtime when the handler executes. +For annotated workflows/tasks, the preferred shape is an optional named context +parameter: + +- `Future run(String email, {WorkflowScriptContext? context})` +- `Future checkpoint(String email, {WorkflowExecutionContext? context})` +- `Future step({WorkflowExecutionContext? context})` +- `Future task(String id, {TaskExecutionContext? context})` + ## What context gives you Depending on the context type, you can access: @@ -25,11 +34,66 @@ Depending on the context type, you can access: - `stepIndex` - `iteration` - workflow params and previous results +- `param()` / `requiredParam()` for typed access to workflow start + params +- `paramsAs(codec: ...)`, `paramsJson()`, or `paramsVersionedJson()` + for decoding the full workflow start payload as one DTO +- `paramJson()`, `paramVersionedJson()`, or + `requiredParamJson()` for nested DTO params without a separate codec + constant +- `paramListJson()`, `paramListVersionedJson()`, or + `requiredParamListJson()` for lists of nested DTO params without a + separate codec constant +- `previousValue()` / `requiredPreviousValue()` for typed access to the + prior step or checkpoint result +- `previousJson()`, `previousVersionedJson()`, + `requiredPreviousJson()`, or `requiredPreviousVersionedJson()` for + prior DTO results without a separate codec constant +- `sleepUntilResumed(...)` for common sleep/retry loops +- `waitForEventValue(...)` for common event waits +- `waitForEventValueJson(...)` or + `waitForEventValueVersionedJson(...)` for DTO 
event waits without a + separate codec constant +- `event.awaitOn(step)` when a flow deliberately wants the lower-level + `FlowStepControl` suspend-first path on a typed event ref +- `sleepJson(...)`, `sleepVersionedJson(...)`, `awaitEventJson(...)`, + `awaitEventVersionedJson(...)`, and `FlowStepControl.awaitTopicJson(...)` + when lower-level suspension directives still need DTO metadata without a + separate codec constant +- `control.dataJson(...)`, `control.dataVersionedJson(...)`, or + `control.dataAs(codec: ...)` when you inspect a lower-level + `FlowStepControl` directly - `takeResumeData()` for event-driven resumes - `takeResumeValue(codec: ...)` for typed event-driven resumes +- `takeResumeJson(...)` or `takeResumeVersionedJson(...)` for DTO + event-driven resumes without a separate codec constant +- for read-side `...VersionedJson(...)` helpers, `defaultVersion:` is only the + fallback used when an older stored payload does not already carry + `__stemPayloadVersion` - `idempotencyKey(...)` +- direct child-workflow start helpers such as + `ref.start(context, params: value)` and + `ref.startAndWait(context, params: value)` +- direct task enqueue APIs because `WorkflowExecutionContext` and + `TaskExecutionContext` both implement `TaskEnqueuer` +- `argsAs(codec: ...)`, `argsJson()`, or `argsVersionedJson()` for + decoding the full task-arg payload as one DTO inside manual task handlers +- `argJson()`, `argVersionedJson()`, `argListJson()`, or + `argListVersionedJson()` when only one nested arg entry needs DTO decode - task metadata like `id`, `attempt`, `meta` +Child workflow starts belong in durable boundaries: + +- `ref.start(context, params: value)` inside flow steps +- `ref.startAndWait(context, params: value)` inside script checkpoints +- pass `ttl:`, `parentRunId:`, or `cancellationPolicy:` directly to those + helpers for the normal override cases +- keep `ref.buildStart(...)` for the rarer cases where you explicitly want a + reusable `WorkflowStartCall` 
built with its final overrides + +Do not treat the raw `WorkflowScriptContext` body as a safe place for child +starts or other replay-sensitive side effects. + ## Serializable parameter rules Supported shapes: @@ -60,9 +124,9 @@ class OrderRequest { final String id; final String customerId; - Map toJson() => {'id': id, 'customerId': customerId}; + Map toJson() => {'id': id, 'customerId': customerId}; - factory OrderRequest.fromJson(Map json) { + factory OrderRequest.fromJson(Map json) { return OrderRequest( id: json['id'] as String, customerId: json['customerId'] as String, @@ -78,14 +142,77 @@ lowers into workflow/task definitions. The same rule applies to workflow resume events: `emitValue(...)` can take a typed DTO plus a `PayloadCodec`, but the codec must still encode to a -`Map` because watcher persistence and event delivery are -map-based today. +string-keyed map because watcher persistence and event delivery are map-based +today. + +For normal DTOs that expose `toJson()` and `Type.fromJson(...)`, prefer +`PayloadCodec.json(...)`. Drop down to `PayloadCodec.map(...)` when you +need a custom map encoder or a nonstandard decode function. + +If the DTO payload shape is expected to evolve, use +`PayloadCodec.versionedJson(...)`. That persists a reserved +`__stemPayloadVersion` field beside the JSON payload and gives the decoder the +stored version so it can read older shapes explicitly. + +When a DTO evolves through multiple persisted shapes, prefer +`PayloadVersionRegistry` with `PayloadCodec.versionedJsonRegistry(...)` +so version-specific decoders live in one reusable registry instead of being +repeated inline at every call site. + +Use `PayloadCodec.versionedMap(...)` instead when the payload still needs a +custom map encoder or a nonstandard version-aware decode function. +`PayloadCodec.versionedMapRegistry(...)` gives the same reusable-registry +shape for that case. 
+ +The same registry-backed model is available on the higher-level authoring +factories too: +- `TaskDefinition.versionedJsonRegistry(...)` +- `TaskDefinition.versionedMapRegistry(...)` +- `WorkflowRef.versionedJsonRegistry(...)` +- `WorkflowRef.versionedMapRegistry(...)` +- `WorkflowEventRef.versionedJsonRegistry(...)` +- `WorkflowEventRef.versionedMapRegistry(...)` +- `Flow.versionedJsonRegistry(...)` / `Flow.versionedMapRegistry(...)` +- `WorkflowScript.versionedJsonRegistry(...)` / + `WorkflowScript.versionedMapRegistry(...)` + +For manual flows and scripts, prefer the typed workflow param helpers before +dropping to raw map casts: + +```dart +final request = ctx.paramsJson( + decode: OrderRequest.fromJson, +); +final userId = ctx.requiredParam('userId'); +final draft = ctx.requiredParam( + 'draft', + codec: approvalDraftCodec, +); +``` + +For manual tasks, the same pattern applies to the full arg payload: + +```dart +final request = context.argsJson( + decode: OrderRequest.fromJson, +); +``` ## Practical rule -When you need context metadata, add the appropriate context parameter first. -When you need business input, make it a required positional serializable value -after the context parameter. +When you need context metadata, add the appropriate optional named context +parameter. When you need business input, make it a required positional +serializable value. + +Prefer the higher-level helpers first: + +- `sleepUntilResumed(...)` when the step/checkpoint should pause once and + continue on resume +- `waitForEventValue(...)` when the step/checkpoint is waiting on one event + +Drop down to `takeResumeData()`, `takeResumeValue(...)`, +`takeResumeJson(...)`, or `takeResumeVersionedJson(...)` only when you +need custom branching around resume payloads. The runnable `annotated_workflows` example demonstrates both the context-aware and plain serializable forms. 
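As a concrete illustration of the DTO rules above, here is a minimal plain-Dart sketch of the convention: a string-keyed `toJson()` plus a `fromJson` factory, with an explicit version marker written beside the payload in the same spirit as the reserved `__stemPayloadVersion` field. The `encodeVersioned` / `decodeVersioned` helpers and the `__version` key are hypothetical illustrations, not Stem APIs; only `OrderRequest` comes from the docs.

```dart
/// DTO following the documented convention: string-keyed `toJson()` plus a
/// `fromJson` factory, so a JSON payload codec can round-trip it.
class OrderRequest {
  const OrderRequest({required this.id, required this.customerId});

  final String id;
  final String customerId;

  Map<String, Object?> toJson() => {'id': id, 'customerId': customerId};

  factory OrderRequest.fromJson(Map<String, Object?> json) {
    return OrderRequest(
      id: json['id'] as String,
      customerId: json['customerId'] as String,
    );
  }
}

/// Hypothetical helper: persist a version marker beside the JSON payload so
/// later decoders can tell which shape they are reading.
Map<String, Object?> encodeVersioned(OrderRequest dto, {int version = 2}) =>
    {...dto.toJson(), '__version': version};

/// Hypothetical helper: fall back to [defaultVersion] for older payloads
/// stored before the marker existed, mirroring the `defaultVersion:` fallback
/// described for the read-side `...VersionedJson(...)` helpers.
OrderRequest decodeVersioned(
  Map<String, Object?> json, {
  int defaultVersion = 1,
}) {
  final version = (json['__version'] as int?) ?? defaultVersion;
  // Version-specific decode branches would go here; in this sketch both
  // versions share the same field layout.
  assert(version >= 1);
  return OrderRequest.fromJson(json);
}
```

The point of the sketch is only the shape: the version travels with the stored map, and the decoder receives it explicitly instead of guessing.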
diff --git a/.site/docs/workflows/errors-retries-and-idempotency.md b/.site/docs/workflows/errors-retries-and-idempotency.md index 27bcde75..b150caab 100644 --- a/.site/docs/workflows/errors-retries-and-idempotency.md +++ b/.site/docs/workflows/errors-retries-and-idempotency.md @@ -12,6 +12,10 @@ the runtime after resume, and the step body must tolerate replay. Use: +- `await ctx.sleepFor(duration: ...)` for the expression-style sleep path +- `await ctx.waitForEvent(topic: ...)` for the expression-style event path +- `sleepUntilResumed(...)` for simple sleep/replay loops +- `waitForEventValue(...)` for one-event suspension points - `takeResumeData()` to branch on fresh resume payloads - `idempotencyKey(...)` when a step talks to an external side-effecting system - persisted previous results instead of in-memory state diff --git a/.site/docs/workflows/flows-and-scripts.md b/.site/docs/workflows/flows-and-scripts.md index 018be993..b922a1cc 100644 --- a/.site/docs/workflows/flows-and-scripts.md +++ b/.site/docs/workflows/flows-and-scripts.md @@ -27,6 +27,20 @@ is that for script workflows those are **checkpoints**, not the plan itself. ``` +Manual flows can also derive a typed workflow ref from the definition: + +```dart +final approvalsRef = approvalsFlow.ref>( + encodeParams: (draft) => {'draft': draft}, +); +``` + +When a flow has no start params, start directly from the flow itself with +`flow.start(...)` or `flow.startAndWait(...)`. Keep +`flow.ref0().asRef.buildStart(params: ())` for the rarer cases where you want to assemble +or adjust overrides before dispatch. +Use `ref0()` only when another API specifically needs a `NoArgsWorkflowRef`. 
+ Use `Flow` when: - the sequence of durable actions should be obvious from the definition @@ -39,6 +53,20 @@ Use `Flow` when: ``` +Manual scripts support the same pattern: + +```dart +final retryRef = retryScript.ref>( + encodeParams: (params) => params, +); +``` + +When a script has no start params, start directly from the script itself with +`retryScript.start(...)` or `retryScript.startAndWait(...)`. Keep +`retryScript.ref0().asRef.buildStart(params: ())` for the rarer cases where you want to +assemble or adjust overrides before dispatch. Use `ref0()` only when another API +specifically needs a `NoArgsWorkflowRef`. + Use `WorkflowScript` when: - you want normal Dart control flow to define the run diff --git a/.site/docs/workflows/getting-started.md b/.site/docs/workflows/getting-started.md index b83628d3..d28e1d65 100644 --- a/.site/docs/workflows/getting-started.md +++ b/.site/docs/workflows/getting-started.md @@ -14,6 +14,11 @@ This is the quickest path to a working durable workflow in Stem. Pass normal task handlers through `tasks:` if the workflow also needs to enqueue regular Stem tasks. +If you need separate workflow lanes, pass `continuationQueue:` and +`executionQueue:` into `client.createWorkflowApp(...)`. When the app is +creating the managed worker for you, those queue names are inferred into the +worker subscription automatically. + ## 2. Start the managed worker ```dart title="bin/workflows.dart" file=/../packages/stem/example/docs_snippets/lib/workflows.dart#workflows-app-start @@ -24,9 +29,13 @@ enqueue regular Stem tasks. The managed worker subscribes to the workflow orchestration queue, so you do not need to manually register the internal `stem.workflow.run` task. -If you prefer a minimal example, `startWorkflow(...)` also lazy-starts the -runtime and managed worker on first use. Explicit `start()` is still the better -choice when you want deterministic application lifecycle control. 
+If you prefer a minimal example, `startWorkflow(...)`, +`startWorkflowValue(...)`, and `startWorkflowJson(...)` also lazy-start the +runtime and managed worker on first use. Explicit `start()` is still the +better choice when you want deterministic application lifecycle control. Use +those name-based APIs when workflow names come from config or external input. +For workflows you define in code, prefer direct workflow helpers or generated +workflow refs. ## 3. Start a run and wait for the result @@ -50,7 +59,24 @@ Use `StemClient` when one service wants to own broker, backend, and workflow setup in one place. The clean path there is `client.createWorkflowApp(...)`. If your service already owns a `StemApp`, layer workflows on top of it with -`StemWorkflowApp.create(stemApp: ..., flows: ..., scripts: ..., tasks: ...)`. +`stemApp.createWorkflowApp(...)`. That path reuses the current worker, so the +underlying app must already subscribe to the workflow queue plus the task +queues your workflows need. + +For late registration, use the app helpers instead of reaching through the +runtime registry: + +- `registerWorkflow(...)` / `registerWorkflows(...)` +- `registerFlow(...)` / `registerFlows(...)` +- `registerScript(...)` / `registerScripts(...)` +- `registerModule(...)` / `registerModules(...)` + +If you are registering raw `WorkflowDefinition` values directly, prefer +`WorkflowDefinition.flowJson(...)` / `.scriptJson(...)` for the common DTO +path, `WorkflowDefinition.flowVersionedJson(...)` / +`.scriptVersionedJson(...)` when the stored result should carry an explicit +schema version, and `WorkflowDefinition.flowCodec(...)` / `.scriptCodec(...)` +when the result needs a custom codec. ## 5. 
Move to the right next page diff --git a/.site/docs/workflows/index.md b/.site/docs/workflows/index.md index ea4c9fc5..811554ac 100644 --- a/.site/docs/workflows/index.md +++ b/.site/docs/workflows/index.md @@ -75,6 +75,7 @@ final workflowApp = await client.createWorkflowApp( ); ``` -If your service already owns a `StemApp`, reuse it directly with -`StemWorkflowApp.create(stemApp: ..., flows: ..., scripts: ..., tasks: ...)` -rather than bootstrapping a second broker/backend/task boundary. +If your service already owns a `StemApp`, layer workflows on top of it with +`stemApp.createWorkflowApp(...)`. That path reuses the existing worker, so the +app must already subscribe to the workflow queue plus any task queues the +workflows need. diff --git a/.site/docs/workflows/starting-and-waiting.md b/.site/docs/workflows/starting-and-waiting.md index 67ec8862..706b3cfb 100644 --- a/.site/docs/workflows/starting-and-waiting.md +++ b/.site/docs/workflows/starting-and-waiting.md @@ -3,7 +3,7 @@ title: Starting and Waiting --- Workflow runs are started through the runtime, through `StemWorkflowApp`, or -through generated workflow refs. +through typed workflow refs. ## Start by workflow name @@ -15,14 +15,133 @@ Use `params:` to supply workflow input and `WorkflowCancellationPolicy` to cap wall-clock runtime or maximum suspension time. +That low-level API is useful when workflow names come from config or external +input. For workflows you define in code, prefer a typed workflow ref instead. 
+ +## Start through manual workflow refs + +Manual `Flow(...)` and `WorkflowScript(...)` definitions can derive a typed ref +without repeating the workflow-name string: + +```dart +const approvalDraftCodec = PayloadCodec.json( + decode: ApprovalDraft.fromJson, + typeName: 'ApprovalDraft', +); + +final approvalsRef = approvalsFlow.refCodec( + paramsCodec: approvalDraftCodec, +); + +final runId = await approvalsRef.start( + workflowApp, + params: const ApprovalDraft(documentId: 'doc-42'), +); + +final result = await approvalsRef.waitFor(workflowApp, runId); +``` + +Use this path when you want the same typed start/wait surface as generated +workflow refs, but the workflow itself is still hand-written. + +When you want to add advanced start options, keep using the direct typed ref +helpers: + +```dart +final runId = await approvalsRef.start( + workflowApp, + params: const ApprovalDraft(documentId: 'doc-42'), + parentRunId: 'parent-run', + ttl: const Duration(hours: 1), + cancellationPolicy: const WorkflowCancellationPolicy( + maxRuntime: Duration(minutes: 10), + ), +); +``` + +`refJson(...)` is the shortest manual DTO path when the params already have +`toJson()`, or when the final result also needs a `Type.fromJson(...)` +decoder. Use `refVersionedJson(...)` when the persisted start payload should +carry an explicit `__stemPayloadVersion`. Use `refCodec(...)` when you need a +custom `PayloadCodec`. Workflow params still need to encode to a +string-keyed map (typically `Map`) because they are stored as +JSON-shaped data. +If the params need a custom map encoder and still need an explicit stored +schema version, use `refVersionedMap(...)` / `WorkflowRef.versionedMap(...)`. + +Inside manual flow steps and script checkpoints, prefer +`ctx.param()` / `ctx.requiredParam()` for workflow start params and +`ctx.previousValue()` / `ctx.requiredPreviousValue()` over repeating raw +`previousResult as ...` casts. 
+
+If the params stay unversioned but the stored result carries an explicit schema
+version, `refJson(...)` / `WorkflowRef.json(...)` and `refVersionedJson(...)` /
+`WorkflowRef.versionedJson(...)` also accept `decodeResultVersionedJson:` plus
+`defaultDecodeVersion:`.
+
+If a manual flow or script returns a DTO, pick the factory by how the result is
+stored:
+
+- `Flow.json(...)` / `WorkflowScript.json(...)` for the common
+  `toJson()` / `Type.fromJson(...)` case
+- `Flow.versionedJson(...)` / `WorkflowScript.versionedJson(...)` when the
+  stored result should carry an explicit schema version
+- `Flow.codec(...)` / `WorkflowScript.codec(...)` when the result needs a
+  custom payload codec
+- `Flow.versionedMap(...)` / `WorkflowScript.versionedMap(...)` when the
+  result needs a custom map encoder plus an explicit stored schema version
+
+When the persisted workflow result or suspension payload carries an explicit
+`__stemPayloadVersion`, use `workflowResult.payloadVersionedJson(...)`,
+`runState.resultVersionedJson(...)`, or
+`runState.suspensionPayloadVersionedJson(...)` on the low-level snapshots.
+Those read-side helpers take `defaultVersion:` as the fallback for older
+payloads that do not yet carry a stored version marker.
+
+For workflows without start params, start directly from the flow or script
+itself with `start(...)` or `startAndWait(...)`. When you need an explicit
+low-level transport object for `startWorkflowCall(...)`, build it with
+`ref0().asRef.buildStart(params: ())`; treat `WorkflowStartCall` as the
+explicit low-level transport object, not the normal happy path, and use
+`ref0()` only when another API specifically needs a `NoArgsWorkflowRef`.
+
+When you need an explicit start request, prefer `ref.buildStart(...)` with the
+final overrides you already know.
+
 ## Wait for completion
 
-`waitForCompletion` polls the store until the run finishes or the caller
-times out.
+For workflows defined in code, prefer direct workflow helpers or typed refs
+like `ordersFlow.startAndWait(...)` and
+`StemWorkflowDefinitions.orders.startAndWait(...)`.
+
+`waitForCompletion` is the low-level completion API for name-based runs. It
+polls the store until the run finishes or the caller times out. For DTO
+results, prefer `decodeJson:` for plain DTOs or `decodeVersionedJson:` when
+the persisted payload carries an explicit schema version.
+
+If you already have a raw `WorkflowResult`, use `result.payloadJson(...)` or
+`result.payloadAs(codec: ...)` to decode the stored workflow result without
+another cast/closure.
+
+The same convenience surface exists on the lower-level snapshots, so prefer it
+over manual raw-map casts:
+
+- on a raw `RunState`, use `state.paramsJson(...)`,
+  `state.paramsAs(codec: ...)`, `state.resultJson(...)`,
+  `state.resultAs(codec: ...)`, `state.resultVersionedJson(...)`,
+  `state.suspensionPayloadJson(...)`,
+  `state.suspensionPayloadVersionedJson(...)`, `state.lastErrorJson(...)`,
+  `state.runtimeJson(...)`, `state.cancellationDataJson(...)`, or
+  `state.suspensionPayloadAs(codec: ...)`
+- workflow run detail views expose the same helpers via
+  `runView.paramsJson(...)`, `runView.paramsAs(codec: ...)`,
+  `runView.resultJson(...)`, `runView.resultAs(codec: ...)`,
+  `runView.resultVersionedJson(...)`, `runView.suspensionPayloadJson(...)`,
+  `runView.suspensionPayloadVersionedJson(...)`,
+  `runView.lastErrorJson(...)`, `runView.runtimeJson(...)`, and
+  `runView.suspensionPayloadAs(codec: ...)`
+- checkpoint entries from `viewCheckpoints(...)` and
+  `WorkflowCheckpointView.fromEntry(...)` expose `entry.valueJson(...)`,
+  `entry.valueVersionedJson(...)`, and `entry.valueAs(codec: ...)`
Use the returned `WorkflowResult` when you need: - `value` for a completed run +- `requiredValue()` when completion is already guaranteed and you want a + fail-fast typed read - `status` for partial progress - `timedOut` to decide whether to keep polling @@ -32,20 +151,45 @@ When you use `stem_builder`, generated workflow refs remove the raw workflow-name strings and give you one typed handle for both start and wait: ```dart -final result = await StemWorkflowDefinitions.userSignup - .call((email: 'user@example.com')) - .startAndWaitWithApp(workflowApp); +final result = await StemWorkflowDefinitions.userSignup.startAndWait( + workflowApp, + params: 'user@example.com', +); ``` -The same definitions work on `WorkflowRuntime` through -`.startWithRuntime(runtime)`. +The same definitions work on `WorkflowRuntime` by passing the runtime as the +`WorkflowCaller`: + +```dart +final runId = await StemWorkflowDefinitions.userSignup.start( + runtime, + params: 'user@example.com', +); +``` + +When you already have a `WorkflowCaller` like `FlowContext`, +`WorkflowScriptStepContext`, `WorkflowRuntime`, or `StemWorkflowApp`, prefer +the direct typed ref helpers, even when you need start overrides: + +```dart +final result = await StemWorkflowDefinitions.userSignup.startAndWait( + context, + params: 'user@example.com', + ttl: const Duration(hours: 1), + timeout: const Duration(seconds: 5), +); +``` If you still need the run identifier for inspection or operator tooling, read it from `result.runId`. +Keep `ref.buildStart(...)` for the rarer cases where you need to assemble or +adjust an explicit start request before dispatch. 
+ ## Parent runs and TTL -`WorkflowRuntime.startWorkflow(...)` also supports: +`WorkflowRuntime.startWorkflow(...)`, `startWorkflowValue(...)`, +`startWorkflowJson(...)`, and `startWorkflowVersionedJson(...)` also support: - `parentRunId` when one workflow needs to track provenance from another run - `ttl` when you want run metadata to expire after a bounded retention period diff --git a/.site/docs/workflows/suspensions-and-events.md b/.site/docs/workflows/suspensions-and-events.md index f3cd9e70..943e5fd4 100644 --- a/.site/docs/workflows/suspensions-and-events.md +++ b/.site/docs/workflows/suspensions-and-events.md @@ -12,10 +12,21 @@ different worker. periodically scans due runs and re-enqueues the internal workflow task when the sleep expires. +For the common "sleep once, continue on resume" case, prefer the higher-level +helper: + +```dart +await ctx.sleepFor(duration: const Duration(milliseconds: 200)); +``` + ## Await external events `awaitEvent(topic, deadline: ...)` records a durable watcher. External code can resume those runs through the runtime API by emitting a payload for the topic. +When you inspect watcher entries directly, use `watcher.dataJson(...)` or +`watcher.dataAs(codec: ...)` when the full watcher metadata maps to one DTO. +If only the nested watcher payload is a DTO, use `watcher.payloadJson(...)` or +`watcher.payloadAs(codec: ...)` instead of manual raw-map casts. Typical flow: @@ -25,24 +36,50 @@ Typical flow: `WorkflowRuntime.emitValue(...)` (or an app/service wrapper around it) with a payload 4. 
the runtime resumes the run and exposes the payload through - `takeResumeData()` or `takeResumeValue(codec: ...)` + `waitForEvent(...)`, `event.wait(ctx)`, or the lower-level + `takeResumeData()` / `takeResumeValue(codec: ...)` + +For the common "wait for one event and continue" case, prefer: + +```dart +final payload = await ctx.waitForEventJson( + topic: 'orders.payment.confirmed', + decode: PaymentConfirmed.fromJson, +); +``` ## Emit resume events -Use `WorkflowRuntime.emit(...)` / `WorkflowRuntime.emitValue(...)` (or the app -wrapper `workflowApp.emitValue(...)`) instead of hand-editing store state: +Use `WorkflowRuntime.emit(...)` / `WorkflowRuntime.emitJson(...)` / +`WorkflowRuntime.emitVersionedJson(...)` / `WorkflowRuntime.emitValue(...)` +(or the app wrappers `workflowApp.emitJson(...)` / +`workflowApp.emitVersionedJson(...)` / `workflowApp.emitValue(...)`) instead +of hand-editing store state: ```dart -await workflowApp.emitValue( +await workflowApp.emitJson( 'orders.payment.confirmed', const PaymentConfirmed(paymentId: 'pay_42', approvedBy: 'gateway'), - codec: paymentConfirmedCodec, ); ``` -Typed event payloads still serialize to the existing `Map` -wire format. `emitValue(...)` is a DTO/codec convenience layer, not a new -transport shape. +Typed event payloads still serialize to a string-keyed JSON-like map. +`emitJson(...)`, `emitVersionedJson(...)`, and `emitValue(...)` are +DTO/codec convenience layers, not a new transport shape. + +When the topic and codec travel together in your codebase, prefer +`WorkflowEventRef.json(...)` for normal DTO payloads, +`WorkflowEventRef.versionedJson(...)` when the payload schema should carry +an explicit `__stemPayloadVersion`, `WorkflowEventRef.versionedMap(...)` +when the payload needs a custom map encoder plus stored schema version, and +keep `event.emit(emitter, dto)` as the happy path. +Pair that with `await event.wait(ctx)`. 
If you are writing a flow and +deliberately want the lower-level `FlowStepControl` path, use +`event.awaitOn(step)` instead of dropping back to a raw topic string. +For low-level sleep/event directives that still need DTO metadata, use +`step.sleepJson(...)`, `step.sleepVersionedJson(...)`, +`step.awaitEventJson(...)`, `step.awaitEventVersionedJson(...)`, or +`FlowStepControl.awaitTopicJson(...)` instead of hand-built maps. ## Inspect waiting runs diff --git a/.site/docs/workflows/troubleshooting.md b/.site/docs/workflows/troubleshooting.md index 60769fb2..e65c52ec 100644 --- a/.site/docs/workflows/troubleshooting.md +++ b/.site/docs/workflows/troubleshooting.md @@ -25,12 +25,12 @@ Check: - the topic passed to `WorkflowRuntime.emit(...)` / `emitValue(...)` or `workflowApp.emitValue(...)` matches the one passed to `awaitEvent(...)` - the run is still waiting on that topic -- the payload encodes to a `Map<String, Object?>` +- the payload encodes to a string-keyed map such as `Map<String, Object?>` ## Serialization failures Do not pass arbitrary Dart objects across workflow or task boundaries. Encode -domain objects as `Map<String, Object?>` or `List<Object?>` first. +domain objects as string-keyed JSON-like maps or lists first. ## Logs only show `stem.workflow.run` diff --git a/packages/dashboard/lib/src/server.dart b/packages/dashboard/lib/src/server.dart index 6d66b1f6..3953f933 100644 --- a/packages/dashboard/lib/src/server.dart +++ b/packages/dashboard/lib/src/server.dart @@ -550,9 +550,10 @@ Future _renderPage( final workflowRun = page == DashboardPage.taskDetail && runId != null ? await service.fetchWorkflowRun(runId) : null; - final workflowSteps = page == DashboardPage.taskDetail && runId != null - ? await service.fetchWorkflowSteps(runId) - : const []; + final workflowCheckpoints = + page == DashboardPage.taskDetail && runId != null + ?
await service.fetchWorkflowCheckpoints(runId) + : const []; final content = buildPageContent( page: page, @@ -562,7 +563,7 @@ Future _renderPage( taskDetail: taskDetail, runTimeline: runTimeline, workflowRun: workflowRun, - workflowSteps: workflowSteps, + workflowCheckpoints: workflowCheckpoints, auditEntries: page == DashboardPage.search || page == DashboardPage.audit ? state.auditEntries : const [], diff --git a/packages/dashboard/lib/src/services/models.dart b/packages/dashboard/lib/src/services/models.dart index 28c888bb..a73a0bf6 100644 --- a/packages/dashboard/lib/src/services/models.dart +++ b/packages/dashboard/lib/src/services/models.dart @@ -514,7 +514,7 @@ class DashboardWorkflowRunSummary { const DashboardWorkflowRunSummary({ required this.runId, required this.workflowName, - required this.lastStep, + required this.lastCheckpoint, required this.total, required this.queued, required this.running, @@ -530,8 +530,8 @@ class DashboardWorkflowRunSummary { /// Workflow name, when available. final String workflowName; - /// Most recent step marker, when available. - final String? lastStep; + /// Most recent checkpoint marker, when available. + final String? lastCheckpoint; /// Total sampled statuses for this run. final int total; @@ -762,9 +762,9 @@ class DashboardWorkflowRunSnapshot { } /// Projection of a persisted workflow checkpoint. -class DashboardWorkflowStepSnapshot { +class DashboardWorkflowCheckpointSnapshot { /// Creates a workflow checkpoint snapshot. - const DashboardWorkflowStepSnapshot({ + const DashboardWorkflowCheckpointSnapshot({ required this.name, required this.position, required this.value, @@ -772,8 +772,10 @@ class DashboardWorkflowStepSnapshot { }); /// Builds a workflow checkpoint snapshot from [WorkflowStepEntry]. 
- factory DashboardWorkflowStepSnapshot.fromEntry(WorkflowStepEntry entry) { - return DashboardWorkflowStepSnapshot( + factory DashboardWorkflowCheckpointSnapshot.fromEntry( + WorkflowStepEntry entry, + ) { + return DashboardWorkflowCheckpointSnapshot( name: entry.name, position: entry.position, value: entry.value, @@ -906,7 +908,7 @@ class _DashboardWorkflowSummaryBuilder { final String runId; String _workflowName = 'workflow'; - String? _lastStep; + String? _lastCheckpoint; var _total = 0; var _queued = 0; var _running = 0; @@ -921,7 +923,7 @@ class _DashboardWorkflowSummaryBuilder { _workflowName = task.workflowName!; } if (task.workflowStep != null && task.workflowStep!.isNotEmpty) { - _lastStep = task.workflowStep; + _lastCheckpoint = task.workflowStep; } if (task.state == TaskState.queued || task.state == TaskState.retried) { _queued += 1; @@ -939,7 +941,7 @@ class _DashboardWorkflowSummaryBuilder { return DashboardWorkflowRunSummary( runId: runId, workflowName: _workflowName, - lastStep: _lastStep, + lastCheckpoint: _lastCheckpoint, total: _total, queued: _queued, running: _running, diff --git a/packages/dashboard/lib/src/services/stem_service.dart b/packages/dashboard/lib/src/services/stem_service.dart index b90ded04..08d584cf 100644 --- a/packages/dashboard/lib/src/services/stem_service.dart +++ b/packages/dashboard/lib/src/services/stem_service.dart @@ -38,7 +38,8 @@ abstract class DashboardDataSource { Future fetchWorkflowRun(String runId); /// Fetches persisted workflow checkpoints, if a workflow store is available. - Future> fetchWorkflowSteps(String runId); + Future> + fetchWorkflowCheckpoints(String runId); /// Enqueues a task request through the backing broker. 
Future enqueueTask(EnqueueRequest request); @@ -285,7 +286,7 @@ class StemDashboardService implements DashboardDataSource { } @override - Future<List<DashboardWorkflowStepSnapshot>> fetchWorkflowSteps( + Future<List<DashboardWorkflowCheckpointSnapshot>> fetchWorkflowCheckpoints( String runId, ) async { final store = _workflowStore; @@ -297,7 +298,7 @@ try { final steps = await store.listSteps(trimmed); return steps - .map(DashboardWorkflowStepSnapshot.fromEntry) + .map(DashboardWorkflowCheckpointSnapshot.fromEntry) .toList(growable: false) ..sort((a, b) => a.position.compareTo(b.position)); } on Object { diff --git a/packages/dashboard/lib/src/ui/content.dart b/packages/dashboard/lib/src/ui/content.dart index b88cc745..05ecfa18 100644 --- a/packages/dashboard/lib/src/ui/content.dart +++ b/packages/dashboard/lib/src/ui/content.dart @@ -24,7 +24,7 @@ String buildPageContent({ DashboardTaskStatusEntry? taskDetail, List runTimeline = const [], DashboardWorkflowRunSnapshot? workflowRun, - List<DashboardWorkflowStepSnapshot> workflowSteps = const [], + List<DashboardWorkflowCheckpointSnapshot> workflowCheckpoints = const [], List auditEntries = const [], DashboardThroughput? throughput, List events = const [], @@ -53,7 +53,7 @@ taskDetail, runTimeline, workflowRun, - workflowSteps, + workflowCheckpoints, ); case DashboardPage.failures: return buildFailuresContent(taskStatuses, failuresOptions); diff --git a/packages/dashboard/lib/src/ui/overview.dart b/packages/dashboard/lib/src/ui/overview.dart index 5aa140ff..86cf62a2 100644 --- a/packages/dashboard/lib/src/ui/overview.dart +++ b/packages/dashboard/lib/src/ui/overview.dart @@ -238,7 +238,7 @@ OverviewSections buildOverviewSections( ${escapeHtml(run.runId)} ${escapeHtml(run.workflowName)} - ${escapeHtml(run.lastStep ?? '—')} + ${escapeHtml(run.lastCheckpoint ??
'—')} ${formatInt(run.queued)} ${formatInt(run.running)} ${formatInt(run.succeeded)} diff --git a/packages/dashboard/lib/src/ui/task_detail.dart b/packages/dashboard/lib/src/ui/task_detail.dart index e3d9f3c0..f641fb99 100644 --- a/packages/dashboard/lib/src/ui/task_detail.dart +++ b/packages/dashboard/lib/src/ui/task_detail.dart @@ -12,7 +12,7 @@ String buildTaskDetailContent( DashboardTaskStatusEntry? task, List runTimeline, DashboardWorkflowRunSnapshot? workflowRun, - List workflowSteps, + List workflowCheckpoints, ) { if (task == null) { return ''' @@ -68,7 +68,7 @@ String buildTaskDetailContent( Updated${formatDateTime(task.updatedAt)} Run ID${task.runId == null ? '' : '${escapeHtml(task.runId!)}'} Workflow${task.workflowName == null ? '' : escapeHtml(task.workflowName!)} - Workflow Step${task.workflowStep == null ? '' : escapeHtml(task.workflowStep!)} + Workflow Checkpoint${task.workflowStep == null ? '' : escapeHtml(task.workflowStep!)} @@ -113,14 +113,14 @@ String buildTaskDetailContent( -${buildWorkflowSection(task, workflowRun, workflowSteps, timeline)} +${buildWorkflowSection(task, workflowRun, workflowCheckpoints, timeline)} '''; } String buildWorkflowSection( DashboardTaskStatusEntry task, DashboardWorkflowRunSnapshot? workflowRun, - List workflowSteps, + List workflowCheckpoints, List timeline, ) { if (task.runId == null) { @@ -182,11 +182,11 @@ String buildWorkflowSection( - ${workflowSteps.isEmpty ? ''' + ${workflowCheckpoints.isEmpty ? ''' - No persisted workflow step checkpoints found. + No persisted workflow checkpoints found. 
-''' : workflowSteps.map((step) => ''' +''' : workflowCheckpoints.map((step) => ''' ${escapeHtml(step.name)} ${formatInt(step.position)} @@ -207,7 +207,7 @@ String buildWorkflowSection( Task ID Task - Step + Checkpoint State Attempt Updated diff --git a/packages/dashboard/lib/src/ui/workflows.dart b/packages/dashboard/lib/src/ui/workflows.dart index db3b97ea..4308b983 100644 --- a/packages/dashboard/lib/src/ui/workflows.dart +++ b/packages/dashboard/lib/src/ui/workflows.dart @@ -83,7 +83,7 @@ String buildWorkflowsContent({ ${escapeHtml(entry.runId)} ${escapeHtml(entry.workflowName)} - ${escapeHtml(entry.lastStep ?? '—')} + ${escapeHtml(entry.lastCheckpoint ?? '—')} ${formatInt(entry.queued)} ${formatInt(entry.running)} ${formatInt(entry.succeeded)} diff --git a/packages/dashboard/pubspec.yaml b/packages/dashboard/pubspec.yaml index dc79877b..03421099 100644 --- a/packages/dashboard/pubspec.yaml +++ b/packages/dashboard/pubspec.yaml @@ -12,7 +12,7 @@ dependencies: ormed: ^0.2.0 routed: ^0.3.2 routed_hotwire: ^0.1.2 - stem: ^0.1.0 + stem: ^0.2.0 stem_cli: ^0.1.0 stem_postgres: ^0.1.0 stem_redis: ^0.1.0 diff --git a/packages/dashboard/test/dashboard_browser_test.dart b/packages/dashboard/test/dashboard_browser_test.dart index a073ba29..d64c4f73 100644 --- a/packages/dashboard/test/dashboard_browser_test.dart +++ b/packages/dashboard/test/dashboard_browser_test.dart @@ -105,7 +105,7 @@ class _FakeDashboardService implements DashboardDataSource { null; @override - Future> fetchWorkflowSteps( + Future> fetchWorkflowCheckpoints( String runId, ) async => const []; diff --git a/packages/dashboard/test/dashboard_state_poll_test.dart b/packages/dashboard/test/dashboard_state_poll_test.dart index 6482ced1..85a3b2b0 100644 --- a/packages/dashboard/test/dashboard_state_poll_test.dart +++ b/packages/dashboard/test/dashboard_state_poll_test.dart @@ -47,7 +47,7 @@ class _FailingPollService implements DashboardDataSource { null; @override - Future> fetchWorkflowSteps( + Future> 
fetchWorkflowCheckpoints( String runId, ) async => const []; @@ -114,7 +114,7 @@ class _BacklogOnlyService implements DashboardDataSource { null; @override - Future> fetchWorkflowSteps( + Future> fetchWorkflowCheckpoints( String runId, ) async => const []; diff --git a/packages/dashboard/test/dashboard_state_property_test.dart b/packages/dashboard/test/dashboard_state_property_test.dart index 405079c0..9e5e2f18 100644 --- a/packages/dashboard/test/dashboard_state_property_test.dart +++ b/packages/dashboard/test/dashboard_state_property_test.dart @@ -100,7 +100,7 @@ class _SequenceDashboardService implements DashboardDataSource { null; @override - Future> fetchWorkflowSteps( + Future> fetchWorkflowCheckpoints( String runId, ) async => const []; diff --git a/packages/dashboard/test/server_test.dart b/packages/dashboard/test/server_test.dart index a0f13451..cbe09087 100644 --- a/packages/dashboard/test/server_test.dart +++ b/packages/dashboard/test/server_test.dart @@ -93,7 +93,7 @@ class _RecordingService implements DashboardDataSource { null; @override - Future> fetchWorkflowSteps( + Future> fetchWorkflowCheckpoints( String runId, ) async => const []; diff --git a/packages/stem/CHANGELOG.md b/packages/stem/CHANGELOG.md index bc623da8..cf282845 100644 --- a/packages/stem/CHANGELOG.md +++ b/packages/stem/CHANGELOG.md @@ -1,57 +1,32 @@ # Changelog -## 0.1.1 +## 0.2.0 -- Added `StemModule`, typed `WorkflowRef`/`WorkflowStartCall` helpers, and bundle-first `StemWorkflowApp`/`StemClient` composition for generated workflow and task definitions. -- Added `PayloadCodec`, typed workflow resume helpers, codec-backed workflow checkpoint/result persistence, typed task result waiting, and typed workflow event emit helpers for DTO-shaped payloads. -- Added workflow manifests, runtime metadata views, and run/step drilldown APIs - for inspecting workflow definitions and persisted execution state. 
-- Clarified the workflow authoring model by distinguishing flow steps from - script checkpoints in manifests, docs, dashboard wording, and generated - workflow output. -- Improved workflow store contracts and runtime compatibility for caller-supplied run ids and persisted runtime metadata attached to workflow params. -- Restored the deprecated `SimpleTaskRegistry` alias for source compatibility - and fixed workflow continuation routing to honor persisted queue metadata - when resuming suspended runs after runtime configuration changes. -- Added `tasks:`-first wiring across `Stem`, `Worker`, `Canvas`, and - `StemWorkflowApp`, removing the need for manual default-registry setup in - normal application code and examples. -- Renamed the default in-memory task registry surface to - `InMemoryTaskRegistry` and refreshed docs/examples to teach `tasks: [...]` - rather than explicit registry construction. -- Improved workflow logging with richer run/step context on worker lifecycle - lines plus enqueue/suspend/fail/complete runtime events. -- Exported logging types from `package:stem/stem.dart`, including `Level`, - `Logger`, and `Context`. -- Added an end-to-end ecommerce workflow example using mixed annotated/manual - workflows, `StemWorkflowApp`, and Ormed-backed SQLite models/migrations. -- Expanded span attribution across enqueue/consume/execute with task identity, - queue, worker, host, lineage, namespace, and workflow step metadata - (`run_id`, `step`, `step_id`, `step_index`, `step_attempt`, `iteration`). -- Improved worker retry republish behavior to preserve optional payload signing - when retrying deliveries. -- Added workflow metadata quality-of-life getters and watcher/run-state helpers - to make workflow introspection easier from task metadata. -- Strengthened tracing and workflow-related test coverage for metadata - propagation and contract behavior.
-- Expanded the microservice example with richer workload generation, queue - diversity, updated scheduler/demo flows, and full local observability wiring - for Jaeger/Prometheus/Grafana through nginx. -- Improved bootstrap DX with explicit fail-fast errors across broker/backend/ - workflow/schedule/lock/revoke resolution paths in `StemStack.fromUrl`, - including actionable hints when adapters support a URL but do not implement - the requested store kind. -- Refreshed docs to lead with `StemClient` and document adapter-focused Task - workflows. -- Aligned in-memory broker and result backend semantics with shared adapter - contracts, including broadcast fan-out behavior for in-memory broker tests. -- Added Taskfile support for package-scoped test orchestration. -- Added Taskfile-based workflows for complex examples (microservice, encrypted - payloads, signing key rotation, security profiles, and Postgres TLS), - including secret/certificate bootstrap and binary build/run helpers. -- Added shared logger injection via `setStemLogger` and reusable structured - context helpers for consistent logging metadata across core components. +- Added `StemClient.fromStack(...)` and `StemStack.createClient(...)` so + adapter-resolved broker/backend stacks have the same direct bootstrap path + as the higher-level app helpers. +- Narrowed the public task and workflow invocation APIs around direct + `enqueue(...)` / `enqueueAndWait(...)` and `start(...)` / `startAndWait(...)` + calls, with explicit transport objects left as the advanced low-level path. +- Removed duplicate transport helpers and wrapper builder entrypoints such as + `.call(...)`, `prepareStart(...)`, `prepareEnqueue(...)`, builder dispatch + methods, and `copyWith(...)` on transport objects. +- Added shared execution-context interfaces for workflows and tasks so manual + handlers and checkpoints can use one typed context surface instead of several + partially overlapping ones. 
+- Added expression-style suspension and event APIs for workflows, plus direct + typed event emit/wait helpers on workflow event refs. +- Added module-first bootstrap improvements including module merge/combine, + inferred worker subscriptions, queue/subscription inspection helpers, and + shared app/client workflow bootstrap helpers. +- Expanded manual serialization support with `json(...)`, `versionedJson(...)`, + `versionedMap(...)`, registry-backed versioned factories, and codec-backed + low-level publish/start/emit helpers for tasks, workflows, and queue events. +- Added broad typed decode helpers across runtime, inspection, signal, queue, + status, and context surfaces so DTO reads no longer require raw map casts in + the common path. +- Refreshed examples and docs to use the narrowed happy-path APIs and to treat + transport objects as explicit advanced APIs rather than peer entrypoints. ## 0.1.0 diff --git a/packages/stem/README.md b/packages/stem/README.md index a8d0fbef..3c215742 100644 --- a/packages/stem/README.md +++ b/packages/stem/README.md @@ -1,70 +1,78 @@ +

+ Stem Logo +

+ [![pub package](https://img.shields.io/pub/v/stem.svg)](https://pub.dev/packages/stem) [![Dart](https://img.shields.io/badge/dart-%3E%3D3.9.2-blue.svg)](https://dart.dev/) [![License](https://img.shields.io/badge/license-MIT-purple.svg)](LICENSE) -[![Build Status](https://github.com/kingwill101/stem/workflows/ci/badge.svg)](https://github.com/kingwill101/stem/actions) -[![Coverage](https://img.shields.io/endpoint?url=https://raw.githubusercontent.com/kingwill101/stem/main/packages/stem/coverage/coverage.json)](https://github.com/kingwill101/stem/actions/workflows/stem.yaml) [![Buy Me A Coffee](https://img.shields.io/badge/Buy%20Me%20A%20Coffee-support-yellow.svg)](https://www.buymeacoffee.com/kingwill101) -

- Stem Logo -

- # Stem - Stem is a Dart-native background job platform. It gives you Celery-style -task execution with a Dart-first API, Redis Streams integration, retries, -scheduling, observability, and security tooling-all without leaving the Dart -ecosystem. +Stem is a Dart-first background job and workflow platform: enqueue work, run workers, and orchestrate durable workflows. -## Install +For full docs, API references, and in-depth guides, visit +https://kingwill101.github.io/stem. -```bash -dart pub add stem # core runtime APIs -dart pub add stem_redis # Redis broker + result backend -dart pub add stem_postgres # (optional) Postgres broker + backend -dart pub add stem_sqlite # (optional) SQLite broker + backend -dart pub add -d stem_builder # (optional) workflow/task code generator -dart pub global activate stem_cli -``` -Add the pub-cache bin directory to your `PATH` so the `stem_cli` tool is available: + +## Features + +- **Task pipeline** - enqueue with delays, priorities, idempotency helpers, and retries. +- **Workers** - isolate pools with soft/hard time limits, autoscaling, and remote control (`stem worker ping|revoke|shutdown`). +- **Scheduling** - Beat-style scheduler with interval/cron/solar/clocked entries and drift tracking. +- **Workflows** - Durable `Flow` runtime with pluggable stores (in-memory, + Redis, Postgres, SQLite) and CLI introspection via `stem wf`. +- **Observability** - Dartastic OpenTelemetry metrics/traces, heartbeats, CLI inspection (`stem observe`, `stem dlq`). +- **Security** - Payload signing (HMAC or Ed25519), TLS automation scripts, revocation persistence. +- **Adapters** - In-memory drivers included here; Redis Streams and Postgres adapters ship via the `stem_redis` and `stem_postgres` packages. +- **Specs & tooling** - OpenSpec change workflow, quality gates (see `example/quality_gates`), chaos/regression suites. 
+ +## Install ```bash -export PATH="$HOME/.pub-cache/bin:$PATH" -stem --help +dart pub add stem +# Optional adapters +dart pub add stem_redis # Redis broker/backend +dart pub add stem_postgres # Postgres broker/backend +dart pub add stem_sqlite # SQLite broker/backend +dart pub add -d stem_builder # for annotations/codegen (optional) +dart pub add -d stem_cli # for CLI tooling ``` -## Quick Start -### StemClient entrypoint +## Examples -Use a single entrypoint to share broker/backend/config between workers and -workflow apps. +`StemApp` and `StemWorkflowApp` shortcut helpers lazily start their managed +worker by default. Pass `allowWorkerAutoStart: false` when you want producer +or orchestration shortcuts without starting that worker in the background, +then call `start()` explicitly when you're ready. `StemWorkflowApp` also +exposes `startRuntime()` and `startWorker()` when you want those lifecycles +split. -```dart -import 'dart:async'; -import 'package:stem/stem.dart'; +### Minimal in-memory task + worker -class HelloTask implements TaskHandler { - @override - String get name => 'demo.hello'; +```dart +import "dart:async"; +import "package:stem/stem.dart"; +class HelloTask extends TaskHandler { @override - TaskOptions get options => const TaskOptions(queue: 'default'); + String get name => "demo.hello"; @override Future call(TaskContext context, Map args) async { - print('Hello from StemClient'); + final name = args.valueOr("name", "world"); + print("Hello $name"); } } Future main() async { final client = await StemClient.inMemory(tasks: [HelloTask()]); - final worker = await client.createWorker(); unawaited(worker.start()); - await client.stem.enqueue('demo.hello'); + await client.enqueueValue("demo.hello", const {"name": "Stem"}); await Future.delayed(const Duration(seconds: 1)); await worker.shutdown(); @@ -72,837 +80,351 @@ Future main() async { } ``` -For persistent adapters, keep `StemClient` as the entrypoint and resolve -broker/backend wiring from a URL: 
- -```dart -import 'package:stem/stem.dart'; -import 'package:stem_redis/stem_redis.dart'; - -final client = await StemClient.create( - broker: redisBrokerFactory('redis://localhost:6379'), - backend: redisResultBackendFactory('redis://localhost:6379/1'), - tasks: [HelloTask()], -); -``` - -or use the lower-boilerplate URL helper: +### Reusable stack from URL (Redis) ```dart -import 'package:stem/stem.dart'; -import 'package:stem_redis/stem_redis.dart'; - -final client = await StemClient.fromUrl( - 'redis://localhost:6379', - adapters: const [StemRedisAdapter()], - overrides: const StemStoreOverrides( - backend: 'redis://localhost:6379/1', - ), - tasks: [HelloTask()], -); -``` - -### Direct enqueue (map-based) - -```dart -import 'dart:async'; -import 'package:stem/stem.dart'; -import 'package:stem_redis/stem_redis.dart'; - -class HelloTask implements TaskHandler { - @override - String get name => 'demo.hello'; - - @override - TaskOptions get options => const TaskOptions( - queue: 'default', - maxRetries: 3, - rateLimit: '10/s', - visibilityTimeout: Duration(seconds: 60), - ); - - @override - Future call(TaskContext context, Map args) async { - final who = args['name'] as String? ?? 
'world'; - print('Hello $who (attempt ${context.attempt})'); - } -} +import "package:stem/stem.dart"; +import "package:stem_redis/stem_redis.dart"; Future main() async { - final broker = await RedisStreamsBroker.connect('redis://localhost:6379'); - final backend = await RedisResultBackend.connect('redis://localhost:6379/1'); - - final stem = Stem(broker: broker, backend: backend, tasks: [HelloTask()]); - final worker = Worker( - broker: broker, - backend: backend, + final client = await StemClient.fromUrl( + "redis://localhost:6379", + adapters: const [StemRedisAdapter()], + overrides: const StemStoreOverrides( + backend: "redis://localhost:6379/1", + ), tasks: [HelloTask()], ); + final worker = await client.createWorker(); unawaited(worker.start()); - await stem.enqueue('demo.hello', args: {'name': 'Stem'}); + + await client.enqueueValue("demo.hello", const {"name": "Redis"}); await Future.delayed(const Duration(seconds: 1)); + await worker.shutdown(); - await broker.close(); - await backend.close(); + await client.close(); } ``` -### Typed helpers with `TaskDefinition` - -Use the new typed wrapper when you want compile-time checking and shared metadata: +### Typed task definition and waiting for result ```dart -class HelloTask implements TaskHandler { - static final definition = TaskDefinition( - name: 'demo.hello', - encodeArgs: (args) => {'name': args.name}, - metadata: TaskMetadata(description: 'Simple hello world example'), - ); - - @override - String get name => 'demo.hello'; - - @override - TaskOptions get options => const TaskOptions(maxRetries: 3); - - @override - TaskMetadata get metadata => definition.metadata; - - @override - Future call(TaskContext context, Map args) async { - final who = args['name'] as String? ?? 
'world'; - print('Hello $who (attempt ${context.attempt})'); - } -} - class HelloArgs { const HelloArgs({required this.name}); final String name; -} -Future main() async { - final broker = await RedisStreamsBroker.connect('redis://localhost:6379'); - final backend = await RedisResultBackend.connect('redis://localhost:6379/1'); - - final stem = Stem(broker: broker, backend: backend, tasks: [HelloTask()]); - final worker = Worker( - broker: broker, - backend: backend, - tasks: [HelloTask()], - ); - - unawaited(worker.start()); - await stem.enqueueCall( - HelloTask.definition(const HelloArgs(name: 'Stem')), - ); - await Future.delayed(const Duration(seconds: 1)); - await worker.shutdown(); - await broker.close(); - await backend.close(); + Map toJson() => {"name": name}; + factory HelloArgs.fromJson(Map json) => + HelloArgs(name: json["name"] as String); } -``` - -You can also build requests fluently with the `TaskEnqueueBuilder`: - -```dart -final taskId = await TaskEnqueueBuilder( - definition: HelloTask.definition, - args: const HelloArgs(name: 'Tenant A'), -) - ..header('x-tenant', 'tenant-a') - ..priority(5) - ..delay(const Duration(seconds: 30)) - .enqueueWith(stem); -``` - -### Enqueue from inside a task - -Handlers can enqueue follow-up work using `TaskContext.enqueue` and request -retries directly: -```dart -class ParentTask implements TaskHandler { - @override - String get name => 'demo.parent'; +class HelloTask2 extends TaskHandler { + static final definition = TaskDefinition.json( + name: "demo.hello2", + metadata: const TaskMetadata(description: "typed hello task"), + ); @override - TaskOptions get options => const TaskOptions(maxRetries: 3); + String get name => definition.name; @override - Future call(TaskContext context, Map args) async { - await context.enqueue( - 'demo.child', - args: {'id': 'child-1'}, - enqueueOptions: TaskEnqueueOptions( - countdown: const Duration(seconds: 30), - retry: true, - retryPolicy: TaskRetryPolicy(backoff: true), - ), - 
); - - if (context.attempt == 0) { - await context.retry(countdown: const Duration(seconds: 10)); - } + Future call(TaskContext context, Map args) async { + final payload = HelloArgs.fromJson(args.cast()); + return "Hello ${payload.name}"; } } -``` -### Bootstrap helpers - -Spin up a full runtime in one call using the bootstrap APIs: - -```dart -final app = await StemWorkflowApp.inMemory( - flows: [ - Flow( - name: 'demo.workflow', - build: (flow) { - flow.step('hello', (ctx) async => 'done'); - }, - ), - ], -); +Future main() async { + final client = await StemClient.inMemory(tasks: [HelloTask2()]); + final worker = await client.createWorker(); + unawaited(worker.start()); -final runId = await app.startWorkflow('demo.workflow'); -final result = await app.waitForCompletion(runId); -print(result?.value); // 'hello world' -print(result?.state.status); // WorkflowStatus.completed + final result = await HelloTask2.definition.enqueueAndWait( + client, + const HelloArgs(name: "Typed"), + ); + print(result?.value); -await app.shutdown(); + await worker.shutdown(); + await client.close(); +} ``` -### Workflow script facade - -Prefer the high-level `WorkflowScript` facade when you want to author a -workflow as a single async function. 
The facade wraps `FlowBuilder` so your -code can `await script.step`, `await step.sleep`, and `await step.awaitEvent` -while retaining the same durability semantics (checkpoints, resume payloads, -auto-versioning) as the lower-level API: +### Workflow quick-start (Flow) ```dart -final app = await StemWorkflowApp.inMemory( - scripts: [ - WorkflowScript( - name: 'orders.workflow', - run: (script) async { - final checkout = await script.step('checkout', (step) async { - return await chargeCustomer(step.params['userId'] as String); - }); - - await script.step('poll-shipment', (step) async { - final resume = step.takeResumeValue(); - if (resume != true) { - await step.sleep(const Duration(seconds: 30)); - return 'waiting'; - } - final status = await fetchShipment(checkout.id); - if (!status.isComplete) { - await step.sleep(const Duration(seconds: 30)); - return 'waiting'; - } - return status.value; - }, autoVersion: true); - - final receipt = await script.step('notify', (step) async { - await sendReceiptEmail(checkout); - return 'emailed'; - }); - - return receipt; - }, - ), - ], -); -``` - -Inside a script step you can access the same metadata as `FlowContext`: - -- `step.previousResult` contains the prior step’s persisted value. -- `step.iteration` tracks the current auto-version suffix when - `autoVersion: true` is set. -- `step.idempotencyKey('scope')` builds stable outbound identifiers. -- `step.takeResumeData()` and `step.takeResumeValue(codec: ...)` surface - payloads from sleeps or awaited events so you can branch on resume paths. - -### Current workflow model - -Stem supports three workflow authoring styles today: - -1. `Flow` for explicit orchestration -2. `WorkflowScript` for function-style durable workflows -3. 
`stem_builder` for annotated workflows with generated workflow refs +import "package:stem/stem.dart"; -The runtime shape is the same in every case: - -- bootstrap a `StemWorkflowApp` -- pass `flows:`, `scripts:`, and `tasks:` directly -- start runs with `startWorkflow(...)` or generated workflow refs -- wait with `waitForCompletion(...)` - -You do not need to build task registries manually for normal workflow usage. - -#### Manual `Flow` - -Use `Flow` when you want explicit step orchestration and fine control over -resume behavior: - -```dart -final approvalsFlow = Flow( - name: 'approvals.flow', +final onboardingFlow = Flow( + name: "demo.onboarding", build: (flow) { - flow.step('draft', (ctx) async { - final payload = ctx.params['draft'] as Map; - return payload['documentId']; - }); - - flow.step('manager-review', (ctx) async { - final resume = ctx.takeResumeValue>(); - if (resume == null) { - await ctx.awaitEvent('approvals.manager'); - return null; - } - return resume['approvedBy'] as String?; - }); - - flow.step('finalize', (ctx) async { - final approvedBy = ctx.previousResult as String?; - return 'approved-by:$approvedBy'; + flow.step("welcome", (ctx) async { + return "Welcome ${ctx.requiredParam("name")}"; }); + flow.step("done", (ctx) async => "Done"); }, ); -final app = await StemWorkflowApp.fromUrl( - 'memory://', - flows: [approvalsFlow], - tasks: const [], -); - -final runId = await app.startWorkflow( - 'approvals.flow', - params: { - 'draft': {'documentId': 'doc-42'}, - }, -); - -final result = await app.waitForCompletion(runId); -print(result?.value); -await app.close(); -``` - -#### Manual `WorkflowScript` - -Use `WorkflowScript` when you want your workflow to read like a normal async -function while still persisting durable checkpoints: - -```dart -final billingRetryScript = WorkflowScript( - name: 'billing.retry-script', - run: (script) async { - final chargeId = await script.step('charge', (ctx) async { - final resume = ctx.takeResumeValue>(); - 
if (resume == null) { - await ctx.awaitEvent('billing.charge.prepared'); - return 'pending'; - } - return resume['chargeId'] as String; - }); +Future main() async { + final appClient = await StemClient.inMemory(); + final app = await appClient.createWorkflowApp( + flows: [onboardingFlow], + allowWorkerAutoStart: false, + ); + await app.start(); - return script.step('confirm', (ctx) async { - ctx.idempotencyKey('confirm-$chargeId'); - return 'receipt-$chargeId'; - }); - }, -); + final ref = onboardingFlow.refJson(HelloArgs.fromJson); + final runId = await ref.start(app, params: const HelloArgs(name: "Stem")); + final result = await ref.waitFor(app, runId); -final app = await StemWorkflowApp.inMemory( - scripts: [billingRetryScript], - tasks: const [], -); + print(result?.value); + await app.shutdown(); + await appClient.close(); +} ``` -#### Annotated workflows with `stem_builder` - -Use `stem_builder` when you want the best DX: plain method signatures, -generated manifests, and typed workflow refs. - -The important part of the model is that `run(...)` calls other annotated -methods directly. Those method calls are what become durable script checkpoints in -the generated proxy. 
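
The lowering described above can be sketched by hand. This is an illustrative approximation only — not the literal generated code — using the `BuilderSignupWorkflow` step names from the annotated example; the concrete lowering lives in `packages/stem_builder/example/lib/definitions.stem.g.dart`:

```dart
import 'package:stem/stem.dart';

// Rough, hand-written approximation of the generated script proxy for
// BuilderSignupWorkflow. Each direct annotated-method call in run(...)
// becomes a durable script.step(...) checkpoint.
Future<String> runAsScript(WorkflowScriptContext script, String email) async {
  // `await createUser(email)` in run(...) becomes the 'create-user' checkpoint:
  final userId = await script.step('create-user', (step) async {
    return 'user-$email';
  });

  // `await finalizeSignup(userId: userId)` becomes the 'finalize' checkpoint:
  await script.step('finalize', (step) async {
    // side effects of finalizeSignup(...) run here, replay-safe
  });

  return userId;
}
```

Because each checkpoint's return value is persisted, a resumed run skips already-completed checkpoints instead of re-invoking the original method bodies.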
- -The conceptual split is: - -- `Flow`: declared steps are the execution plan -- `WorkflowScript`: `run(...)` is the execution plan, and declared checkpoints - are manifest/introspection metadata +### Annotated workflow + task with `stem_builder` ```dart -import 'package:stem/stem.dart'; +import "package:stem/stem.dart"; +import "package:stem_builder/stem_builder.dart"; -part 'definitions.stem.g.dart'; +part "definitions.stem.g.dart"; -@WorkflowDefn(name: 'builder.example.user_signup', kind: WorkflowKind.script) -class BuilderUserSignupWorkflow { - Future> run(String email) async { - final user = await createUser(email); - await sendWelcomeEmail(email); - await sendOneWeekCheckInEmail(email); - return {'userId': user['id'], 'status': 'done'}; +@WorkflowDefn(name: "builder.signup", kind: WorkflowKind.script) +class BuilderSignupWorkflow { + Future run(String email) async { + final userId = await createUser(email); + await finalizeSignup(userId: userId); + return userId; } - @WorkflowStep(name: 'create-user') - Future> createUser(String email) async { - return {'id': 'user:$email'}; + @WorkflowStep(name: "create-user") + Future createUser(String email) async { + return "user-$email"; } - @WorkflowStep(name: 'send-welcome-email') - Future sendWelcomeEmail(String email) async {} - - @WorkflowStep(name: 'send-one-week-check-in-email') - Future sendOneWeekCheckInEmail(String email) async {} + @WorkflowStep(name: "finalize") + Future finalizeSignup({required String userId}) async {} } -@TaskDefn(name: 'builder.example.task') -Future builderExampleTask( - TaskInvocationContext context, - Map args, -) async {} +@TaskDefn(name: "builder.send_welcome") +Future sendWelcomeEmail( + String email, { + TaskExecutionContext? 
context, +}) async { + // optional: use context for logger/meta/retry helpers +} ``` -There are two supported script entry styles: - -- plain direct-call style: - - `Future run(String email, ...)` - - best when your annotated step methods only take serializable parameters -- context-aware style: - - `@WorkflowRun()` - - `Future run(WorkflowScriptContext script, String email, ...)` - - use this when you need to enter a step explicitly with `script.step(...)` - so the step body can receive `WorkflowScriptStepContext` - -Context injection works at every runtime layer: - -- flow steps can take `FlowContext` -- script runs can take `WorkflowScriptContext` -- script steps can take `WorkflowScriptStepContext` -- tasks can take `TaskInvocationContext` - -Serializable parameter rules for generated workflows and tasks are strict: - -- supported: - - `String`, `bool`, `int`, `double`, `num`, `Object?`, `null` - - `List` where `T` is serializable - - `Map` where `T` is serializable - - DTO classes with: - - `Map toJson()` - - `factory Type.fromJson(Map json)` or an equivalent - named `fromJson` constructor -- not supported directly: - - optional/named parameters on generated workflow/task entrypoints - -Typed task results can use the same DTO convention. - -Workflow inputs, checkpoint values, and final workflow results can use the same -DTO convention. The generated `PayloadCodec` persists the JSON form while -workflow code continues to work with typed objects. 
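
The DTO convention above can be as small as a class with `toJson()` and a `fromJson` factory. A minimal sketch (the class name and fields here are illustrative, not part of the Stem API):

```dart
// Codec-backed DTO following the toJson/fromJson convention. The generated
// PayloadCodec persists the JSON map form; workflow and task code keeps
// working with the typed object.
class OrderReceipt {
  const OrderReceipt({required this.orderId, required this.total});

  final String orderId;
  final double total;

  Map<String, Object?> toJson() => {'orderId': orderId, 'total': total};

  factory OrderReceipt.fromJson(Map<String, Object?> json) => OrderReceipt(
        orderId: json['orderId'] as String,
        total: (json['total'] as num).toDouble(),
      );
}
```

Any DTO shaped this way can appear as a workflow input, a checkpoint value, a final workflow result, or a typed task argument/result.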
- -See the runnable example: - -- [example/annotated_workflows](example/annotated_workflows) - - `FlowContext` metadata - - plain proxy-driven script step calls - - `WorkflowScriptContext` + `WorkflowScriptStepContext` - - codec-backed workflow checkpoint values and workflow results - - typed `@TaskDefn` decoding scalar, `Map`, and `List` parameters - -Generate code: - ```bash dart run build_runner build -``` -Wire the generated bundle directly into `StemWorkflowApp`: +# After generation, use module + generated defs +``` ```dart -final app = await StemWorkflowApp.fromUrl( - 'memory://', - module: stemModule, +// example usage after codegen +final client = await StemClient.inMemory(module: stemModule); +final app = await client.createWorkflowApp(allowWorkerAutoStart: false); +await app.start(); + +final runId = await StemWorkflowDefinitions.builderSignup.startAndWait( + app, + "alice@example.com", ); - -final result = await StemWorkflowDefinitions.userSignup - .call((email: 'user@example.com')) - .startAndWaitWithApp(app); -print(result?.value); -await app.close(); +final result = await StemWorkflowDefinitions.builderSignup.waitFor(app, runId); +print(result?.value); // {user: alice@example.com} ``` -Generated output gives you: - -- `stemModule` -- `StemWorkflowDefinitions` -- `StemTaskDefinitions` -- typed enqueue helpers on `TaskEnqueuer` -- typed result wait helpers on `Stem` - -If your service already owns a `StemApp`, reuse it: +### Workflow with multiple worker queues ```dart -final client = await StemClient.fromUrl( - 'redis://localhost:6379', - adapters: const [StemRedisAdapter()], -); +import "package:stem/stem.dart"; -final workflowApp = await client.createWorkflowApp( - module: stemModule, +final onboardingFlow = Flow>( + name: "workflow.multi_workers", + build: (flow) { + flow.step("dispatch", (ctx) async { + final notifyTaskId = await ctx.enqueue( + "notify.send", + args: {"email": "alex@example.com"}, + enqueueOptions: const TaskEnqueueOptions(queue: 
"notifications"), + ); + final analyticsTaskId = await ctx.enqueue( + "analytics.track", + args: {"userId": "alex", "event": "account.created"}, + enqueueOptions: const TaskEnqueueOptions(queue: "analytics"), + ); + return {"notifyTaskId": notifyTaskId, "trackTaskId": analyticsTaskId}; + }); + }, ); -``` - -#### Mixing workflows and normal tasks - -A workflow can orchestrate durable steps and still enqueue ordinary Stem tasks -for side effects: - -```dart -flow.step('emit-side-effects', (ctx) async { - final order = ctx.previousResult as Map; - - await ctx.enqueuer!.enqueue( - 'ecommerce.audit.log', - args: { - 'event': 'order.checked_out', - 'entityId': order['id'], - 'detail': 'cart=${order['cartId']}', - }, - options: const TaskOptions(queue: 'default'), - ); - return order; -}); -``` +class NotifyTask extends TaskHandler { + @override + String get name => "notify.send"; -That split is the intended model: + @override + TaskOptions get options => const TaskOptions(queue: "notifications"); -- workflows coordinate durable state transitions -- regular tasks handle side effects and background execution -- both are wired into the same app, and generated modules bundle the two - surfaces together + @override + Future call(TaskContext context, Map args) async => + "notified:${args['email']}"; +} -### Typed workflow completion +class AnalyticsTask extends TaskHandler { + @override + String get name => "analytics.track"; -All workflow definitions (flows and scripts) accept an optional type argument -representing the value they produce. `StemWorkflowApp.waitForCompletion` -exposes the decoded value along with the raw `RunState`, letting you work with -domain models without manual casts: + @override + TaskOptions get options => const TaskOptions(queue: "analytics"); -```dart -final runId = await app.startWorkflow('orders.workflow'); -final result = await app.waitForCompletion( - runId, - decode: (payload) => OrderReceipt.fromJson(payload! 
as Map), -); -if (result?.isCompleted == true) { - print(result!.value?.total); -} else if (result?.timedOut == true) { - inspectSuspension(result?.state); + @override + Future call(TaskContext context, Map args) async => + "tracked:${args['event']}"; } -``` -In the example above, these calls inside `run(...)`: - -```dart -final user = await createUser(email); -await sendWelcomeEmail(email); -await sendOneWeekCheckInEmail(email); -``` - -are transformed by generated code into durable `script.step(...)` calls. See -the generated proxy in -`packages/stem_builder/example/lib/definitions.stem.g.dart` for the concrete -lowering. +Future main() async { + final client = await StemClient.inMemory(); + final app = await client.createWorkflowApp( + flows: [onboardingFlow], + workerConfig: const StemWorkerConfig(queue: "workflow"), + ); + await app.start(); -### Typed task completion + final notifications = await client.createWorker( + workerConfig: StemWorkerConfig( + queue: "notifications-worker", + consumerName: "notifications-worker", + subscription: RoutingSubscription.singleQueue("notifications"), + ), + tasks: [NotifyTask()], + ); + final analytics = await client.createWorker( + workerConfig: StemWorkerConfig( + queue: "analytics-worker", + consumerName: "analytics-worker", + subscription: RoutingSubscription.singleQueue("analytics"), + ), + tasks: [AnalyticsTask()], + ); -Producers can now wait for individual task results using `Stem.waitForTask` -with optional decoders. The helper returns a `TaskResult` containing the -underlying `TaskStatus`, decoded payload, and a timeout flag: + await notifications.start(); + await analytics.start(); -```dart -final taskId = await stem.enqueueCall( - ChargeCustomer.definition.call(ChargeArgs(orderId: '123')), -); + final result = await onboardingFlow.startAndWait(app); + final taskIds = result?.value ?? 
const {}; + print(await app.waitForTask(taskIds['notifyTaskId']!)); + print(await app.waitForTask(taskIds['trackTaskId']!)); -final charge = await stem.waitForTask( - taskId, - decode: (payload) => ChargeReceipt.fromJson( - payload! as Map, - ), -); -if (charge?.isSucceeded == true) { - print('Captured ${charge!.value!.total}'); -} else if (charge?.isFailed == true) { - log.severe('Charge failed: ${charge!.status.error}'); + await notifications.shutdown(); + await analytics.shutdown(); + await app.close(); + await client.close(); } ``` -### Typed canvas helpers - -`TaskSignature` (and the `task()` helper) lets you declare the result type -for canvas primitives. The existing `Canvas.group`, `Canvas.chain`, and -`Canvas.chord` APIs now accept generics so typed values flow through sequential -steps, groups, and chords without manual casts: +### 5) CLI at a glance -```dart -final dispatch = await canvas.group([ - task( - 'orders.fetch', - args: {'storeId': 42}, - decode: (payload) => OrderSummary.fromJson( - payload! as Map, - ), - ), - task('orders.refresh'), -]); - -dispatch.results.listen((result) { - if (result.isSucceeded) { - dashboard.update(result.value!); - } -}); - -final chainResult = await canvas.chain([ - task('metrics.seed', args: {'value': 1}), - task('metrics.bump', args: {'add': 3}), -]); -print(chainResult.value); // 4 - -final chordResult = await canvas.chord( - body: [ - task('image.resize', args: {'size': 256}), - task('image.resize', args: {'size': 512}), - ], - callback: task('image.aggregate'), -); -print('Body results: ${chordResult.values}'); +```bash +# Start a worker or run built-in introspection commands +stem --help +stem worker start --help +stem wf --help ``` -### Task payload encoders -By default Stem stores handler arguments/results exactly as provided (JSON-friendly -structures). 
Configure default `TaskPayloadEncoder`s when bootstrapping -`StemClient`, `StemApp`, `StemWorkflowApp`, or `Canvas` to plug in custom -serialization (encryption, compression, base64 wrappers, etc.) for both task -arguments and persisted results: +### General worker management (multi-worker setup) ```dart -import 'dart:convert'; +import "package:stem/stem.dart"; -class Base64ResultEncoder extends TaskPayloadEncoder { - const Base64ResultEncoder(); +class EmailTask extends TaskHandler { + @override + String get name => "notify.send"; @override - Object? encode(Object? value) { - if (value is String) { - return base64Encode(utf8.encode(value)); - } - return value; - } + TaskOptions get options => const TaskOptions(queue: "notify"); @override - Object? decode(Object? stored) { - if (stored is String) { - return utf8.decode(base64Decode(stored)); - } - return stored; + Future call(TaskContext context, Map args) async { + print("notify queue: ${args['to']}"); } } -final client = await StemClient.inMemory( - tasks: [...], - resultEncoder: const Base64ResultEncoder(), - argsEncoder: const Base64ResultEncoder(), - additionalEncoders: const [MyOtherEncoder()], -); - -final canvas = Canvas( - broker: broker, - backend: backend, - tasks: [SecretTask()], - resultEncoder: const Base64ResultEncoder(), - argsEncoder: const Base64ResultEncoder(), -); -``` - -Every envelope published by Stem carries the argument encoder id in headers/meta -(`stem-args-encoder` / `__stemArgsEncoder`) and every status stored in a result -backend carries the result encoder id (`__stemResultEncoder`). Workers use the same -`TaskPayloadEncoderRegistry` to resolve IDs, ensuring payloads are decoded exactly -once regardless of how many custom encoders you register. 
- -Per-task overrides live on `TaskMetadata`, so both handlers and the corresponding -`TaskDefinition` share the same configuration: - -```dart -class SecretTask extends TaskHandler { - static const _encoder = Base64ResultEncoder(); +class ReportTask extends TaskHandler { + @override + String get name => "reports.aggregate"; @override - TaskMetadata get metadata => const TaskMetadata( - description: 'Encrypt args + results', - argsEncoder: _encoder, - resultEncoder: _encoder, - ); + TaskOptions get options => const TaskOptions(queue: "reports"); - // ... + @override + Future call(TaskContext context, Map args) async { + print("reports queue: ${args['reportId']}"); + } } -``` - -Encoders run exactly once per persistence/read cycle and fall back to the JSON -behavior when none is provided. - -### Unique task deduplication -Set `TaskOptions(unique: true)` to prevent duplicate enqueues when a matching -task is already in-flight. Stem uses a `UniqueTaskCoordinator` backed by a -`LockStore` (Redis or in-memory) to claim uniqueness before publishing: - -```dart -final lockStore = await RedisLockStore.connect('redis://localhost:6379'); -final unique = UniqueTaskCoordinator( - lockStore: lockStore, - defaultTtl: const Duration(minutes: 5), -); +Future main() async { + final client = await StemClient.inMemory(); -final stem = Stem( - broker: broker, - backend: backend, - tasks: [OrdersSyncTask()], - uniqueTaskCoordinator: unique, -); -``` + final notifyWorker = await client.createWorker( + workerConfig: StemWorkerConfig( + queue: "notify-worker", + consumerName: "notify-worker", + subscription: RoutingSubscription.singleQueue("notify"), + ), + tasks: [EmailTask()], + ); -The unique key is derived from: + final reportsWorker = await client.createWorker( + workerConfig: StemWorkerConfig( + queue: "reports-worker", + consumerName: "reports-worker", + subscription: RoutingSubscription.singleQueue("reports"), + ), + tasks: [ReportTask()], + ); -- task name -- queue name -- task 
arguments -- headers -- metadata (excluding keys prefixed with `stem.`) + await notifyWorker.start(); + await reportsWorker.start(); -Keys are canonicalized (sorted maps, stable JSON) so equivalent inputs produce -the same hash. Use `uniqueFor` to control the TTL; when unset, the coordinator -falls back to `visibilityTimeout` or its default TTL. + await client.enqueue( + "notify.send", + args: {"to": "ops@example.com"}, + ); + await client.enqueue( + "reports.aggregate", + args: {"reportId": "r-2026-q1"}, + ); -Override the unique key when needed: + await Future.delayed(const Duration(milliseconds: 400)); -```dart -final id = await stem.enqueue( - 'orders.sync', - args: {'id': 42}, - options: const TaskOptions(unique: true, uniqueFor: Duration(minutes: 10)), - meta: {UniqueTaskMetadata.override: 'order-42'}, -); + await notifyWorker.shutdown(); + await reportsWorker.shutdown(); + await client.close(); +} ``` -When a duplicate is skipped, Stem returns the existing task id, emits the -`stem.tasks.deduplicated` metric, and appends a duplicate entry to the result -backend metadata under `stem.unique.duplicates`. - -### Durable workflow semantics - -- Chords dispatch from workers. Once every branch completes, any worker may enqueue the callback, ensuring producer crashes do not block completion. -- Steps may run multiple times. The runtime replays a step from the top after - every suspension (sleep, awaited event, rewind) and after worker crashes, so - handlers must be idempotent. -- Event waits are durable watchers. When a step calls `awaitEvent`, the runtime - registers the run in the store so the next emitted payload is persisted - atomically and delivered exactly once on resume. Operators can inspect - suspended runs via `WorkflowStore.listWatchers` or `runsWaitingOn`. -- Checkpoints act as heartbeats. 
Every successful `saveStep` refreshes the run's - `updatedAt` timestamp so operators (and future reclaim logic) can distinguish - actively-owned runs from ones that need recovery. -- Run execution is lease-based. The runtime claims each run with a lease - (`runLeaseDuration`) and renews it while work continues. If another worker - owns the lease, the task is retried so a takeover can occur once the lease - expires. Keep `runLeaseDuration` at least as long as the broker visibility - timeout and ensure `leaseExtension` renewals happen before either expires. -- Sleeps persist wake timestamps. When a resumed step calls `sleep` again, the - runtime skips re-suspending once the stored `resumeAt` is reached so loop - handlers can simply call `sleep` without extra guards. -- Use `ctx.takeResumeData()` or `ctx.takeResumeValue(codec: ...)` to detect - whether a step is resuming. Call it at the start of the handler and branch - accordingly. -- When you suspend, provide a marker in the `data` payload so the resumed step - can distinguish the wake-up path. For example: - - ```dart - final resume = ctx.takeResumeValue(); - if (resume != true) { - ctx.sleep(const Duration(milliseconds: 200)); - return null; - } - ``` - -- Awaited events behave the same way: the emitted payload is delivered via - `takeResumeData()` / `takeResumeValue(codec: ...)` when the run resumes. -- When you have a DTO event, emit it through `runtime.emitValue(...)` / - `workflowApp.emitValue(...)` with a `PayloadCodec` instead of hand-building - the payload map. Event payloads still serialize onto the existing - `Map` wire format. -- Only return values you want persisted. If a handler returns `null`, the - runtime treats it as "no result yet" and will run the step again on resume. -- Derive outbound idempotency tokens with `ctx.idempotencyKey('charge')` so - retries reuse the same stable identifier (`workflow/run/scope`). -- Use `autoVersion: true` on steps that you plan to re-execute (e.g. 
after - rewinding). Each completion stores a checkpoint like `step#0`, `step#1`, ... and - the handler receives the current iteration via `ctx.iteration`. -- Set an optional `WorkflowCancellationPolicy` when starting runs to auto-cancel - workflows that exceed a wall-clock budget or stay suspended beyond an allowed - duration. When a policy trips, the run transitions to `cancelled` and the - reason is surfaced via `stem wf show`. +- Full example that combines a workflow dispatching to dedicated workers: + [multiple_workers.dart](example/workflows/multiple_workers.dart) -```dart -flow.step( - 'process-item', - autoVersion: true, - (ctx) async { - final iteration = ctx.iteration; - final item = items[iteration]; - return await process(item); - }, -); +## Want depth? -final runId = await runtime.startWorkflow( - 'demo.workflow', - params: const {'userId': '42'}, - cancellationPolicy: const WorkflowCancellationPolicy( - maxRunDuration: Duration(minutes: 15), - maxSuspendDuration: Duration(minutes: 5), - ), -); -``` +This README is intentionally example-focused. +For implementation details, runtime semantics, adapter tuning, and operational playbooks, +see the full docs at https://kingwill101.github.io/stem. -Adapter packages expose typed factories (e.g. `redisBrokerFactory`, -`postgresResultBackendFactory`, `sqliteWorkflowStoreFactory`) so you can replace -drivers by importing the adapter you need. - -## Features - -- **Task pipeline** - enqueue with delays, priorities, idempotency helpers, and retries. -- **Workers** - isolate pools with soft/hard time limits, autoscaling, and remote control (`stem worker ping|revoke|shutdown`). -- **Scheduling** - Beat-style scheduler with interval/cron/solar/clocked entries and drift tracking. -- **Workflows** - Durable `Flow` runtime with pluggable stores (in-memory, - Redis, Postgres, SQLite) and CLI introspection via `stem wf`. 
-- **Observability** - Dartastic OpenTelemetry metrics/traces, heartbeats, CLI inspection (`stem observe`, `stem dlq`). -- **Security** - Payload signing (HMAC or Ed25519), TLS automation scripts, revocation persistence. -- **Adapters** - In-memory drivers included here; Redis Streams and Postgres adapters ship via the `stem_redis` and `stem_postgres` packages. -- **Specs & tooling** - OpenSpec change workflow, quality gates (see `example/quality_gates`), chaos/regression suites. ## Documentation & Examples -- Full docs: [Full docs](.site/docs) (run `npm install && npm start` inside `.site/`). + - Guided onboarding: [Guided onboarding](.site/docs/getting-started/) (install → infra → ops → production). - Examples (each has its own README): - [workflows](example/workflows/) - end-to-end workflow samples (in-memory, sleep/event, SQLite, Redis). See `versioned_rewind.dart` for auto-versioned step rewinds. +- [multiple_workers.dart](example/workflows/multiple_workers.dart) - workflow dispatching tasks to `notifications` and `analytics` workers. - [cancellation_policy](example/workflows/cancellation_policy.dart) - demonstrates auto-cancelling long workflows using `WorkflowCancellationPolicy`. - [rate_limit_delay](example/rate_limit_delay) - delayed enqueue, priority clamping, Redis rate limiter. - [dlq_sandbox](example/dlq_sandbox) - dead-letter inspection and replay via CLI. @@ -920,60 +442,3 @@ drivers by importing the adapter you need. - [security examples](example/security/*) - payload signing + TLS profiles. - [postgres_tls](example/postgres_tls) - Redis broker + Postgres backend secured via the shared `STEM_TLS_*` settings. - [otel_metrics](example/otel_metrics) - OTLP collectors + Grafana dashboards. 
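
The adapter packages mentioned above plug into the same bootstrap path as the in-memory drivers. A hedged sketch, assuming a local Redis at `localhost:6379` and the `stem_redis` package on your pubspec:

```dart
import 'package:stem/stem.dart';
import 'package:stem_redis/stem_redis.dart';

Future<void> main() async {
  // Swap the in-memory drivers for the Redis Streams adapter; the rest of
  // the client surface (createWorker, createWorkflowApp, enqueue) is the same.
  final client = await StemClient.fromUrl(
    'redis://localhost:6379',
    adapters: const [StemRedisAdapter()],
  );

  // ... create workers / workflow apps exactly as with StemClient.inMemory()

  await client.close();
}
```

The same pattern applies to the Postgres and SQLite adapter packages: import the adapter, pass it in `adapters:`, and point the URL at the backing service.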
- -## Running Tests Locally - -Start the dockerised dependencies and export the integration variables before -invoking the test suite: - -```bash -source packages/stem_cli/_init_test_env -dart test -``` - -The helper script launches `packages/stem_cli/docker/testing/docker-compose.yml` -(Redis + Postgres) and populates `STEM_TEST_*` environment variables needed by -the integration suites. - -### Adapter Contract Tests - -Stem ships a reusable adapter contract suite in -`packages/stem_adapter_tests`. Adapter packages (Redis broker/postgres -backend, SQLite adapters, and any future integrations) add it as a -`dev_dependency` and invoke `runBrokerContractTests` / -`runResultBackendContractTests` from their integration tests. The harness -exercises core behaviours-enqueue/ack/nack, dead-letter replay, lease -extension, result persistence, group aggregation, and heartbeat storage-so -all adapters stay aligned with the broker and result backend contracts. See -`test/integration/brokers/postgres_broker_integration_test.dart` and -`test/integration/backends/postgres_backend_integration_test.dart` for -reference usage. - -### Testing helpers - -Use `FakeStem` from `package:stem/stem.dart` in unit tests when you want to -record enqueued jobs without standing up brokers: - -```dart -final fake = FakeStem(); -await fake.enqueue('tasks.email', args: {'id': 1}); -final recorded = fake.enqueues.single; -expect(recorded.name, 'tasks.email'); -``` - -- `FakeWorkflowClock` keeps workflow tests deterministic. 
Inject the same clock - into your runtime and store, then advance it directly instead of sleeping: - - ```dart - final clock = FakeWorkflowClock(DateTime.utc(2024, 1, 1)); - final store = InMemoryWorkflowStore(clock: clock); - final runtime = WorkflowRuntime( - stem: stem, - store: store, - eventBus: InMemoryEventBus(store), - clock: clock, - ); - - clock.advance(const Duration(seconds: 5)); - final dueRuns = await store.dueRuns(clock.now()); - ``` diff --git a/packages/stem/example/annotated_workflows/README.md b/packages/stem/example/annotated_workflows/README.md index 615f0fc4..c4463a6f 100644 --- a/packages/stem/example/annotated_workflows/README.md +++ b/packages/stem/example/annotated_workflows/README.md @@ -4,26 +4,34 @@ This example shows how to use `@WorkflowDefn`, `@WorkflowStep`, and `@TaskDefn` with the `stem_builder` bundle generator. It now demonstrates the generated script-proxy behavior explicitly: -- a flow step using `FlowContext` -- `run(WelcomeRequest request)` calls annotated step methods directly -- `prepareWelcome(...)` calls other annotated steps -- `deliverWelcome(...)` calls another annotated step from inside an annotated - step -- a second script workflow uses `@WorkflowRun()` plus `WorkflowScriptStepContext` - to expose `runId`, `workflow`, `stepName`, `stepIndex`, and idempotency keys +- a flow step using `WorkflowExecutionContext` +- a flow step starting and waiting on a child workflow through + `StemWorkflowDefinitions.*.startAndWait(context, params: value)` +- `run(WelcomeRequest request)` calls annotated checkpoint methods directly +- `prepareWelcome(...)` calls other annotated checkpoints +- `deliverWelcome(...)` calls another annotated checkpoint from inside an + checkpoint +- a second script workflow uses optional named context injection + (`WorkflowScriptContext? context` / `WorkflowExecutionContext? 
context`) to + expose `runId`, `workflow`, `stepName`, `stepIndex`, and idempotency keys + while still calling its annotated checkpoint directly from `run(...)` +- a script checkpoint starting and waiting on a child workflow through + `StemWorkflowDefinitions.*.startAndWait(context, params: value)` - a plain script workflow that returns a codec-backed DTO result and persists a codec-backed DTO checkpoint value -- a typed `@TaskDefn` using `TaskInvocationContext` plus codec-backed DTO - input/output types +- a typed `@TaskDefn` using optional named `TaskExecutionContext? context` + plus codec-backed DTO input/output types When you run the example, it prints: -- the flow result with `FlowContext` metadata +- the flow result with `WorkflowExecutionContext` metadata +- the flow child workflow result without a separate `waitFor(...)` call - the plain script result -- the persisted step order for the plain script workflow +- the persisted checkpoint order for the plain script workflow - the persisted JSON form of the plain script DTO checkpoint and DTO result - the context-aware script result with workflow metadata - the persisted JSON form of the context-aware DTO result -- the persisted step order for the context-aware workflow +- the persisted checkpoint order for the context-aware workflow +- the context child workflow result without a separate `waitFor(...)` call - the typed task result showing a decoded DTO result and task invocation metadata @@ -32,20 +40,29 @@ The generated file exposes: - `stemModule` - `StemWorkflowDefinitions` - typed workflow refs for `StemWorkflowApp` and `WorkflowRuntime` -- typed task definitions, enqueue helpers, and typed result wait helpers +- typed task definitions whose advanced explicit transport path uses + `TaskCall` + +When you pass `module: stemModule` into `StemWorkflowApp`, or create a +`StemClient` with `module: stemModule` and then call +`StemClient.createWorkflowApp()`, the worker automatically subscribes to the +workflow 
queue plus the default queues declared on the bundled task handlers. +This example no longer needs manual `'workflow'` / `'default'` subscription +wiring. ## Serializable parameter rules For `stem_builder`, generated workflow/task entrypoints support required -positional parameters that are either serializable values or codec-backed DTO -types: +positional business parameters that are either serializable values or +codec-backed DTO types. Runtime context can be added separately through an +optional named injected context parameter. - `String`, `bool`, `int`, `double`, `num`, `Object?`, `null` - `List` where `T` is serializable - `Map` where `T` is serializable - Dart classes with: - - `Map toJson()` - - `factory Type.fromJson(Map json)` or an equivalent named + - a string-keyed `toJson()` map (typically `Map`) + - `factory Type.fromJson(Map json)` or an equivalent named `fromJson` constructor Typed task results can use the same DTO convention. diff --git a/packages/stem/example/annotated_workflows/bin/main.dart b/packages/stem/example/annotated_workflows/bin/main.dart index 208b05e1..dbc75bf6 100644 --- a/packages/stem/example/annotated_workflows/bin/main.dart +++ b/packages/stem/example/annotated_workflows/bin/main.dart @@ -4,41 +4,36 @@ import 'package:stem/stem.dart'; import 'package:stem_annotated_workflows/definitions.dart'; Future main() async { - final client = await StemClient.inMemory(); - final app = await client.createWorkflowApp( - module: stemModule, - workerConfig: StemWorkerConfig( - queue: 'workflow', - subscription: RoutingSubscription(queues: ['workflow', 'default']), - ), - ); - await app.start(); + final client = await StemClient.inMemory(module: stemModule); + final app = await client.createWorkflowApp(); - final flowRunId = await StemWorkflowDefinitions.flow - .call(const {}) - .startWithApp(app); + final flowRunId = await StemWorkflowDefinitions.flow.start(app); final flowResult = await StemWorkflowDefinitions.flow.waitFor( app, flowRunId, 
     timeout: const Duration(seconds: 2),
   );
   print('Flow result: ${jsonEncode(flowResult?.value)}');
-
-  final scriptCall = StemWorkflowDefinitions.script.call(
-    (request: const WelcomeRequest(email: ' SomeEmail@Example.com ')),
+  print(
+    'Flow child workflow result: '
+    '${jsonEncode(flowResult?.value?['childResult'])}',
   );
-  final scriptResult = await scriptCall.startAndWaitWithApp(
+
+  final scriptResult = await StemWorkflowDefinitions.script.startAndWait(
     app,
+    params: const WelcomeRequest(email: ' SomeEmail@Example.com '),
     timeout: const Duration(seconds: 2),
   );
   print('Script result: ${jsonEncode(scriptResult?.value?.toJson())}');
-  final scriptDetail = await app.runtime.viewRunDetail(scriptResult!.runId);
-  final scriptCheckpoints = scriptDetail?.steps
-      .map((step) => step.baseStepName)
+  final scriptDetail = await app.viewRunDetail(scriptResult!.runId);
+  final scriptCheckpoints = scriptDetail?.checkpoints
+      .map((checkpoint) => checkpoint.baseCheckpointName)
       .join(' -> ');
-  final persistedPreparation = scriptDetail?.steps
-      .firstWhere((step) => step.baseStepName == 'prepare-welcome')
+  final persistedPreparation = scriptDetail?.checkpoints
+      .firstWhere(
+        (checkpoint) => checkpoint.baseCheckpointName == 'prepare-welcome',
+      )
       .value;
   print('Script checkpoints: $scriptCheckpoints');
   print(
@@ -47,36 +42,38 @@ Future<void> main() async {
   print('Persisted script result: ${jsonEncode(scriptDetail?.run.result)}');
   print('Script detail: ${jsonEncode(scriptDetail?.toJson())}');
 
-  final contextCall = StemWorkflowDefinitions.contextScript.call(
-    (request: const WelcomeRequest(email: ' ContextEmail@Example.com ')),
-  );
-  final contextResult = await contextCall.startAndWaitWithApp(
-    app,
-    timeout: const Duration(seconds: 2),
-  );
+  final contextResult = await StemWorkflowDefinitions.contextScript
+      .startAndWait(
+        app,
+        params: const WelcomeRequest(email: ' ContextEmail@Example.com '),
+        timeout: const Duration(seconds: 2),
+      );
   print('Context script result: ${jsonEncode(contextResult?.value?.toJson())}');
-  final contextDetail = await app.runtime.viewRunDetail(contextResult!.runId);
-  final contextCheckpoints = contextDetail?.steps
-      .map((step) => step.baseStepName)
+  final contextDetail = await app.viewRunDetail(contextResult!.runId);
+  final contextCheckpoints = contextDetail?.checkpoints
+      .map((checkpoint) => checkpoint.baseCheckpointName)
       .join(' -> ');
   print('Context script checkpoints: $contextCheckpoints');
   print('Persisted context result: ${jsonEncode(contextDetail?.run.result)}');
   print('Context script detail: ${jsonEncode(contextDetail?.toJson())}');
-
-  final typedTaskId = await app.app.stem.enqueueSendEmailTyped(
-    dispatch: const EmailDispatch(
-      email: 'typed@example.com',
-      subject: 'Welcome',
-      body: 'Codec-backed DTO payloads',
-      tags: ['welcome', 'transactional', 'annotated'],
-    ),
-    meta: const {'origin': 'annotated_workflows_example'},
-  );
-  final typedTaskResult = await app.app.stem.waitForSendEmailTyped(
-    typedTaskId,
-    timeout: const Duration(seconds: 2),
+  print(
+    'Context child workflow result: '
+    '${jsonEncode(contextResult.value!.childResult.toJson())}',
   );
+
+  final typedTaskResult = await StemTaskDefinitions.sendEmailTyped
+      .enqueueAndWait(
+        app,
+        const EmailDispatch(
+          email: 'typed@example.com',
+          subject: 'Welcome',
+          body: 'Codec-backed DTO payloads',
+          tags: ['welcome', 'transactional', 'annotated'],
+        ),
+        meta: const {'origin': 'annotated_workflows_example'},
+        timeout: const Duration(seconds: 2),
+      );
   print('Typed task result: ${jsonEncode(typedTaskResult?.value?.toJson())}');
 
   await app.close();
diff --git a/packages/stem/example/annotated_workflows/lib/definitions.dart b/packages/stem/example/annotated_workflows/lib/definitions.dart
index 16938b64..088d5651 100644
--- a/packages/stem/example/annotated_workflows/lib/definitions.dart
+++ b/packages/stem/example/annotated_workflows/lib/definitions.dart
@@ -7,9 +7,9 @@ class WelcomeRequest {
 
   final String email;
 
-  Map<String, Object?> toJson() => {'email': email};
+  Map<String, dynamic> toJson() => {'email': email};
 
-  factory WelcomeRequest.fromJson(Map<String, Object?> json) {
+  factory WelcomeRequest.fromJson(Map<String, dynamic> json) {
     return WelcomeRequest(email: json['email'] as String);
   }
 }
@@ -27,14 +27,14 @@ class EmailDispatch {
   final String body;
   final List<String> tags;
 
-  Map<String, Object?> toJson() => {
+  Map<String, dynamic> toJson() => {
     'email': email,
     'subject': subject,
     'body': body,
     'tags': tags,
   };
 
-  factory EmailDispatch.fromJson(Map<String, Object?> json) {
+  factory EmailDispatch.fromJson(Map<String, dynamic> json) {
     return EmailDispatch(
       email: json['email'] as String,
       subject: json['subject'] as String,
@@ -59,9 +59,9 @@ class EmailDeliveryReceipt {
   final String email;
   final String subject;
   final List<String> tags;
-  final Map<String, Object?> meta;
+  final Map<String, dynamic> meta;
 
-  Map<String, Object?> toJson() => {
+  Map<String, dynamic> toJson() => {
     'taskId': taskId,
     'attempt': attempt,
     'email': email,
@@ -70,14 +70,14 @@ class EmailDeliveryReceipt {
     'meta': meta,
   };
 
-  factory EmailDeliveryReceipt.fromJson(Map<String, Object?> json) {
+  factory EmailDeliveryReceipt.fromJson(Map<String, dynamic> json) {
     return EmailDeliveryReceipt(
       taskId: json['taskId'] as String,
       attempt: json['attempt'] as int,
       email: json['email'] as String,
       subject: json['subject'] as String,
       tags: (json['tags'] as List).cast<String>(),
-      meta: Map<String, Object?>.from(json['meta'] as Map),
+      meta: Map<String, dynamic>.from(json['meta'] as Map),
     );
   }
 }
@@ -91,12 +91,12 @@ class WelcomePreparation {
   final String normalizedEmail;
   final String subject;
 
-  Map<String, Object?> toJson() => {
+  Map<String, dynamic> toJson() => {
     'normalizedEmail': normalizedEmail,
     'subject': subject,
   };
 
-  factory WelcomePreparation.fromJson(Map<String, Object?> json) {
+  factory WelcomePreparation.fromJson(Map<String, dynamic> json) {
     return WelcomePreparation(
       normalizedEmail: json['normalizedEmail'] as String,
       subject: json['subject'] as String,
@@ -115,13 +115,13 @@ class WelcomeWorkflowResult {
   final String subject;
   final String followUp;
 
-  Map<String, Object?> toJson() => {
+  Map<String, dynamic> toJson() => {
     'normalizedEmail': normalizedEmail,
     'subject': subject,
     'followUp': followUp,
   };
 
-  factory WelcomeWorkflowResult.fromJson(Map<String, Object?> json) {
+  factory WelcomeWorkflowResult.fromJson(Map<String, dynamic> json) {
     return WelcomeWorkflowResult(
       normalizedEmail: json['normalizedEmail'] as String,
       subject: json['subject'] as String,
@@ -140,6 +140,8 @@ class ContextCaptureResult {
     required this.idempotencyKey,
     required this.normalizedEmail,
     required this.subject,
+    required this.childRunId,
+    required this.childResult,
   });
 
   final String workflow;
@@ -150,8 +152,10 @@ class ContextCaptureResult {
   final String idempotencyKey;
   final String normalizedEmail;
   final String subject;
+  final String childRunId;
+  final WelcomeWorkflowResult childResult;
 
-  Map<String, Object?> toJson() => {
+  Map<String, dynamic> toJson() => {
     'workflow': workflow,
     'runId': runId,
     'stepName': stepName,
@@ -160,9 +164,11 @@ class ContextCaptureResult {
     'idempotencyKey': idempotencyKey,
     'normalizedEmail': normalizedEmail,
     'subject': subject,
+    'childRunId': childRunId,
+    'childResult': childResult.toJson(),
   };
 
-  factory ContextCaptureResult.fromJson(Map<String, Object?> json) {
+  factory ContextCaptureResult.fromJson(Map<String, dynamic> json) {
     return ContextCaptureResult(
       workflow: json['workflow'] as String,
       runId: json['runId'] as String,
@@ -172,6 +178,10 @@ class ContextCaptureResult {
       idempotencyKey: json['idempotencyKey'] as String,
       normalizedEmail: json['normalizedEmail'] as String,
       subject: json['subject'] as String,
+      childRunId: json['childRunId'] as String,
+      childResult: WelcomeWorkflowResult.fromJson(
+        Map<String, dynamic>.from(json['childResult'] as Map),
+      ),
     );
   }
 }
@@ -179,12 +189,18 @@ class ContextCaptureResult {
 @WorkflowDefn(name: 'annotated.flow')
 class AnnotatedFlowWorkflow {
   @WorkflowStep()
-  Future<Map<String, Object?>?> start(FlowContext ctx) async {
-    final resume = ctx.takeResumeData();
-    if (resume == null) {
-      ctx.sleep(const Duration(milliseconds: 50));
+  Future<Map<String, Object?>?> start({
+    WorkflowExecutionContext? context,
+  }) async {
+    final ctx = context!;
+    if (!ctx.sleepUntilResumed(const Duration(milliseconds: 50))) {
      return null;
    }
+    final childResult = await StemWorkflowDefinitions.script.startAndWait(
+      ctx,
+      params: const WelcomeRequest(email: 'flow-child@example.com'),
+      timeout: const Duration(seconds: 2),
+    );
     return {
       'workflow': ctx.workflow,
       'runId': ctx.runId,
@@ -192,6 +208,8 @@ class AnnotatedFlowWorkflow {
       'stepIndex': ctx.stepIndex,
       'iteration': ctx.iteration,
       'idempotencyKey': ctx.idempotencyKey(),
+      'childRunId': childResult?.runId,
+      'childResult': childResult?.value?.toJson(),
     };
   }
 }
@@ -243,24 +261,26 @@ class AnnotatedScriptWorkflow {
 
 @WorkflowDefn(name: 'annotated.context_script', kind: WorkflowKind.script)
 class AnnotatedContextScriptWorkflow {
-  @WorkflowRun()
   Future<ContextCaptureResult> run(
-    WorkflowScriptContext script,
-    WelcomeRequest request,
-  ) async {
-    return script.step(
-      'enter-context-step',
-      (ctx) => captureContext(ctx, request),
-    );
+    WelcomeRequest request, {
+    WorkflowScriptContext? context,
+  }) async {
+    return captureContext(request);
   }
 
   @WorkflowStep(name: 'capture-context')
   Future<ContextCaptureResult> captureContext(
-    WorkflowScriptStepContext ctx,
-    WelcomeRequest request,
-  ) async {
+    WelcomeRequest request, {
+    WorkflowExecutionContext? context,
+  }) async {
+    final ctx = context!;
     final normalizedEmail = await normalizeEmail(request.email);
     final subject = await buildWelcomeSubject(normalizedEmail);
+    final childResult = await StemWorkflowDefinitions.script.startAndWait(
+      ctx,
+      params: WelcomeRequest(email: normalizedEmail),
+      timeout: const Duration(seconds: 2),
+    );
     return ContextCaptureResult(
       workflow: ctx.workflow,
       runId: ctx.runId,
@@ -270,6 +290,8 @@ class AnnotatedContextScriptWorkflow {
       idempotencyKey: ctx.idempotencyKey('welcome'),
       normalizedEmail: normalizedEmail,
       subject: subject,
+      childRunId: childResult!.runId,
+      childResult: childResult.value!,
     );
   }
 
@@ -286,17 +308,20 @@ class AnnotatedContextScriptWorkflow {
 
 @TaskDefn(name: 'send_email', options: TaskOptions(maxRetries: 1))
 Future<void> sendEmail(
-  TaskInvocationContext ctx,
-  Map<String, Object?> args,
-) async {
+  Map<String, Object?> args, {
+  TaskExecutionContext? context,
+}) async {
+  final ctx = context!;
+  ctx.heartbeat();
   // No-op task for example purposes.
 }
 
 @TaskDefn(name: 'send_email_typed', options: TaskOptions(maxRetries: 1))
 Future<EmailDeliveryReceipt> sendEmailTyped(
-  TaskInvocationContext ctx,
-  EmailDispatch dispatch,
-) async {
+  EmailDispatch dispatch, {
+  TaskExecutionContext? context,
+}) async {
+  final ctx = context!;
   ctx.heartbeat();
   await ctx.progress(
     100,
diff --git a/packages/stem/example/annotated_workflows/lib/definitions.stem.g.dart b/packages/stem/example/annotated_workflows/lib/definitions.stem.g.dart
index 42c0f107..94082532 100644
--- a/packages/stem/example/annotated_workflows/lib/definitions.stem.g.dart
+++ b/packages/stem/example/annotated_workflows/lib/definitions.stem.g.dart
@@ -3,65 +3,36 @@
 part of 'definitions.dart';
 
-Map<String, Object?> _stemPayloadMap(Object? value, String typeName) {
-  if (value is Map<String, Object?>) {
-    return Map<String, Object?>.from(value);
-  }
-  if (value is Map) {
-    final result = <String, Object?>{};
-    value.forEach((key, entry) {
-      if (key is! String) {
-        throw StateError('$typeName payload must use string keys.');
-      }
-      result[key] = entry;
-    });
-    return result;
-  }
-  throw StateError(
-    '$typeName payload must decode to Map, got ${value.runtimeType}.',
-  );
-}
-
 abstract final class StemPayloadCodecs {
   static final PayloadCodec<WelcomeWorkflowResult> welcomeWorkflowResult =
-      PayloadCodec(
-        encode: (value) => value.toJson(),
-        decode: (payload) => WelcomeWorkflowResult.fromJson(
-          _stemPayloadMap(payload, "WelcomeWorkflowResult"),
-        ),
+      PayloadCodec.json(
+        decode: WelcomeWorkflowResult.fromJson,
+        typeName: "WelcomeWorkflowResult",
       );
   static final PayloadCodec<WelcomeRequest> welcomeRequest =
-      PayloadCodec(
-        encode: (value) => value.toJson(),
-        decode: (payload) =>
-            WelcomeRequest.fromJson(_stemPayloadMap(payload, "WelcomeRequest")),
+      PayloadCodec.json(
+        decode: WelcomeRequest.fromJson,
+        typeName: "WelcomeRequest",
       );
   static final PayloadCodec<WelcomePreparation> welcomePreparation =
-      PayloadCodec(
-        encode: (value) => value.toJson(),
-        decode: (payload) => WelcomePreparation.fromJson(
-          _stemPayloadMap(payload, "WelcomePreparation"),
-        ),
+      PayloadCodec.json(
+        decode: WelcomePreparation.fromJson,
+        typeName: "WelcomePreparation",
      );
   static final PayloadCodec<ContextCaptureResult> contextCaptureResult =
-      PayloadCodec(
-        encode: (value) => value.toJson(),
-        decode: (payload) => ContextCaptureResult.fromJson(
-          _stemPayloadMap(payload, "ContextCaptureResult"),
-        ),
+      PayloadCodec.json(
+        decode: ContextCaptureResult.fromJson,
+        typeName: "ContextCaptureResult",
       );
   static final PayloadCodec<EmailDispatch> emailDispatch =
-      PayloadCodec(
-        encode: (value) => value.toJson(),
-        decode: (payload) =>
-            EmailDispatch.fromJson(_stemPayloadMap(payload, "EmailDispatch")),
+      PayloadCodec.json(
+        decode: EmailDispatch.fromJson,
+        typeName: "EmailDispatch",
       );
   static final PayloadCodec<EmailDeliveryReceipt> emailDeliveryReceipt =
-      PayloadCodec(
-        encode: (value) => value.toJson(),
-        decode: (payload) => EmailDeliveryReceipt.fromJson(
-          _stemPayloadMap(payload, "EmailDeliveryReceipt"),
-        ),
+      PayloadCodec.json(
+        decode: EmailDeliveryReceipt.fromJson,
+        typeName: "EmailDeliveryReceipt",
       );
 }
 
@@ -72,7 +43,7 @@ final List _stemFlows = [
     final impl = AnnotatedFlowWorkflow();
     flow.step<Map<String, Object?>?>(
       "start",
-      (ctx) => impl.start(ctx),
+      (ctx) => impl.start(context: ctx),
       kind: WorkflowStepKind.task,
       taskNames: [],
     );
@@ -129,12 +100,12 @@ class _StemScriptProxy1 extends AnnotatedContextScriptWorkflow {
   final WorkflowScriptContext _script;
 
   @override
   Future<ContextCaptureResult> captureContext(
-    WorkflowScriptStepContext context,
-    WelcomeRequest request,
-  ) {
+    WelcomeRequest request, {
+    WorkflowExecutionContext? context,
+  }) {
     return _script.step(
       "capture-context",
-      (context) => super.captureContext(context, request),
+      (context) => super.captureContext(request, context: context),
     );
   }
 
@@ -159,34 +130,29 @@ final List<WorkflowScript> _stemScripts = [
   WorkflowScript(
     name: "annotated.script",
     checkpoints: [
-      FlowStep.typed(
+      WorkflowCheckpoint.typed(
        name: "prepare-welcome",
-        handler: _stemScriptManifestStepNoop,
         valueCodec: StemPayloadCodecs.welcomePreparation,
         kind: WorkflowStepKind.task,
         taskNames: [],
       ),
-      FlowStep(
+      WorkflowCheckpoint(
         name: "normalize-email",
-        handler: _stemScriptManifestStepNoop,
         kind: WorkflowStepKind.task,
         taskNames: [],
       ),
-      FlowStep(
+      WorkflowCheckpoint(
         name: "build-welcome-subject",
-        handler: _stemScriptManifestStepNoop,
         kind: WorkflowStepKind.task,
         taskNames: [],
       ),
-      FlowStep(
+      WorkflowCheckpoint(
         name: "deliver-welcome",
-        handler: _stemScriptManifestStepNoop,
         kind: WorkflowStepKind.task,
         taskNames: [],
       ),
-      FlowStep(
+      WorkflowCheckpoint(
         name: "build-follow-up",
-        handler: _stemScriptManifestStepNoop,
         kind: WorkflowStepKind.task,
         taskNames: [],
       ),
@@ -201,62 +167,54 @@ final List<WorkflowScript> _stemScripts = [
   WorkflowScript(
     name: "annotated.context_script",
     checkpoints: [
-      FlowStep.typed(
+      WorkflowCheckpoint.typed(
         name: "capture-context",
-        handler: _stemScriptManifestStepNoop,
         valueCodec: StemPayloadCodecs.contextCaptureResult,
         kind: WorkflowStepKind.task,
         taskNames: [],
       ),
-      FlowStep(
+      WorkflowCheckpoint(
         name: "normalize-email",
-        handler: _stemScriptManifestStepNoop,
         kind: WorkflowStepKind.task,
         taskNames: [],
       ),
-      FlowStep(
+      WorkflowCheckpoint(
         name: "build-welcome-subject",
-        handler: _stemScriptManifestStepNoop,
         kind: WorkflowStepKind.task,
         taskNames: [],
       ),
     ],
     resultCodec: StemPayloadCodecs.contextCaptureResult,
     run: (script) => _StemScriptProxy1(script).run(
-      script,
       StemPayloadCodecs.welcomeRequest.decode(
         _stemRequireArg(script.params, "request"),
       ),
+      context: script,
     ),
   ),
 ];
 
 abstract final class StemWorkflowDefinitions {
-  static final WorkflowRef<Map<String, Object?>, Map<String, Object?>?> flow =
-      WorkflowRef<Map<String, Object?>, Map<String, Object?>?>(
-        name: "annotated.flow",
-        encodeParams: (params) => params,
+  static final NoArgsWorkflowRef<Map<String, Object?>?> flow =
+      NoArgsWorkflowRef<Map<String, Object?>?>(name: "annotated.flow");
+  static final WorkflowRef<WelcomeRequest, WelcomeWorkflowResult> script =
+      WorkflowRef<WelcomeRequest, WelcomeWorkflowResult>(
+        name: "annotated.script",
+        encodeParams: (params) => {
+          "request": StemPayloadCodecs.welcomeRequest.encode(params),
+        },
+        decodeResult: StemPayloadCodecs.welcomeWorkflowResult.decode,
+      );
+  static final WorkflowRef<WelcomeRequest, ContextCaptureResult> contextScript =
+      WorkflowRef<WelcomeRequest, ContextCaptureResult>(
+        name: "annotated.context_script",
+        encodeParams: (params) => {
+          "request": StemPayloadCodecs.welcomeRequest.encode(params),
+        },
+        decodeResult: StemPayloadCodecs.contextCaptureResult.decode,
       );
-  static final WorkflowRef<({WelcomeRequest request}), WelcomeWorkflowResult>
-  script = WorkflowRef<({WelcomeRequest request}), WelcomeWorkflowResult>(
-    name: "annotated.script",
-    encodeParams: (params) => {
-      "request": StemPayloadCodecs.welcomeRequest.encode(params.request),
-    },
-    decodeResult: StemPayloadCodecs.welcomeWorkflowResult.decode,
-  );
-  static final WorkflowRef<({WelcomeRequest request}), ContextCaptureResult>
-  contextScript = WorkflowRef<({WelcomeRequest request}), ContextCaptureResult>(
-    name: "annotated.context_script",
-    encodeParams: (params) => {
-      "request": StemPayloadCodecs.welcomeRequest.encode(params.request),
-    },
-    decodeResult: StemPayloadCodecs.contextCaptureResult.decode,
-  );
 }
 
-Future<Object?> _stemScriptManifestStepNoop(FlowContext context) async => null;
-
 Object? _stemRequireArg(Map<String, Object?> args, String name) {
   if (!args.containsKey(name)) {
     throw ArgumentError('Missing required argument "$name".');
@@ -267,11 +225,18 @@ Object? _stemRequireArg(Map<String, Object?> args, String name) {
 Future<Object?> _stemTaskAdapter0(
   TaskInvocationContext context,
   Map<String, Object?> args,
+) async {
+  return await Future.value(sendEmail(args, context: context));
+}
+
+Future<Object?> _stemTaskAdapter1(
+  TaskInvocationContext context,
+  Map<String, Object?> args,
 ) async {
   return await Future.value(
     sendEmailTyped(
-      context,
       StemPayloadCodecs.emailDispatch.decode(_stemRequireArg(args, "dispatch")),
+      context: context,
     ),
   );
 }
@@ -284,95 +249,28 @@ abstract final class StemTaskDefinitions {
     defaultOptions: const TaskOptions(maxRetries: 1),
     metadata: const TaskMetadata(),
   );
-  static final TaskDefinition<({EmailDispatch dispatch}), EmailDeliveryReceipt>
-  sendEmailTyped =
-      TaskDefinition<({EmailDispatch dispatch}), EmailDeliveryReceipt>(
-        name: "send_email_typed",
-        encodeArgs: (args) => {
-          "dispatch": StemPayloadCodecs.emailDispatch.encode(args.dispatch),
-        },
-        defaultOptions: const TaskOptions(maxRetries: 1),
-        metadata: const TaskMetadata(),
-        decodeResult: StemPayloadCodecs.emailDeliveryReceipt.decode,
-      );
-}
-
-extension StemGeneratedTaskEnqueuer on TaskEnqueuer {
-  Future<String> enqueueSendEmail({
-    required Map<String, Object?> args,
-    Map<String, Object?> headers = const {},
-    TaskOptions? options,
-    DateTime? notBefore,
-    Map<String, Object?>? meta,
-    TaskEnqueueOptions? enqueueOptions,
-  }) {
-    return enqueueCall(
-      StemTaskDefinitions.sendEmail.call(
-        args,
-        headers: headers,
-        options: options,
-        notBefore: notBefore,
-        meta: meta,
-        enqueueOptions: enqueueOptions,
-      ),
-    );
-  }
-
-  Future<String> enqueueSendEmailTyped({
-    required EmailDispatch dispatch,
-    Map<String, Object?> headers = const {},
-    TaskOptions? options,
-    DateTime? notBefore,
-    Map<String, Object?>? meta,
-    TaskEnqueueOptions? enqueueOptions,
-  }) {
-    return enqueueCall(
-      StemTaskDefinitions.sendEmailTyped.call(
-        (dispatch: dispatch),
-        headers: headers,
-        options: options,
-        notBefore: notBefore,
-        meta: meta,
-        enqueueOptions: enqueueOptions,
-      ),
-    );
-  }
-}
-
-extension StemGeneratedTaskResults on Stem {
-  Future<TaskResult<Object?>?> waitForSendEmail(
-    String taskId, {
-    Duration? timeout,
-  }) {
-    return waitForTaskDefinition(
-      taskId,
-      StemTaskDefinitions.sendEmail,
-      timeout: timeout,
-    );
-  }
-
-  Future<TaskResult<EmailDeliveryReceipt>?> waitForSendEmailTyped(
-    String taskId, {
-    Duration? timeout,
-  }) {
-    return waitForTaskDefinition(
-      taskId,
-      StemTaskDefinitions.sendEmailTyped,
-      timeout: timeout,
-    );
-  }
+  static final TaskDefinition<EmailDispatch, EmailDeliveryReceipt>
+  sendEmailTyped = TaskDefinition<EmailDispatch, EmailDeliveryReceipt>(
+    name: "send_email_typed",
+    encodeArgs: (args) => {
+      "dispatch": StemPayloadCodecs.emailDispatch.encode(args),
+    },
+    defaultOptions: const TaskOptions(maxRetries: 1),
+    metadata: const TaskMetadata(),
+    decodeResult: StemPayloadCodecs.emailDeliveryReceipt.decode,
+  );
 }
 
 final List<TaskHandler<dynamic>> _stemTasks = <TaskHandler<dynamic>>[
   FunctionTaskHandler(
     name: "send_email",
-    entrypoint: sendEmail,
+    entrypoint: _stemTaskAdapter0,
     options: const TaskOptions(maxRetries: 1),
     metadata: const TaskMetadata(),
   ),
   FunctionTaskHandler(
     name: "send_email_typed",
-    entrypoint: _stemTaskAdapter0,
+    entrypoint: _stemTaskAdapter1,
     options: const TaskOptions(maxRetries: 1),
     metadata: TaskMetadata(
       tags: [],
diff --git a/packages/stem/example/autoscaling_demo/bin/producer.dart b/packages/stem/example/autoscaling_demo/bin/producer.dart
index 567e729a..f32053a9 100644
--- a/packages/stem/example/autoscaling_demo/bin/producer.dart
+++ b/packages/stem/example/autoscaling_demo/bin/producer.dart
@@ -1,14 +1,18 @@
 import 'dart:io';
 
 import 'package:stem/stem.dart';
+import 'package:stem_redis/stem_redis.dart';
 
 import 'package:stem_autoscaling_demo/shared.dart';
 
 Future<void> main() async {
   final config = StemConfig.fromEnvironment();
-  final broker = await connectBroker(config.brokerUrl, tls: config.tls);
   final backendUrl = config.resultBackendUrl ?? config.brokerUrl;
-  final backend = await connectBackend(backendUrl, tls: config.tls);
-  final tasks = buildTasks();
+  final client = await StemClient.fromUrl(
+    config.brokerUrl,
+    adapters: [StemRedisAdapter(tls: config.tls)],
+    overrides: StemStoreOverrides(backend: backendUrl),
+    tasks: buildTasks(),
+  );
 
   final taskCount = _parseInt('TASKS', fallback: 48, min: 1);
   final burst = _parseInt('BURST', fallback: 12, min: 1);
@@ -21,7 +25,6 @@ Future<void> main() async {
     'tasks=$taskCount burst=$burst pauseMs=$pauseMs durationMs=$durationMs',
   );
 
-  final stem = Stem(broker: broker, tasks: tasks, backend: backend);
   const options = TaskOptions(queue: autoscaleQueue);
 
   if (initialDelayMs > 0) {
@@ -30,7 +33,7 @@ Future<void> main() async {
 
   for (var i = 0; i < taskCount; i += 1) {
     final label = 'job-${i + 1}';
-    final id = await stem.enqueue(
+    final id = await client.enqueue(
       'autoscale.work',
       options: options,
       args: {'label': label, 'durationMs': durationMs},
@@ -41,8 +44,7 @@ Future<void> main() async {
     }
   }
 
-  await broker.close();
-  await backend.close();
+  await client.close();
 }
 
 int _parseInt(String key, {required int fallback, int min = 0}) {
diff --git a/packages/stem/example/canvas_patterns/chain_example.dart b/packages/stem/example/canvas_patterns/chain_example.dart
index 67e49084..18102ef6 100644
--- a/packages/stem/example/canvas_patterns/chain_example.dart
+++ b/packages/stem/example/canvas_patterns/chain_example.dart
@@ -3,8 +3,6 @@ import 'dart:async';
 import 'package:stem/stem.dart';
 
 Future<void> main() async {
-  final broker = InMemoryBroker();
-  final backend = InMemoryResultBackend();
   final tasks = <TaskHandler<dynamic>>[
     FunctionTaskHandler(
       name: 'fetch.user',
@@ -13,48 +11,46 @@ Future<void> main() async {
     FunctionTaskHandler(
       name: 'enrich.user',
       entrypoint: (context, args) async {
-        final prev = context.meta['chainPrevResult'] as String? ?? 'Friend';
+        final prev = context.meta.valueOr<String>('chainPrevResult', 'Friend');
         return '$prev Lovelace';
       },
     ),
     FunctionTaskHandler(
       name: 'send.email',
       entrypoint: (context, args) async {
-        final fullName = context.meta['chainPrevResult'] as String? ?? 'Friend';
+        final fullName = context.meta.valueOr<String>(
+          'chainPrevResult',
+          'Friend',
+        );
         print('Sending email to $fullName');
         return null;
       },
     ),
   ];
 
-  final worker = Worker(
-    broker: broker,
-    backend: backend,
+  final app = await StemApp.inMemory(
     tasks: tasks,
-    consumerName: 'chain-worker',
-    concurrency: 1,
-    prefetchMultiplier: 1,
+    workerConfig: const StemWorkerConfig(
+      consumerName: 'chain-worker',
+      concurrency: 1,
+      prefetchMultiplier: 1,
+    ),
   );
-  await worker.start();
-
-  final canvas = Canvas(broker: broker, backend: backend, tasks: tasks);
-  final chainResult = await canvas.chain([
+  final chainResult = await app.canvas.chain([
     task('fetch.user'),
     task('enrich.user'),
     task('send.email'),
   ]);
 
   await _waitFor(() async {
-    final status = await backend.get(chainResult.finalTaskId);
+    final status = await app.getTaskStatus(chainResult.finalTaskId);
     return status?.state == TaskState.succeeded;
   });
 
-  final status = await backend.get(chainResult.finalTaskId);
+  final status = await app.getTaskStatus(chainResult.finalTaskId);
   print('Chain completed with state: ${status?.state}');
 
-  await worker.shutdown();
-  await backend.close();
-  await broker.close();
+  await app.shutdown();
 }
 
 Future<void> _waitFor(
diff --git a/packages/stem/example/canvas_patterns/chord_example.dart b/packages/stem/example/canvas_patterns/chord_example.dart
index 4d9034a5..f504e1ec 100644
--- a/packages/stem/example/canvas_patterns/chord_example.dart
+++ b/packages/stem/example/canvas_patterns/chord_example.dart
@@ -3,24 +3,21 @@ import 'dart:async';
 import 'package:stem/stem.dart';
 
 Future<void> main() async {
-  final broker = InMemoryBroker();
-  final backend = InMemoryResultBackend();
   final tasks = <TaskHandler<dynamic>>[
     FunctionTaskHandler(
       name: 'fetch.metric',
       entrypoint: (context, args) async {
         await Future.delayed(const Duration(milliseconds: 40));
-        return args['value'] as int;
+        return args.requiredValue<int>('value');
       },
     ),
     FunctionTaskHandler(
       name: 'aggregate.metric',
       entrypoint: (context, args) async {
-        final values =
-            (context.meta['chordResults'] as List?)
-                ?.whereType<int>()
-                .toList() ??
-            const <int>[];
+        final values = context.meta.valueListOr<int>(
+          'chordResults',
+          const <int>[],
+        );
         final sum = values.fold(0, (a, b) => a + b);
         print('Aggregated result: $sum');
         return null;
@@ -28,18 +25,15 @@ Future<void> main() async {
     ),
   ];
 
-  final worker = Worker(
-    broker: broker,
-    backend: backend,
+  final app = await StemApp.inMemory(
     tasks: tasks,
-    consumerName: 'chord-worker',
-    concurrency: 3,
-    prefetchMultiplier: 1,
+    workerConfig: const StemWorkerConfig(
+      consumerName: 'chord-worker',
+      concurrency: 3,
+      prefetchMultiplier: 1,
+    ),
   );
-  await worker.start();
-
-  final canvas = Canvas(broker: broker, backend: backend, tasks: tasks);
-  final chordResult = await canvas.chord(
+  final chordResult = await app.canvas.chord(
     body: [
       task('fetch.metric', args: {'value': 5}),
       task('fetch.metric', args: {'value': 7}),
@@ -51,16 +45,14 @@ Future<void> main() async {
 
   final callbackId = chordResult.callbackTaskId;
   await _waitFor(() async {
-    final status = await backend.get(callbackId);
+    final status = await app.getTaskStatus(callbackId);
     return status?.state == TaskState.succeeded;
   });
 
-  final callbackStatus = await backend.get(callbackId);
+  final callbackStatus = await app.getTaskStatus(callbackId);
   print('Callback state: ${callbackStatus?.state}');
 
-  await worker.shutdown();
-  await backend.close();
-  await broker.close();
+  await app.shutdown();
 }
 
 Future<void> _waitFor(
diff --git a/packages/stem/example/canvas_patterns/group_example.dart b/packages/stem/example/canvas_patterns/group_example.dart
index 4ab069a8..37e95003 100644
--- a/packages/stem/example/canvas_patterns/group_example.dart
+++ b/packages/stem/example/canvas_patterns/group_example.dart
@@ -1,8 +1,6 @@
 import 'package:stem/stem.dart';
 
 Future<void> main() async {
-  final broker = InMemoryBroker();
-  final backend = InMemoryResultBackend();
   final tasks = <TaskHandler<dynamic>>[
     FunctionTaskHandler(
       name: 'square',
@@ -14,20 +12,16 @@ Future<void> main() async {
     ),
   ];
 
-  final worker = Worker(
-    broker: broker,
-    backend: backend,
+  final app = await StemApp.inMemory(
     tasks: tasks,
-    consumerName: 'group-worker',
-    concurrency: 2,
-    prefetchMultiplier: 1,
+    workerConfig: const StemWorkerConfig(
+      consumerName: 'group-worker',
+      concurrency: 2,
+      prefetchMultiplier: 1,
+    ),
   );
-  await worker.start();
-
-  final canvas = Canvas(broker: broker, backend: backend, tasks: tasks);
   const groupHandle = 'squares-demo';
-  await backend.initGroup(GroupDescriptor(id: groupHandle, expected: 3));
-  final dispatch = await canvas.group([
+  final dispatch = await app.canvas.group([
     task('square', args: {'value': 2}),
     task('square', args: {'value': 3}),
     task('square', args: {'value': 4}),
@@ -38,11 +32,9 @@ Future<void> main() async {
       .where((value) => value != null)
       .cast<int>()
       .toList();
-  final status = await backend.getGroup(groupHandle);
+  final status = await app.getGroupStatus(groupHandle);
   print('Group results: $squares (backend count: ${status?.results.length})');
 
   await dispatch.dispose();
-  await worker.shutdown();
-  await backend.close();
-  await broker.close();
+  await app.shutdown();
 }
diff --git a/packages/stem/example/dlq_sandbox/bin/producer.dart b/packages/stem/example/dlq_sandbox/bin/producer.dart
index f900c843..ced5bbd3 100644
--- a/packages/stem/example/dlq_sandbox/bin/producer.dart
+++ b/packages/stem/example/dlq_sandbox/bin/producer.dart
@@ -1,6 +1,7 @@
 import 'dart:io';
 
 import 'package:stem/stem.dart';
+import 'package:stem_redis/stem_redis.dart';
 import 'package:stem_dlq_sandbox/shared.dart';
 
 Future<void> main() async {
@@ -11,13 +12,12 @@ Future<void> main() async {
 
   stdout.writeln('[producer] connecting broker=$brokerUrl backend=$backendUrl');
 
-  final broker = await connectBroker(brokerUrl);
-  final backend = await connectBackend(backendUrl);
   final tasks = buildTasks();
-  final stem = buildStem(
-    broker: broker,
+  final client = await StemClient.fromUrl(
+    brokerUrl,
+    adapters: const [StemRedisAdapter()],
+    overrides: StemStoreOverrides(backend: backendUrl),
     tasks: tasks,
-    backend: backend,
   );
 
   final invoices = List.generate(
@@ -29,7 +29,7 @@ Future<void> main() async {
       '[producer] enqueueing invoices $invoices (all expected to fail first)');
   // #region dlq-producer-enqueue
   for (final invoice in invoices) {
-    final id = await stem.enqueue(
+    final id = await client.enqueue(
       taskName(),
       args: {
         'invoiceId': invoice,
@@ -49,7 +49,6 @@ Future<void> main() async {
 
   stdout.writeln('[producer] jobs queued. Waiting 3s before exit...');
   await Future.delayed(const Duration(seconds: 3));
-  await broker.close();
-  await backend.close();
+  await client.close();
   stdout.writeln('[producer] done.');
 }
diff --git a/packages/stem/example/dlq_sandbox/lib/shared.dart b/packages/stem/example/dlq_sandbox/lib/shared.dart
index 779787ff..e6deeb80 100644
--- a/packages/stem/example/dlq_sandbox/lib/shared.dart
+++ b/packages/stem/example/dlq_sandbox/lib/shared.dart
@@ -20,18 +20,6 @@ List<TaskHandler<dynamic>> buildTasks() => [
     ),
   ];
 
-Stem buildStem({
-  required Broker broker,
-  required Iterable<TaskHandler<dynamic>> tasks,
-  ResultBackend? backend,
-}) {
-  return Stem(
-    broker: broker,
-    tasks: tasks,
-    backend: backend,
-  );
-}
-
 Future<Broker> connectBroker(String uri) => RedisStreamsBroker.connect(uri);
diff --git a/packages/stem/example/docs_snippets/lib/best_practices.dart b/packages/stem/example/docs_snippets/lib/best_practices.dart
index 84f16cc4..7f4e086b 100644
--- a/packages/stem/example/docs_snippets/lib/best_practices.dart
+++ b/packages/stem/example/docs_snippets/lib/best_practices.dart
@@ -1,8 +1,6 @@
 // Best practices snippets for documentation.
 // ignore_for_file: unused_local_variable, unused_import, dead_code, avoid_print
 
-import 'dart:async';
-
 import 'package:stem/stem.dart';
 
 // #region best-practices-task
@@ -25,8 +23,8 @@ class IdempotentTask extends TaskHandler {
 // #endregion best-practices-task
 
 // #region best-practices-enqueue
-Future<void> enqueueTyped(Stem stem) async {
-  await stem.enqueue(
+Future<void> enqueueTyped(TaskEnqueuer enqueuer) async {
+  await enqueuer.enqueue(
     'orders.sync',
     args: {'orderId': 'order-42'},
     meta: {'requestId': 'req-001'},
@@ -35,22 +33,10 @@ Future<void> enqueueTyped(Stem stem) async {
 // #endregion best-practices-enqueue
 
 Future<void> main() async {
-  final broker = InMemoryBroker();
-  final backend = InMemoryResultBackend();
-  final tasks = [IdempotentTask()];
-  final stem = Stem(broker: broker, backend: backend, tasks: tasks);
-
-  final worker = Worker(
-    broker: broker,
-    backend: backend,
-    tasks: tasks,
-    queue: 'default',
+  final app = await StemApp.inMemory(
+    tasks: [IdempotentTask()],
   );
-  unawaited(worker.start());
-  await enqueueTyped(stem);
-  await Future.delayed(const Duration(milliseconds: 200));
-  await worker.shutdown();
-  await broker.close();
-  await backend.close();
+  await enqueueTyped(app);
+  await app.close();
 }
diff --git a/packages/stem/example/docs_snippets/lib/canvas_batch.dart b/packages/stem/example/docs_snippets/lib/canvas_batch.dart
index 3374c6b8..698996b0 100644
--- a/packages/stem/example/docs_snippets/lib/canvas_batch.dart
+++ b/packages/stem/example/docs_snippets/lib/canvas_batch.dart
@@ -21,7 +21,6 @@ Future<void> main() async {
       prefetchMultiplier: 1,
     ),
   );
-  await app.start();
 
   final submission = await app.canvas.submitBatch([
     task('batch.double', args: {'value': 1}),
diff --git a/packages/stem/example/docs_snippets/lib/canvas_chain.dart b/packages/stem/example/docs_snippets/lib/canvas_chain.dart
index 4aa353f8..27d9b34b 100644
--- a/packages/stem/example/docs_snippets/lib/canvas_chain.dart
+++ b/packages/stem/example/docs_snippets/lib/canvas_chain.dart
@@ -1,5 +1,5 @@
 // Canvas chain example for documentation.
-// ignore_for_file: unused_local_variable, unused_import, dead_code, avoid_print
+// ignore_for_file: avoid_print
 
 import 'dart:async';
 
@@ -16,15 +16,20 @@ Future<void> main() async {
     FunctionTaskHandler(
       name: 'enrich.user',
       entrypoint: (context, args) async {
-        final prev = context.meta['chainPrevResult'] as String? ?? 'Friend';
+        final prev = context.meta.valueOr<String>(
+          'chainPrevResult',
+          'Friend',
+        );
         return '$prev Lovelace';
       },
     ),
     FunctionTaskHandler(
       name: 'send.email',
       entrypoint: (context, args) async {
-        final fullName =
-            context.meta['chainPrevResult'] as String? ?? 'Friend';
+        final fullName = context.meta.valueOr<String>(
+          'chainPrevResult',
+          'Friend',
+        );
         print('Sending email to $fullName');
         return null;
       },
@@ -36,7 +41,6 @@ Future<void> main() async {
       prefetchMultiplier: 1,
     ),
   );
-  await app.start();
 
   final canvas = app.canvas;
   final chainResult = await canvas.chain([
diff --git a/packages/stem/example/docs_snippets/lib/canvas_chord.dart b/packages/stem/example/docs_snippets/lib/canvas_chord.dart
index eb19d664..84f77f55 100644
--- a/packages/stem/example/docs_snippets/lib/canvas_chord.dart
+++ b/packages/stem/example/docs_snippets/lib/canvas_chord.dart
@@ -1,5 +1,5 @@
 // Canvas chord example for documentation.
-// ignore_for_file: unused_local_variable, unused_import, dead_code, avoid_print
+// ignore_for_file: avoid_print
 
 import 'dart:async';
 
@@ -13,17 +13,16 @@ Future<void> main() async {
       name: 'fetch.metric',
       entrypoint: (context, args) async {
         await Future.delayed(const Duration(milliseconds: 40));
-        return args['value'] as int;
+        return args.requiredValue<int>('value');
       },
     ),
     FunctionTaskHandler(
       name: 'aggregate.metric',
       entrypoint: (context, args) async {
-        final values =
-            (context.meta['chordResults'] as List?)
-                ?.whereType<int>()
-                .toList() ??
- const []; + final values = context.meta.valueListOr( + 'chordResults', + const [], + ); final sum = values.fold(0, (a, b) => a + b); print('Aggregated result: $sum'); return null; @@ -36,7 +35,6 @@ Future main() async { prefetchMultiplier: 1, ), ); - await app.start(); final canvas = app.canvas; final chordResult = await canvas.chord( diff --git a/packages/stem/example/docs_snippets/lib/canvas_group.dart b/packages/stem/example/docs_snippets/lib/canvas_group.dart index 464d0977..c3de32af 100644 --- a/packages/stem/example/docs_snippets/lib/canvas_group.dart +++ b/packages/stem/example/docs_snippets/lib/canvas_group.dart @@ -1,5 +1,5 @@ // Canvas group example for documentation. -// ignore_for_file: unused_local_variable, unused_import, dead_code, avoid_print +// ignore_for_file: avoid_print import 'dart:async'; @@ -12,7 +12,7 @@ Future main() async { FunctionTaskHandler( name: 'square', entrypoint: (context, args) async { - final value = args['value'] as int; + final value = args.requiredValue('value'); await Future.delayed(const Duration(milliseconds: 50)); return value * value; }, @@ -24,23 +24,21 @@ Future main() async { prefetchMultiplier: 1, ), ); - await app.start(); final canvas = app.canvas; - const groupHandle = 'squares-demo'; - await canvas.group([ + final dispatch = await canvas.group([ task('square', args: {'value': 2}), task('square', args: {'value': 3}), task('square', args: {'value': 4}), - ], groupId: groupHandle); + ]); await _waitFor(() async { - final status = await app.backend.getGroup(groupHandle); + final status = await app.getGroupStatus(dispatch.groupId); return status?.results.length == 3; }); - final groupStatus = await app.backend.getGroup(groupHandle); - final values = groupStatus?.results.values.map((s) => s.payload).toList(); + final groupStatus = await app.getGroupStatus(dispatch.groupId); + final values = groupStatus?.resultValues().values.toList(); print('Group results: $values'); await app.close(); diff --git 
a/packages/stem/example/docs_snippets/lib/developer_environment.dart b/packages/stem/example/docs_snippets/lib/developer_environment.dart index df2b60b2..bb4f13a9 100644 --- a/packages/stem/example/docs_snippets/lib/developer_environment.dart +++ b/packages/stem/example/docs_snippets/lib/developer_environment.dart @@ -13,25 +13,22 @@ Future bootstrapStem(List<TaskHandler> tasks) async { // #endregion dev-env-config // #region dev-env-adapters - final broker = await RedisStreamsBroker.connect( + final stack = StemStack.fromUrl( config.brokerUrl, - tls: config.tls, - ); - final backend = await RedisResultBackend.connect( - _resolveRedisUrl(config.brokerUrl, config.resultBackendUrl, 1), - tls: config.tls, - ); - final revokeStore = await RedisRevokeStore.connect( - _resolveRedisUrl(config.brokerUrl, config.revokeStoreUrl, 2), + adapters: const [StemRedisAdapter()], + overrides: StemStoreOverrides( + backend: _resolveRedisUrl(config.brokerUrl, config.resultBackendUrl, 1), + revoke: _resolveRedisUrl(config.brokerUrl, config.revokeStoreUrl, 2), + ), + requireRevokeStore: true, ); + final revokeStore = await stack.revokeStore!.create(); final routing = await _loadRoutingRegistry(config); final rateLimiter = await connectRateLimiter(config); // #endregion dev-env-adapters // #region dev-env-stem - final stem = Stem( - broker: broker, - backend: backend, + final client = await stack.createClient( tasks: tasks, routing: routing, );
workerConfig: StemWorkerConfig( + revokeStore: revokeStore, + rateLimiter: rateLimiter, + queue: config.defaultQueue, + subscription: subscription, + concurrency: 8, + autoscale: const WorkerAutoscaleConfig( + enabled: true, + minConcurrency: 2, + maxConcurrency: 16, + backlogPerIsolate: 2.0, + idlePeriod: Duration(seconds: 45), + ), ), ); // #endregion dev-env-worker return Bootstrap( - stem: stem, + client: client, worker: worker, config: config, rateLimiter: rateLimiter, @@ -68,13 +64,13 @@ class Bootstrap { Bootstrap({ - required this.stem, + required this.client, required this.worker, required this.config, required this.rateLimiter, }); - final Stem stem; + final StemClient client; final Worker worker; final StemConfig config; final RateLimiter? rateLimiter; @@ -86,17 +82,7 @@ Future runCanvasFlows( Bootstrap bootstrap, List<TaskHandler> tasks, ) async { - final canvas = Canvas( - broker: bootstrap.stem.broker, - backend: await RedisResultBackend.connect( - _resolveRedisUrl( - bootstrap.config.brokerUrl, - bootstrap.config.resultBackendUrl, - 1, - ), - ), - tasks: tasks, - ); + final canvas = bootstrap.client.createCanvas(tasks: tasks); final ids = await canvas.group([ task('media.resize', args: {'file': 'hero.png'}), @@ -122,15 +108,21 @@ Future runCanvasFlows( // #region dev-env-status Future inspectChordStatus(String chordId) async { - final backend = await RedisResultBackend.connect( - _resolveRedisUrl( - Platform.environment['STEM_BROKER_URL']!, - Platform.environment['STEM_RESULT_BACKEND_URL'], - 1, + final config = StemConfig.fromEnvironment(Platform.environment); + final client = await StemClient.fromUrl( + config.brokerUrl, + adapters: const [StemRedisAdapter()], + overrides: StemStoreOverrides( + backend: _resolveRedisUrl( + config.brokerUrl, + config.resultBackendUrl, + 1, + ), ), ); - final status = await backend.get(chordId); + final status = await client.getTaskStatus(chordId); print('Chord completion state:
${status?.state}'); + await client.close(); } // #endregion dev-env-status @@ -221,4 +213,5 @@ Future main() async { await runCanvasFlows(bootstrap, tasks); await Future.delayed(const Duration(seconds: 1)); await bootstrap.worker.shutdown(); + await bootstrap.client.close(); } diff --git a/packages/stem/example/docs_snippets/lib/first_steps.dart b/packages/stem/example/docs_snippets/lib/first_steps.dart index 80ecc6e8..fc19c2ca 100644 --- a/packages/stem/example/docs_snippets/lib/first_steps.dart +++ b/packages/stem/example/docs_snippets/lib/first_steps.dart @@ -31,11 +31,10 @@ Future runInMemoryDemo() async { consumerName: 'first-steps-worker', ), ); - await app.start(); // #endregion first-steps-bootstrap // #region first-steps-enqueue - final taskId = await app.stem.enqueue( + final taskId = await app.enqueue( 'email.send', args: {'to': 'hello@example.com'}, ); @@ -43,7 +42,7 @@ Future runInMemoryDemo() async { // #endregion first-steps-enqueue // #region first-steps-results - final result = await app.stem.waitForTask(taskId); + final result = await app.waitForTask(taskId); print('Task state: ${result?.status.state} value=${result?.value}'); // #endregion first-steps-results diff --git a/packages/stem/example/docs_snippets/lib/observability.dart b/packages/stem/example/docs_snippets/lib/observability.dart index b1d88e0b..a7d8b84e 100644 --- a/packages/stem/example/docs_snippets/lib/observability.dart +++ b/packages/stem/example/docs_snippets/lib/observability.dart @@ -1,7 +1,6 @@ // Observability snippets for documentation. 
// ignore_for_file: unused_local_variable, unused_import, dead_code, avoid_print -import 'package:contextual/contextual.dart'; import 'package:stem/stem.dart'; // #region observability-metrics @@ -11,16 +10,12 @@ void configureMetrics() { // #endregion observability-metrics // #region observability-tracing -Stem buildTracedStem( - Broker broker, - ResultBackend backend, +Future buildTracedStem( Iterable<TaskHandler> tasks, ) { // Configure OpenTelemetry globally; StemTracer.instance reads from it. final _ = StemTracer.instance; - return Stem( - broker: broker, - backend: backend, + return StemClient.inMemory( tasks: tasks, ); } @@ -48,6 +43,7 @@ void recordQueueDepth(String queue, int depth) { // #region observability-logging void logTaskStart(Envelope envelope) { + configureStemLogging(format: StemLogFormat.pretty); stemLogger.info( 'Task started', Context({'task': envelope.name, 'id': envelope.id}), @@ -57,6 +53,7 @@ final metrics = MetricsCollector(); final heartbeatGauge = GaugeMetric(); +final traceTaskDefinition = TaskDefinition.noArgs(name: 'demo.trace'); class MetricsCollector { void recordRetry({required Duration delay}) {} @@ -72,7 +69,7 @@ Future main() async { final tasks = [ FunctionTaskHandler( - name: 'demo.trace', + name: traceTaskDefinition.name, entrypoint: (context, args) async { print('Tracing demo task'); return null; @@ -80,17 +77,14 @@ ), ]; - final broker = InMemoryBroker(); - final backend = InMemoryResultBackend(); - final stem = buildTracedStem(broker, backend, tasks); + final client = await buildTracedStem(tasks); logTaskStart( Envelope( - name: 'demo.trace', + name: traceTaskDefinition.name, args: const {}, ), ); - await stem.enqueue('demo.trace', args: const {}); - await backend.close(); - await broker.close(); + await traceTaskDefinition.enqueue(client); + await client.close(); } diff --git a/packages/stem/example/docs_snippets/lib/persistence.dart
b/packages/stem/example/docs_snippets/lib/persistence.dart index 557c54ce..2a51f13f 100644 --- a/packages/stem/example/docs_snippets/lib/persistence.dart +++ b/packages/stem/example/docs_snippets/lib/persistence.dart @@ -9,9 +9,11 @@ import 'package:stem_postgres/stem_postgres.dart'; import 'package:stem_redis/stem_redis.dart'; import 'package:stem_sqlite/stem_sqlite.dart'; +final demoTaskDefinition = TaskDefinition.noArgs(name: 'demo'); + final demoTasks = [ FunctionTaskHandler( - name: 'demo', + name: demoTaskDefinition.name, entrypoint: (context, args) async { print('Handled demo task'); return null; @@ -21,63 +23,54 @@ final demoTasks = [ // #region persistence-backend-in-memory Future connectInMemoryBackend() async { - final broker = InMemoryBroker(); - final backend = InMemoryResultBackend(); - final stem = Stem( - broker: broker, - backend: backend, - tasks: demoTasks, - ); - await stem.enqueue('demo', args: {}); - await backend.close(); - await broker.close(); + final client = await StemClient.inMemory(tasks: demoTasks); + await demoTaskDefinition.enqueue(client); + await client.close(); } // #endregion persistence-backend-in-memory // #region persistence-backend-redis Future connectRedisBackend() async { - final backend = await RedisResultBackend.connect('redis://localhost:6379/1'); - final broker = await RedisStreamsBroker.connect('redis://localhost:6379'); - final stem = Stem( - broker: broker, - backend: backend, + final client = await StemClient.fromUrl( + 'redis://localhost:6379', + adapters: const [StemRedisAdapter()], + overrides: const StemStoreOverrides( + backend: 'redis://localhost:6379/1', + ), tasks: demoTasks, ); - await stem.enqueue('demo', args: {}); - await backend.close(); - await broker.close(); + await demoTaskDefinition.enqueue(client); + await client.close(); } // #endregion persistence-backend-redis // #region persistence-backend-postgres Future connectPostgresBackend() async { - final backend = await PostgresResultBackend.connect( - 
connectionString: 'postgres://postgres:postgres@localhost:5432/stem', - ); - final broker = await RedisStreamsBroker.connect('redis://localhost:6379'); - final stem = Stem( - broker: broker, - backend: backend, + final client = await StemClient.fromUrl( + 'redis://localhost:6379', + adapters: const [StemRedisAdapter(), StemPostgresAdapter()], + overrides: const StemStoreOverrides( + backend: 'postgres://postgres:postgres@localhost:5432/stem', + ), tasks: demoTasks, ); - await stem.enqueue('demo', args: {}); - await backend.close(); - await broker.close(); + await demoTaskDefinition.enqueue(client); + await client.close(); } // #endregion persistence-backend-postgres // #region persistence-backend-sqlite Future connectSqliteBackend() async { - final broker = await SqliteBroker.open(File('stem_broker.sqlite')); - final backend = await SqliteResultBackend.open(File('stem_backend.sqlite')); - final stem = Stem( - broker: broker, - backend: backend, + final client = await StemClient.fromUrl( + 'sqlite:///${File('stem_broker.sqlite').absolute.path}', + adapters: const [StemSqliteAdapter()], + overrides: StemStoreOverrides( + backend: 'sqlite:///${File('stem_backend.sqlite').absolute.path}', + ), tasks: demoTasks, ); - await stem.enqueue('demo', args: {}); - await backend.close(); - await broker.close(); + await demoTaskDefinition.enqueue(client); + await client.close(); } // #endregion persistence-backend-sqlite diff --git a/packages/stem/example/docs_snippets/lib/producer.dart b/packages/stem/example/docs_snippets/lib/producer.dart index 3105e629..9f8054ad 100644 --- a/packages/stem/example/docs_snippets/lib/producer.dart +++ b/packages/stem/example/docs_snippets/lib/producer.dart @@ -8,7 +8,7 @@ import 'package:stem_redis/stem_redis.dart'; // #region producer-in-memory Future enqueueInMemory() async { - final app = await StemApp.inMemory( + final client = await StemClient.inMemory( tasks: [ FunctionTaskHandler( name: 'hello.print', @@ -20,14 +20,17 @@ Future 
enqueueInMemory() async { ), ], ); + final app = await client.createApp(); - final taskId = await app.stem.enqueue( + final taskId = await app.enqueue( 'hello.print', args: {'name': 'Stem'}, ); + await app.waitForTask(taskId); print('Enqueued $taskId'); await app.close(); + await client.close(); } // #endregion producer-in-memory @@ -36,8 +39,6 @@ Future enqueueWithRedis() async { final brokerUrl = Platform.environment['STEM_BROKER_URL'] ?? 'redis://localhost:6379'; - final broker = await RedisStreamsBroker.connect(brokerUrl); - final backend = await RedisResultBackend.connect('$brokerUrl/1'); final tasks = [ FunctionTaskHandler( name: 'reports.generate', @@ -49,31 +50,26 @@ Future enqueueWithRedis() async { ), ]; - final stem = Stem( - broker: broker, - backend: backend, + final client = await StemClient.fromUrl( + brokerUrl, + adapters: const [StemRedisAdapter()], + overrides: StemStoreOverrides(backend: '$brokerUrl/1'), tasks: tasks, ); - await stem.enqueue( + await client.enqueue( 'reports.generate', args: {'reportId': 'monthly-2025-10'}, options: const TaskOptions(queue: 'reports', maxRetries: 3), meta: {'requestedBy': 'finance'}, ); - await backend.close(); - await broker.close(); + await client.close(); } // #endregion producer-redis // #region producer-signed Future enqueueWithSigning() async { final config = StemConfig.fromEnvironment(); - final broker = await RedisStreamsBroker.connect( - config.brokerUrl, - tls: config.tls, - ); - final backend = InMemoryResultBackend(); final tasks = [ FunctionTaskHandler( name: 'billing.charge', @@ -84,20 +80,20 @@ Future enqueueWithSigning() async { }, ), ]; - final stem = Stem( - broker: broker, - backend: backend, + final client = await StemClient.fromUrl( + config.brokerUrl, + adapters: const [StemRedisAdapter()], + overrides: const StemStoreOverrides(backend: 'memory://'), tasks: tasks, signer: PayloadSigner.maybe(config.signing), ); - await stem.enqueue( + await client.enqueue( 'billing.charge', args: 
{'customerId': 'cust_123', 'amount': 4200}, notBefore: DateTime.now().add(const Duration(minutes: 5)), ); - await backend.close(); - await broker.close(); + await client.close(); } // #endregion producer-signed @@ -122,25 +118,24 @@ class GenerateReportTask extends TaskHandler { @override Future call(TaskContext context, Map args) async { - final id = args['reportId'] as String; - return await generateReport(id); + final id = context.requiredArg('reportId'); + return generateReport(id); } } Future enqueueTyped() async { - final app = await StemApp.inMemory(tasks: [GenerateReportTask()]); - await app.start(); + final client = await StemClient.inMemory(tasks: [GenerateReportTask()]); + final app = await client.createApp(); - final call = GenerateReportTask.definition.call( + final result = await GenerateReportTask.definition.enqueueAndWait( + app, const ReportPayload(reportId: 'monthly-2025-10'), options: const TaskOptions(priority: 5), headers: const {'x-requested-by': 'analytics'}, ); - - final taskId = await app.stem.enqueueCall(call); - final result = await app.stem.waitForTask(taskId); print(result?.value); await app.close(); + await client.close(); } // #endregion producer-typed @@ -154,14 +149,16 @@ class AesPayloadEncoder extends TaskPayloadEncoder { } Future configureProducerEncoders() async { - final app = await StemApp.inMemory( + final client = await StemClient.inMemory( tasks: const [], argsEncoder: const AesPayloadEncoder(), resultEncoder: const JsonTaskPayloadEncoder(), additionalEncoders: const [CustomBinaryEncoder()], ); + final app = await client.createApp(); await app.close(); + await client.close(); } // #endregion producer-encoders diff --git a/packages/stem/example/docs_snippets/lib/production_checklist.dart b/packages/stem/example/docs_snippets/lib/production_checklist.dart index a78e310d..3ba785fd 100644 --- a/packages/stem/example/docs_snippets/lib/production_checklist.dart +++ b/packages/stem/example/docs_snippets/lib/production_checklist.dart
@@ -15,8 +15,6 @@ Future configureSigning() async { // #endregion production-signing-signer // #region production-signing-registry - final broker = InMemoryBroker(); - final backend = InMemoryResultBackend(); final tasks = [ FunctionTaskHandler( name: 'audit.log', @@ -29,33 +27,27 @@ Future configureSigning() async { // #endregion production-signing-registry // #region production-signing-runtime - // #region production-signing-stem - final stem = Stem( - broker: broker, - backend: backend, + // #region production-signing-client + final client = await StemClient.create( + broker: StemBrokerFactory.inMemory(), + backend: StemBackendFactory.inMemory(), tasks: tasks, signer: signer, ); - // #endregion production-signing-stem + // #endregion production-signing-client // #region production-signing-worker - final worker = Worker( - broker: broker, - backend: backend, - tasks: tasks, - signer: signer, - ); + final worker = await client.createWorker(); // #endregion production-signing-worker // #endregion production-signing-runtime // #region production-signing-enqueue - await stem.enqueue('audit.log', args: {'message': 'hello'}); + await client.enqueue('audit.log', args: {'message': 'hello'}); // #endregion production-signing-enqueue // #region production-signing-shutdown await worker.shutdown(); - await backend.close(); - await broker.close(); + await client.close(); // #endregion production-signing-shutdown } diff --git a/packages/stem/example/docs_snippets/lib/queue_events.dart b/packages/stem/example/docs_snippets/lib/queue_events.dart index 8b59f72c..31c2f3d2 100644 --- a/packages/stem/example/docs_snippets/lib/queue_events.dart +++ b/packages/stem/example/docs_snippets/lib/queue_events.dart @@ -1,5 +1,5 @@ // Queue custom event examples for documentation. 
-// ignore_for_file: unused_local_variable, unused_import, dead_code, avoid_print +// ignore_for_file: avoid_print import 'dart:async'; @@ -16,14 +16,17 @@ Future queueEventsProducerListener(Broker broker) async { await listener.start(); final subscription = listener.on('order.created').listen((event) { - print('Order created: ${event.payload['orderId']}'); + final created = event.payloadJson<_OrderCreatedEvent>( + decode: _OrderCreatedEvent.fromJson, + ); + print('Order created: ${created.orderId}'); print('Trace id: ${event.headers['x-trace-id']}'); }); - await producer.emit( + await producer.emitJson( 'orders', 'order.created', - payload: const {'orderId': 'ord-1001'}, + const _OrderCreatedEvent(orderId: 'ord-1001'), headers: const {'x-trace-id': 'trace-123'}, meta: const {'tenant': 'acme'}, ); @@ -51,13 +54,23 @@ Future queueEventsFanout(Broker broker) async { await listenerB.start(); final subscriptionA = listenerA.events.listen((event) { - print('A saw ${event.name}'); + final updated = event.payloadJson<_OrderUpdatedEvent>( + decode: _OrderUpdatedEvent.fromJson, + ); + print('A saw ${event.name} for ${updated.id}'); }); final subscriptionB = listenerB.events.listen((event) { - print('B saw ${event.name}'); + final updated = event.payloadJson<_OrderUpdatedEvent>( + decode: _OrderUpdatedEvent.fromJson, + ); + print('B saw ${event.name} for ${updated.id}'); }); - await producer.emit('orders', 'order.updated', payload: const {'id': 'o-1'}); + await producer.emitJson( + 'orders', + 'order.updated', + const _OrderUpdatedEvent(id: 'o-1'), + ); await Future.delayed(const Duration(milliseconds: 200)); await subscriptionA.cancel(); @@ -67,3 +80,27 @@ Future queueEventsFanout(Broker broker) async { } // #endregion queue-events-fanout + +class _OrderCreatedEvent { + const _OrderCreatedEvent({required this.orderId}); + + factory _OrderCreatedEvent.fromJson(Map json) { + return _OrderCreatedEvent(orderId: json['orderId'] as String); + } + + final String orderId; + + Map 
toJson() => {'orderId': orderId}; +} + +class _OrderUpdatedEvent { + const _OrderUpdatedEvent({required this.id}); + + factory _OrderUpdatedEvent.fromJson(Map json) { + return _OrderUpdatedEvent(id: json['id'] as String); + } + + final String id; + + Map toJson() => {'id': id}; +} diff --git a/packages/stem/example/docs_snippets/lib/quick_start.dart b/packages/stem/example/docs_snippets/lib/quick_start.dart index 8253ed70..23301649 100644 --- a/packages/stem/example/docs_snippets/lib/quick_start.dart +++ b/packages/stem/example/docs_snippets/lib/quick_start.dart @@ -64,24 +64,22 @@ Future main() async { concurrency: 4, ), ); - - unawaited(app.start()); - - final stem = app.stem; // #endregion quickstart-bootstrap // #region quickstart-enqueue - final resizeId = await stem.enqueue( + final resizeId = await app.enqueue( 'media.resize', args: {'file': 'report.png'}, ); - final emailId = await stem.enqueue( + final emailId = await app.enqueue( 'billing.email-receipt', args: {'to': 'alice@example.com'}, options: const TaskOptions(priority: 10), - notBefore: DateTime.now().add(const Duration(seconds: 5)), meta: {'orderId': 4242}, + enqueueOptions: const TaskEnqueueOptions( + countdown: Duration(seconds: 5), + ), ); print('Enqueued tasks: resize=$resizeId email=$emailId'); @@ -94,7 +92,7 @@ Future main() async { // #region quickstart-inspect await Future.delayed(const Duration(seconds: 6)); - final resizeStatus = await app.backend.get(resizeId); + final resizeStatus = await app.getTaskStatus(resizeId); print('Resize status: ${resizeStatus?.state} (${resizeStatus?.attempt})'); await app.close(); diff --git a/packages/stem/example/docs_snippets/lib/quick_start_failure.dart b/packages/stem/example/docs_snippets/lib/quick_start_failure.dart index e6d51906..2c8f2746 100644 --- a/packages/stem/example/docs_snippets/lib/quick_start_failure.dart +++ b/packages/stem/example/docs_snippets/lib/quick_start_failure.dart @@ -28,13 +28,12 @@ class EmailReceiptTask extends TaskHandler { 
Future main() async { final app = await StemApp.inMemory(tasks: [EmailReceiptTask()]); - await app.start(); - final taskId = await app.stem.enqueue( + final taskId = await app.enqueue( 'billing.email-receipt', args: {'to': 'demo@example.com'}, ); - final result = await app.stem.waitForTask( + final result = await app.waitForTask( taskId, timeout: const Duration(seconds: 5), ); diff --git a/packages/stem/example/docs_snippets/lib/rate_limiting.dart b/packages/stem/example/docs_snippets/lib/rate_limiting.dart index 701ec7c3..8d602c13 100644 --- a/packages/stem/example/docs_snippets/lib/rate_limiting.dart +++ b/packages/stem/example/docs_snippets/lib/rate_limiting.dart @@ -92,7 +92,7 @@ class GroupRateLimitedTask extends TaskHandler { // #endregion rate-limit-group-task-options // #region rate-limit-producer -Future enqueueRateLimited(Stem stem) async { +Future enqueueRateLimited(TaskEnqueuer stem) async { return stem.enqueue( 'demo.rateLimited', args: {'actor': 'acme'}, @@ -116,12 +116,8 @@ Future main() async { ); // #endregion rate-limit-demo-registry - // #region rate-limit-demo-worker-start - await app.start(); - // #endregion rate-limit-demo-worker-start - // #region rate-limit-demo-stem - final stem = app.stem; + final stem = app; // #endregion rate-limit-demo-stem // #region rate-limit-demo-enqueue await enqueueRateLimited(stem); diff --git a/packages/stem/example/docs_snippets/lib/retry_backoff.dart b/packages/stem/example/docs_snippets/lib/retry_backoff.dart index 6b2ae3c6..d72e6783 100644 --- a/packages/stem/example/docs_snippets/lib/retry_backoff.dart +++ b/packages/stem/example/docs_snippets/lib/retry_backoff.dart @@ -6,8 +6,10 @@ import 'dart:async'; import 'package:stem/stem.dart'; class FlakyTask extends TaskHandler { + static final definition = TaskDefinition.noArgs(name: 'demo.flaky'); + @override - String get name => 'demo.flaky'; + String get name => definition.name; // #region retry-backoff-task-options @override @@ -64,10 +66,9 @@ Future main() 
async { tasks: [FlakyTask()], workerConfig: workerConfig, ); - await app.start(); - final taskId = await app.stem.enqueue('demo.flaky'); - await app.stem.waitForTask(taskId, timeout: const Duration(seconds: 5)); + final taskId = await FlakyTask.definition.enqueue(app); + await app.waitForTask(taskId, timeout: const Duration(seconds: 5)); await app.close(); } diff --git a/packages/stem/example/docs_snippets/lib/routing.dart b/packages/stem/example/docs_snippets/lib/routing.dart index 4a0e8181..ff864448 100644 --- a/packages/stem/example/docs_snippets/lib/routing.dart +++ b/packages/stem/example/docs_snippets/lib/routing.dart @@ -41,7 +41,7 @@ final priorityRegistry = RoutingRegistry( // #endregion routing-priority-range // #region routing-bootstrap -Future<(Stem, Worker)> bootstrapStem() async { +Future<(StemClient, Worker)> bootstrapStem() async { final routing = await loadRouting(); final tasks = [EmailTask()]; final config = StemConfig.fromEnvironment(); @@ -52,21 +52,23 @@ Future<(Stem, Worker)> bootstrapStem() async { broadcastChannels: config.workerBroadcasts, ); - final stem = Stem( - broker: await RedisStreamsBroker.connect('redis://localhost:6379'), - backend: InMemoryResultBackend(), + final client = await StemClient.create( + broker: StemBrokerFactory( + create: () => RedisStreamsBroker.connect('redis://localhost:6379'), + dispose: (broker) => broker.close(), + ), + backend: StemBackendFactory.inMemory(), tasks: tasks, routing: routing, ); - final worker = Worker( - broker: await RedisStreamsBroker.connect('redis://localhost:6379'), - backend: InMemoryResultBackend(), - tasks: tasks, - subscription: subscription, + final worker = await client.createWorker( + workerConfig: StemWorkerConfig( + subscription: subscription, + ), ); - return (stem, worker); + return (client, worker); } class EmailTask extends TaskHandler { diff --git a/packages/stem/example/docs_snippets/lib/signals.dart b/packages/stem/example/docs_snippets/lib/signals.dart index 
97f7667f..e0b50328 100644 --- a/packages/stem/example/docs_snippets/lib/signals.dart +++ b/packages/stem/example/docs_snippets/lib/signals.dart @@ -5,6 +5,10 @@ import 'dart:async'; import 'package:stem/stem.dart'; +final signalsDemoTaskDefinition = TaskDefinition.noArgs( + name: 'signals.demo', +); + // #region signals-configure void configureSignals() { StemSignals.configure( @@ -59,7 +63,7 @@ List registerWorkerScopedSignals() { 'Task failed on worker ${payload.worker.id}: ${payload.taskName}', ); }, - taskName: 'signals.demo', + taskName: signalsDemoTaskDefinition.name, workerId: 'signals-worker', ), StemSignals.onControlCommandCompleted( @@ -141,7 +145,7 @@ Future main() async { 'memory://', tasks: [ FunctionTaskHandler( - name: 'signals.demo', + name: signalsDemoTaskDefinition.name, entrypoint: (context, args) async { print('Signals demo task'); return null; @@ -154,8 +158,7 @@ Future main() async { ), ); - unawaited(app.start()); - await app.stem.enqueue('signals.demo', args: const {}); + await signalsDemoTaskDefinition.enqueue(app); await Future.delayed(const Duration(milliseconds: 200)); await app.close(); diff --git a/packages/stem/example/docs_snippets/lib/signing.dart b/packages/stem/example/docs_snippets/lib/signing.dart index 8a808bb3..d8b5218a 100644 --- a/packages/stem/example/docs_snippets/lib/signing.dart +++ b/packages/stem/example/docs_snippets/lib/signing.dart @@ -1,4 +1,5 @@ // Signing examples for documentation. +// These snippets intentionally show the lower-level producer/worker/Beat wiring. 
 // ignore_for_file: unused_local_variable, unused_import, dead_code, avoid_print
 
 import 'dart:async';
@@ -98,8 +99,8 @@ void logActiveSigningKey() {
 // #endregion signing-rotation-producer-active-key
 
 // #region signing-rotation-producer-enqueue
-Future<void> enqueueDuringRotation(Stem stem) async {
-  await stem.enqueue(
+Future<void> enqueueDuringRotation(TaskEnqueuer enqueuer) async {
+  await enqueuer.enqueue(
     'billing.charge',
     args: {'customerId': 'cust_123', 'amount': 4200},
   );
diff --git a/packages/stem/example/docs_snippets/lib/tasks.dart b/packages/stem/example/docs_snippets/lib/tasks.dart
index 29a289b7..bd9caae4 100644
--- a/packages/stem/example/docs_snippets/lib/tasks.dart
+++ b/packages/stem/example/docs_snippets/lib/tasks.dart
@@ -15,7 +15,7 @@ class EmailTask extends TaskHandler {
   @override
   Future<Object?> call(TaskContext context, Map<String, Object?> args) async {
-    final to = args['to'] as String? ?? 'anonymous';
+    final to = context.argOr('to', 'anonymous');
     print('Emailing $to (attempt ${context.attempt})');
   }
 }
@@ -50,13 +50,20 @@ final redisTasks = [RedisEmailTask()];
 
 class InvoicePayload {
   const InvoicePayload({required this.invoiceId});
   final String invoiceId;
+
+  Map<String, Object?> toJson() => {'invoiceId': invoiceId};
+
+  factory InvoicePayload.fromJson(Map<String, Object?> json) {
+    return InvoicePayload(invoiceId: json['invoiceId']! as String);
+  }
 }
 
 class PublishInvoiceTask extends TaskHandler {
-  static final definition = TaskDefinition(
+  static final definition = TaskDefinition.json(
     name: 'invoice.publish',
-    encodeArgs: (payload) => {'invoiceId': payload.invoiceId},
-    metadata: const TaskMetadata(description: 'Publishes invoices downstream'),
+    metadata: const TaskMetadata(
+      description: 'Publishes invoices downstream',
+    ),
     defaultOptions: const TaskOptions(queue: 'billing'),
   );
@@ -68,34 +75,31 @@ class PublishInvoiceTask extends TaskHandler {
   @override
   Future<Object?> call(TaskContext context, Map<String, Object?> args) async {
-    final invoiceId = args['invoiceId'] as String;
+    final invoiceId = context.requiredArg('invoiceId');
     await publishInvoice(invoiceId);
   }
 }
 
 Future<void> runTypedDefinitionExample() async {
-  final broker = InMemoryBroker();
-  final backend = InMemoryResultBackend();
-  final stem = Stem(
-    broker: broker,
-    backend: backend,
+  final client = await StemClient.inMemory(
     tasks: [PublishInvoiceTask()],
   );
+  final app = await client.createApp();
 
-  final taskId = await stem.enqueueCall(
-    PublishInvoiceTask.definition(const InvoicePayload(invoiceId: 'inv_42')),
+  final result = await PublishInvoiceTask.definition.enqueueAndWait(
+    app,
+    const InvoicePayload(invoiceId: 'inv_42'),
   );
-  final result = await stem.waitForTask(taskId);
   if (result?.isSucceeded == true) {
     print('Invoice published');
   }
 
-  await backend.close();
-  await broker.close();
+  await app.close();
+  await client.close();
 }
 // #endregion tasks-typed-definition
 
 // #region tasks-context-enqueue
-Future<void> enqueueFromContext(TaskContext context) async {
+Future<void> enqueueFromContext(TaskExecutionContext context) async {
   await context.enqueue(
     'tasks.child',
     args: {'id': '123'},
@@ -127,27 +131,20 @@ final childDefinition = TaskDefinition(
 );
 
 // #region tasks-invocation-builder
-Future<void> enqueueWithBuilder(TaskInvocationContext invocation) async {
-  final call = invocation
-      .enqueueBuilder(
-        definition: childDefinition,
-        args: const ChildArgs('value'),
-      )
-      .queue('critical')
-      .priority(9)
-      .delay(const Duration(seconds: 5))
-      .enqueueOptions(
-        const TaskEnqueueOptions(
-          retry: true,
-          retryPolicy: TaskRetryPolicy(
-            backoff: true,
-            defaultDelay: Duration(seconds: 1),
-          ),
-        ),
-      )
-      .build();
-
-  await invocation.enqueueCall(call);
+Future<void> enqueueWithBuilder(TaskExecutionContext context) async {
+  final call = childDefinition.buildCall(
+    const ChildArgs('value'),
+    options: const TaskOptions(queue: 'critical', priority: 9),
+    notBefore: DateTime.now().add(const Duration(seconds: 5)),
+    enqueueOptions: const TaskEnqueueOptions(
+      retry: true,
+      retryPolicy: TaskRetryPolicy(
+        backoff: true,
+        defaultDelay: Duration(seconds: 1),
+      ),
+    ),
+  );
+  await context.enqueueCall(call);
 }
 
 // #endregion tasks-invocation-builder
@@ -173,13 +170,15 @@ class Base64PayloadEncoder extends TaskPayloadEncoder {
 }
 
 Future<void> configureEncoders() async {
-  final app = await StemApp.inMemory(
+  final client = await StemClient.inMemory(
     tasks: [EmailTask()],
     argsEncoder: const Base64PayloadEncoder(),
     resultEncoder: const Base64PayloadEncoder(),
     additionalEncoders: const [MyOtherEncoder()],
   );
+  final app = await client.createApp();
 
   await app.close();
+  await client.close();
 }
 // #endregion tasks-encoders-global
@@ -222,18 +221,19 @@ class MyOtherEncoder extends TaskPayloadEncoder {
 }
 
 Future<void> main() async {
-  final app = await StemApp.inMemory(tasks: [EmailTask()]);
-  await app.start();
+  final client = await StemClient.inMemory(tasks: [EmailTask()]);
+  final app = await client.createApp();
 
-  final taskId = await app.stem.enqueue(
+  final taskId = await app.enqueue(
     'email.send',
     args: {'to': 'demo@example.com'},
   );
-  final result = await app.stem.waitForTask(
+  final result = await app.waitForTask(
     taskId,
     timeout: const Duration(seconds: 5),
   );
 
   print('Email task state: ${result?.status.state}');
   await app.close();
+  await client.close();
 }
diff --git a/packages/stem/example/docs_snippets/lib/troubleshooting.dart b/packages/stem/example/docs_snippets/lib/troubleshooting.dart
index 01ac55c7..3e775ecc 100644
--- a/packages/stem/example/docs_snippets/lib/troubleshooting.dart
+++ b/packages/stem/example/docs_snippets/lib/troubleshooting.dart
@@ -32,11 +32,10 @@ Future<void> runTroubleshootingDemo() async {
       concurrency: 1,
     ),
   );
-  unawaited(app.start());
   // #endregion troubleshooting-bootstrap
 
   // #region troubleshooting-enqueue
-  final taskId = await app.stem.enqueue(
+  final taskId = await app.enqueue(
     'debug.echo',
     args: {'message': 'troubleshooting'},
   );
@@ -44,7 +43,7 @@ Future<void> runTroubleshootingDemo() async {
 
   // #region troubleshooting-results
   await Future.delayed(const Duration(milliseconds: 200));
-  final result = await app.backend.get(taskId);
+  final result = await app.getTaskStatus(taskId);
   print('Result: ${result?.payload}');
   // #endregion troubleshooting-results
diff --git a/packages/stem/example/docs_snippets/lib/uniqueness.dart b/packages/stem/example/docs_snippets/lib/uniqueness.dart
index a0a18bec..36473579 100644
--- a/packages/stem/example/docs_snippets/lib/uniqueness.dart
+++ b/packages/stem/example/docs_snippets/lib/uniqueness.dart
@@ -59,39 +59,29 @@ Future<void> buildRedisCoordinator() async {
 // #endregion uniqueness-coordinator-redis
 
 // #region uniqueness-enqueue
-Future<void> enqueueDigest(Stem stem) async {
-  final firstId = await stem.enqueue(
+Future<String> enqueueDigest(TaskEnqueuer enqueuer) async {
+  final firstId = await enqueuer.enqueue(
     'email.sendDigest',
     args: const {'userId': 42},
-    options: const TaskOptions(
-      queue: 'email',
-      unique: true,
-      uniqueFor: Duration(minutes: 15),
-    ),
   );
-  final secondId = await stem.enqueue(
+  final secondId = await enqueuer.enqueue(
     'email.sendDigest',
     args: const {'userId': 42},
-    options: const TaskOptions(
-      queue: 'email',
-      unique: true,
-      uniqueFor: Duration(minutes: 15),
-    ),
   );
 
   print('first enqueue id: $firstId');
   print('second enqueue id: $secondId (dup is re-used)');
+  return firstId;
 }
 // #endregion uniqueness-enqueue
 
 // #region uniqueness-override-key
-Future<void> enqueueWithOverride(Stem stem) async {
-  await stem.enqueue(
-    'orders.sync',
-    args: const {'id': 42},
-    options: const TaskOptions(unique: true, uniqueFor: Duration(minutes: 10)),
-    meta: const {UniqueTaskMetadata.override: 'order-42'},
+Future<String> enqueueWithOverride(TaskEnqueuer enqueuer) async {
+  return enqueuer.enqueue(
+    'email.sendDigest',
+    args: const {'userId': 42},
+    meta: const {UniqueTaskMetadata.override: 'digest-override-42'},
   );
 }
 // #endregion uniqueness-override-key
@@ -110,12 +100,16 @@ Future<void> main() async {
   );
   // #endregion uniqueness-stem-worker
 
-  unawaited(app.start());
-
-  await enqueueDigest(app.stem);
-  await enqueueWithOverride(app.stem);
-
-  await Future.delayed(const Duration(milliseconds: 500));
+  final digestTaskId = await enqueueDigest(app);
+  await app.waitForTask(
+    digestTaskId,
+    timeout: const Duration(seconds: 1),
+  );
+  final overrideTaskId = await enqueueWithOverride(app);
+  await app.waitForTask(
+    overrideTaskId,
+    timeout: const Duration(seconds: 1),
+  );
 
   await app.close();
 }
diff --git a/packages/stem/example/docs_snippets/lib/workers_programmatic.dart b/packages/stem/example/docs_snippets/lib/workers_programmatic.dart
index d47dc4d8..8123dcc3 100644
--- a/packages/stem/example/docs_snippets/lib/workers_programmatic.dart
+++ b/packages/stem/example/docs_snippets/lib/workers_programmatic.dart
@@ -1,4 +1,5 @@
 // Programmatic worker and producer examples for documentation.
+// These snippets intentionally cover the lower-level embedding surface.
 // ignore_for_file: unused_local_variable, unused_import, dead_code, avoid_print
 
 import 'dart:async';
@@ -9,33 +10,27 @@ import 'package:stem_redis/stem_redis.dart';
 
 // #region workers-producer-minimal
 Future<void> minimalProducer() async {
-  final tasks = [
-    FunctionTaskHandler(
-      name: 'email.send',
-      entrypoint: (context, args) async {
-        final to = args['to'] as String? ?? 'friend';
-        print('Queued email to $to');
-        return null;
-      },
-    ),
-  ];
-
-  final broker = InMemoryBroker();
-  final backend = InMemoryResultBackend();
-  final stem = Stem(
-    broker: broker,
-    backend: backend,
-    tasks: tasks,
+  final app = await StemApp.inMemory(
+    tasks: [
+      FunctionTaskHandler(
+        name: 'email.send',
+        entrypoint: (context, args) async {
+          final to = args['to'] as String? ?? 'friend';
+          print('Queued email to $to');
+          return null;
+        },
+      ),
+    ],
   );
 
-  final taskId = await stem.enqueue(
+  final taskId = await app.enqueue(
     'email.send',
     args: {'to': 'hello@example.com', 'subject': 'Welcome'},
   );
   print('Enqueued $taskId');
 
-  await backend.close();
-  await broker.close();
+  await app.waitForTask(taskId);
+  await app.close();
 }
 // #endregion workers-producer-minimal
@@ -43,32 +38,28 @@ Future<void> redisProducer() async {
   final brokerUrl =
       Platform.environment['STEM_BROKER_URL'] ?? 'redis://localhost:6379';
-  final broker = await RedisStreamsBroker.connect(brokerUrl);
-  final backend = await RedisResultBackend.connect('$brokerUrl/1');
-  final tasks = [
-    FunctionTaskHandler(
-      name: 'report.generate',
-      entrypoint: (context, args) async {
-        final id = args['reportId'] as String? ?? 'unknown';
-        print('Queued report $id');
-        return null;
-      },
-    ),
-  ];
-
-  final stem = Stem(
-    broker: broker,
-    backend: backend,
-    tasks: tasks,
+  final client = await StemClient.fromUrl(
+    brokerUrl,
+    adapters: const [StemRedisAdapter()],
+    overrides: StemStoreOverrides(backend: '$brokerUrl/1'),
+    tasks: [
+      FunctionTaskHandler(
+        name: 'report.generate',
+        entrypoint: (context, args) async {
+          final id = args['reportId'] as String? ?? 'unknown';
+          print('Queued report $id');
+          return null;
+        },
+      ),
+    ],
   );
 
-  await stem.enqueue(
+  await client.enqueue(
     'report.generate',
     args: {'reportId': 'monthly-2025-10'},
     options: const TaskOptions(queue: 'reports'),
   );
 
-  await backend.close();
-  await broker.close();
+  await client.close();
 }
 // #endregion workers-producer-redis
@@ -76,35 +67,28 @@ Future<void> signedProducer() async {
   final config = StemConfig.fromEnvironment();
   final signer = PayloadSigner.maybe(config.signing);
-  final tasks = [
-    FunctionTaskHandler(
-      name: 'billing.charge',
-      entrypoint: (context, args) async {
-        final customerId = args['customerId'] as String? ?? 'unknown';
-        print('Queued charge for $customerId');
-        return null;
-      },
-    ),
-  ];
-
-  final broker = await RedisStreamsBroker.connect(
+  final client = await StemClient.fromUrl(
     config.brokerUrl,
-    tls: config.tls,
-  );
-  final backend = InMemoryResultBackend();
-  final stem = Stem(
-    broker: broker,
-    backend: backend,
-    tasks: tasks,
+    adapters: const [StemRedisAdapter()],
+    overrides: const StemStoreOverrides(backend: 'memory://'),
+    tasks: [
+      FunctionTaskHandler(
+        name: 'billing.charge',
+        entrypoint: (context, args) async {
+          final customerId = args['customerId'] as String? ?? 'unknown';
+          print('Queued charge for $customerId');
+          return null;
+        },
+      ),
+    ],
     signer: signer,
   );
 
-  await stem.enqueue(
+  await client.enqueue(
     'billing.charge',
     args: {'customerId': 'cust_123', 'amount': 4200},
   );
 
-  await backend.close();
-  await broker.close();
+  await client.close();
 }
 // #endregion workers-producer-signed
diff --git a/packages/stem/example/docs_snippets/lib/workflows.dart b/packages/stem/example/docs_snippets/lib/workflows.dart
index 224d68a5..fd0f104d 100644
--- a/packages/stem/example/docs_snippets/lib/workflows.dart
+++ b/packages/stem/example/docs_snippets/lib/workflows.dart
@@ -5,16 +5,54 @@ import 'package:stem/stem.dart';
 import 'package:stem_postgres/stem_postgres.dart';
 import 'package:stem_redis/stem_redis.dart';
 
+class ApprovalDraft {
+  const ApprovalDraft({required this.documentId});
+
+  final String documentId;
+
+  Map<String, Object?> toJson() => {'documentId': documentId};
+
+  factory ApprovalDraft.fromJson(Map<String, Object?> json) {
+    return ApprovalDraft(documentId: json['documentId'] as String);
+  }
+}
+
+class ApprovalDecision {
+  const ApprovalDecision({required this.approvedBy});
+
+  final String approvedBy;
+
+  Map<String, Object?> toJson() => {'approvedBy': approvedBy};
+
+  factory ApprovalDecision.fromJson(Map<String, Object?> json) {
+    return ApprovalDecision(approvedBy: json['approvedBy'] as String);
+  }
+}
+
+class ChargePrepared {
+  const ChargePrepared({required this.chargeId});
+
+  final String chargeId;
+
+  Map<String, Object?> toJson() => {'chargeId': chargeId};
+
+  factory ChargePrepared.fromJson(Map<String, Object?> json) {
+    return ChargePrepared(chargeId: json['chargeId'] as String);
+  }
+}
+
 // #region workflows-runtime
-Future<void> bootstrapWorkflowRuntime() async {
+Future<void> bootstrapWorkflowApp() async {
   // #region workflows-app-create
-  final workflowApp = await StemWorkflowApp.fromUrl(
+  final client = await StemClient.fromUrl(
     'redis://127.0.0.1:56379',
     adapters: const [StemRedisAdapter(), StemPostgresAdapter()],
     overrides: const StemStoreOverrides(
       backend: 'redis://127.0.0.1:56379/1',
       workflow: 'postgresql://:@127.0.0.1:65432/stem',
     ),
+  );
+  final workflowApp = await client.createWorkflowApp(
     flows: [ApprovalsFlow.flow],
     scripts: [retryScript],
     eventBusFactory: WorkflowEventBusFactory.inMemory(),
@@ -30,9 +68,8 @@
 
 // #region workflows-client
 Future<void> bootstrapWorkflowClient() async {
-  final client = await StemClient.fromUrl('memory://');
-  final app = await client.createWorkflowApp(module: stemModule);
-  await app.start();
+  final client = await StemClient.fromUrl('memory://', module: stemModule);
+  final app = await client.createWorkflowApp();
   await app.close();
   await client.close();
 }
@@ -44,29 +81,40 @@ class ApprovalsFlow {
     name: 'approvals.flow',
     build: (flow) {
       flow.step('draft', (ctx) async {
-        final payload = ctx.params['draft'] as Map<String, Object?>;
-        return payload['documentId'];
+        final draft = ctx.requiredParamJson(
+          'draft',
+          decode: ApprovalDraft.fromJson,
+        );
+        return draft.documentId;
       });
 
       flow.step('manager-review', (ctx) async {
-        final resume = ctx.takeResumeValue<Map<String, Object?>>();
+        final resume = ctx.waitForEventValueJson(
+          'approvals.manager',
+          decode: ApprovalDecision.fromJson,
+        );
         if (resume == null) {
-          await ctx.awaitEvent('approvals.manager');
           return null;
         }
-        return resume['approvedBy'] as String?;
+        return resume.approvedBy;
       });
 
       flow.step('finalize', (ctx) async {
-        final approvedBy = ctx.previousResult as String?;
+        final approvedBy = ctx.previousValue<String>();
         return 'approved-by:$approvedBy';
       });
     },
   );
+
+  static final ref = flow.refJson();
 }
 
 Future<void> registerFlow(StemWorkflowApp workflowApp) async {
-  workflowApp.runtime.registerWorkflow(ApprovalsFlow.flow.definition);
+  workflowApp.registerFlows([ApprovalsFlow.flow]);
+}
+
+Future<void> registerWorkflowDefinition(StemWorkflowApp workflowApp) async {
+  workflowApp.registerWorkflows([ApprovalsFlow.flow.definition]);
 }
 // #endregion workflows-flow
@@ -75,12 +123,14 @@ final retryScript = WorkflowScript(
   name: 'billing.retry-script',
   run: (script) async {
     final chargeId = await script.step('charge', (ctx) async {
-      final resume = ctx.takeResumeValue<Map<String, Object?>>();
+      final resume = ctx.waitForEventValueJson(
+        'billing.charge.prepared',
+        decode: ChargePrepared.fromJson,
+      );
       if (resume == null) {
-        await ctx.awaitEvent('billing.charge.prepared');
         return 'pending';
       }
-      return resume['chargeId'] as String;
+      return resume.chargeId;
     });
 
     final receipt = await script.step('confirm', (ctx) async {
@@ -93,28 +143,31 @@ final retryScript = WorkflowScript(
 );
 
 final retryDefinition = retryScript.definition;
+
+Future<void> registerScript(StemWorkflowApp workflowApp) async {
+  workflowApp.registerScripts([retryScript]);
+}
 // #endregion workflows-script
 
 // #region workflows-run
 Future<void> runWorkflow(StemWorkflowApp workflowApp) async {
-  final runId = await workflowApp.startWorkflow(
-    'approvals.flow',
-    params: {
-      'draft': {'documentId': 'doc-42'},
-    },
+  final runId = await ApprovalsFlow.ref.start(
+    workflowApp,
+    params: const ApprovalDraft(documentId: 'doc-42'),
     cancellationPolicy: const WorkflowCancellationPolicy(
       maxRunDuration: Duration(hours: 2),
       maxSuspendDuration: Duration(minutes: 30),
     ),
   );
 
-  final result = await workflowApp.waitForCompletion(
+  final result = await ApprovalsFlow.ref.waitFor(
+    workflowApp,
     runId,
     timeout: const Duration(minutes: 5),
   );
 
   if (result?.isCompleted == true) {
-    print('Workflow finished with ${result!.value}');
+    print('Workflow finished with ${result!.requiredValue()}');
   } else {
     print('Workflow state: ${result?.status}');
   }
@@ -128,14 +181,14 @@ final encoders = TaskPayloadEncoderRegistry(
 );
 
 Future<void> configureWorkflowEncoders() async {
-  final app = await StemWorkflowApp.fromUrl(
-    'memory://',
-    flows: [ApprovalsFlow.flow],
+  final client = await StemClient.inMemory(
     encoderRegistry: encoders,
     additionalEncoders: const [GzipPayloadEncoder()],
   );
+  final app = await client.createWorkflowApp(flows: [ApprovalsFlow.flow]);
 
   await app.close();
+  await client.close();
 }
 // #endregion workflows-encoders
@@ -143,39 +196,49 @@ Future<void> configureWorkflowEncoders() async {
 @WorkflowDefn(name: 'approvals.flow')
 class ApprovalsAnnotatedWorkflow {
   @WorkflowStep()
-  Future<String> draft(FlowContext ctx) async {
-    final payload = ctx.params['draft'] as Map<String, Object?>;
-    return payload['documentId'] as String;
+  Future<String> draft({FlowContext? context}) async {
+    final ctx = context!;
+    final draft = ctx.requiredParamJson(
+      'draft',
+      decode: ApprovalDraft.fromJson,
+    );
+    return draft.documentId;
   }
 
   @WorkflowStep(name: 'manager-review')
-  Future<String?> managerReview(FlowContext ctx) async {
-    final resume = ctx.takeResumeValue<Map<String, Object?>>();
+  Future<String?> managerReview({FlowContext? context}) async {
+    final ctx = context!;
+    final resume = ctx.waitForEventValueJson(
+      'approvals.manager',
+      decode: ApprovalDecision.fromJson,
+    );
     if (resume == null) {
-      await ctx.awaitEvent('approvals.manager');
       return null;
     }
-    return resume['approvedBy'] as String?;
+    return resume.approvedBy;
  }
 
   @WorkflowStep()
-  Future<String> finalize(FlowContext ctx) async {
-    final approvedBy = ctx.previousResult as String?;
+  Future<String> finalize({FlowContext? context}) async {
+    final ctx = context!;
+    final approvedBy = ctx.previousValue<String>();
     return 'approved-by:$approvedBy';
   }
 }
 
 @WorkflowDefn(name: 'billing.retry-script', kind: WorkflowKind.script)
 class BillingRetryAnnotatedWorkflow {
-  @WorkflowRun()
-  Future<String> run(WorkflowScriptContext script) async {
+  Future<String> run({WorkflowScriptContext? context}) async {
+    final script = context!;
     final chargeId = await script.step('charge', (ctx) async {
-      final resume = ctx.takeResumeValue<Map<String, Object?>>();
+      final resume = ctx.waitForEventValueJson(
+        'billing.charge.prepared',
+        decode: ChargePrepared.fromJson,
+      );
       if (resume == null) {
-        await ctx.awaitEvent('billing.charge.prepared');
        return 'pending';
       }
-      return resume['chargeId'] as String;
+      return resume.chargeId;
     });
 
     return script.step('confirm', (ctx) async {
@@ -190,18 +253,17 @@ class BillingRetryAnnotatedWorkflow {
   options: TaskOptions(maxRetries: 5),
 )
 Future<void> sendEmail(
-  TaskInvocationContext ctx,
-  Map<String, Object?> args,
-) async {
+  Map<String, Object?> args, {
+  TaskInvocationContext? context,
+}) async {
+  final ctx = context!;
+  ctx.heartbeat();
   // send email
 }
 
 Future<void> registerAnnotatedDefinitions(StemWorkflowApp app) async {
   // Generated by stem_builder.
-  stemModule.registerInto(
-    workflows: app.runtime.registry,
-    tasks: app.app.registry,
-  );
+  app.registerModule(stemModule);
 }
 // #endregion workflows-annotated
@@ -236,15 +298,17 @@ Future<void> main() async {
     },
   );
 
-  final app = await StemWorkflowApp.inMemory(flows: [demoFlow]);
-  await app.start();
+  final client = await StemClient.inMemory();
+  final app = await client.createWorkflowApp(flows: [demoFlow]);
 
-  final runId = await app.startWorkflow('demo.flow');
-  final result = await app.waitForCompletion(
+  final runId = await demoFlow.start(app);
+  final result = await demoFlow.waitFor(
+    app,
     runId,
     timeout: const Duration(seconds: 5),
   );
 
   print('Workflow result: ${result?.status} value=${result?.value}');
   await app.close();
+  await client.close();
 }
diff --git a/packages/stem/example/durable_watchers.dart b/packages/stem/example/durable_watchers.dart
index 76114fe7..5098aab5 100644
--- a/packages/stem/example/durable_watchers.dart
+++ b/packages/stem/example/durable_watchers.dart
@@ -1,72 +1,72 @@
 import 'package:stem/stem.dart';
 
-final shipmentReadyEventCodec = PayloadCodec<_ShipmentReadyEvent>(
-  encode: (value) => value.toJson(),
+final shipmentReadyEvent = WorkflowEventRef<_ShipmentReadyEvent>.json(
+  topic: 'shipment.ready',
   decode: _ShipmentReadyEvent.fromJson,
+  typeName: '_ShipmentReadyEvent',
 );
 
 /// Runs a workflow that suspends on `awaitEvent` and resumes once a payload is
 /// emitted. The example also inspects watcher metadata before the resume.
 Future<void> main() async {
-  final app = await StemWorkflowApp.inMemory(
-    scripts: [
-      WorkflowScript(
-        name: 'shipment.workflow',
-        run: (script) async {
-          await script.step('prepare', (step) async {
-            final orderId = step.params['orderId'];
-            return 'prepared-$orderId';
-          });
+  final shipmentWorkflow = WorkflowScript(
+    name: 'shipment.workflow',
+    run: (script) async {
+      await script.step('prepare', (step) async {
+        final orderId = step.params.requiredValue('orderId');
+        return 'prepared-$orderId';
+      });
 
-          final trackingId = await script.step('wait-for-shipment', (
-            step,
-          ) async {
-            final payload = step.takeResumeValue<_ShipmentReadyEvent>(
-              codec: shipmentReadyEventCodec,
-            );
-            if (payload == null) {
-              await step.awaitEvent(
-                'shipment.ready',
-                deadline: DateTime.now().add(const Duration(minutes: 5)),
-                data: const {'reason': 'waiting-for-carrier'},
-              );
-              return 'waiting';
-            }
-            return payload.trackingId;
-          });
+      final trackingId = await script.step('wait-for-shipment', (step) async {
+        final payload = await shipmentReadyEvent.wait(
+          step,
+          deadline: DateTime.now().add(const Duration(minutes: 5)),
+          data: const {'reason': 'waiting-for-carrier'},
+        );
+        return payload.trackingId;
+      });
 
-          return trackingId;
-        },
-      ),
-    ],
+      return trackingId;
+    },
+  );
+  final shipmentWorkflowRef = shipmentWorkflow.ref<Map<String, Object?>>(
+    encodeParams: (params) => params,
+  );
+  final app = await StemWorkflowApp.inMemory(
+    scripts: [shipmentWorkflow],
   );
 
-  final runId = await app.startWorkflow(
-    'shipment.workflow',
+  final runId = await shipmentWorkflowRef.start(
+    app,
     params: const {'orderId': 'A-123'},
   );
 
   // Drive the run until it suspends on the watcher.
-  await app.runtime.executeRun(runId);
+  await app.executeRun(runId);
 
-  final watchers = await app.store.listWatchers('shipment.ready');
+  final watchers = await app.listWatchers(shipmentReadyEvent.topic);
   for (final watcher in watchers) {
     print(
       'Run ${watcher.runId} waiting on ${watcher.topic} (step ${watcher.stepName})',
     );
+    final payload = watcher.payloadJson<_ShipmentReadyEvent>(
+      decode: _ShipmentReadyEvent.fromJson,
+    );
     print('Watcher metadata: ${watcher.data}');
+    if (payload != null) {
+      print('Watcher payload DTO: ${payload.trackingId}');
+    }
   }
 
-  await app.emitValue(
-    'shipment.ready',
+  await shipmentReadyEvent.emit(
+    app,
     const _ShipmentReadyEvent(trackingId: 'ZX-42'),
-    codec: shipmentReadyEventCodec,
   );
 
-  await app.runtime.executeRun(runId);
+  await app.executeRun(runId);
 
-  final completed = await app.store.get(runId);
-  print('Workflow completed with result: ${completed?.result}');
+  final completed = await shipmentWorkflowRef.waitFor(app, runId);
+  print('Workflow completed with result: ${completed?.value}');
 
   await app.close();
 }
@@ -74,12 +74,11 @@ Future<void> main() async {
 class _ShipmentReadyEvent {
   const _ShipmentReadyEvent({required this.trackingId});
 
-  final String trackingId;
-
-  Map<String, Object?> toJson() => {'trackingId': trackingId};
-
-  static _ShipmentReadyEvent fromJson(Object? payload) {
-    final json = payload! as Map<String, Object?>;
+  factory _ShipmentReadyEvent.fromJson(Map<String, Object?> json) {
     return _ShipmentReadyEvent(trackingId: json['trackingId'] as String);
   }
+
+  final String trackingId;
+
+  Map<String, Object?> toJson() => {'trackingId': trackingId};
 }
diff --git a/packages/stem/example/ecommerce/README.md b/packages/stem/example/ecommerce/README.md
index 17432061..af8027a7 100644
--- a/packages/stem/example/ecommerce/README.md
+++ b/packages/stem/example/ecommerce/README.md
@@ -35,8 +35,14 @@ From those annotations, this example uses generated APIs:
 
 - `stemModule` (generated workflow/task bundle)
 - `StemWorkflowDefinitions.addToCart`
+- `StemWorkflowDefinitions.addToCart.startAndWait(...)`
 - `StemTaskDefinitions.ecommerceAuditLog`
-- `TaskEnqueuer.enqueueEcommerceAuditLog(...)`
+- direct task definition helpers like
+  `StemTaskDefinitions.ecommerceAuditLog.enqueue(...)`
+
+The manual checkout flow also derives a typed ref from its `Flow` definition:
+
+- `checkoutWorkflowRef(checkoutFlow)`
 
 The server wires generated and manual tasks together in one place:
 
@@ -50,6 +56,12 @@ final workflowApp = await StemWorkflowApp.fromUrl(
 );
 ```
 
+That bootstrap path auto-subscribes the worker to the workflow queue plus the
+default queues declared on the bundled module tasks and
+`shipmentReserveTaskHandler`.
+You only need an explicit `workerConfig.subscription` if you route work to
+additional queues beyond those task defaults.
+
 This is why the run command always includes:
 
 ```bash
diff --git a/packages/stem/example/ecommerce/lib/src/app.dart b/packages/stem/example/ecommerce/lib/src/app.dart
index 4343953a..3d501be8 100644
--- a/packages/stem/example/ecommerce/lib/src/app.dart
+++ b/packages/stem/example/ecommerce/lib/src/app.dart
@@ -32,20 +32,19 @@ class EcommerceServer {
     );
     final repository = await EcommerceRepository.open(commerceDatabasePath);
     bindAddToCartWorkflowRepository(repository);
+    final checkoutFlow = buildCheckoutFlow(repository);
+    final checkoutWorkflow = checkoutWorkflowRef(checkoutFlow);
 
     final workflowApp = await StemWorkflowApp.fromUrl(
       'sqlite://$stemDatabasePath',
       adapters: const [StemSqliteAdapter()],
       module: stemModule,
-      flows: [buildCheckoutFlow(repository)],
+      flows: [checkoutFlow],
       tasks: [shipmentReserveTaskHandler],
       workerConfig: StemWorkerConfig(
         queue: 'workflow',
         consumerName: 'ecommerce-worker',
         concurrency: 2,
-        subscription: RoutingSubscription(
-          queues: const ['workflow', 'default'],
-        ),
       ),
     );
 
@@ -59,7 +58,7 @@ class EcommerceServer {
         'stemDatabasePath': stemDatabasePath,
         'workflows': [
          StemWorkflowDefinitions.addToCart.name,
-          checkoutWorkflowName,
+          checkoutWorkflow.name,
         ],
       });
     })
@@ -94,26 +93,22 @@ class EcommerceServer {
        final sku = payload['sku']?.toString() ?? '';
        final quantity = _toInt(payload['quantity']);
 
-        final runId = await StemWorkflowDefinitions.addToCart
-            .call((cartId: cartId, sku: sku, quantity: quantity))
-            .startWithApp(workflowApp);
-
-        final result = await StemWorkflowDefinitions.addToCart.waitFor(
+        final result = await StemWorkflowDefinitions.addToCart.startAndWait(
           workflowApp,
-          runId,
+          params: (cartId: cartId, sku: sku, quantity: quantity),
           timeout: const Duration(seconds: 4),
         );
 
        if (result == null) {
          return _error(500, 'Add-to-cart workflow run not found.', {
-            'runId': runId,
+            'runId': null,
          });
        }
 
        if (result.status != WorkflowStatus.completed || result.value == null) {
          return _error(422, 'Add-to-cart workflow did not complete.', {
-            'runId': runId,
+            'runId': result.runId,
            'status': result.status.name,
            'lastError': result.state.lastError,
          });
@@ -128,24 +123,23 @@ class EcommerceServer {
          unitPriceCents: _toInt(computed['unitPriceCents']),
        );
 
-        return _json(200, {'runId': runId, 'cart': updatedCart});
+        return _json(200, {'runId': result.runId, 'cart': updatedCart});
      } on Object catch (error) {
        return _error(400, 'Failed to add item to cart.', error);
      }
    })
    ..post('/checkout/<cartId>', (Request request, String cartId) async {
      try {
-        final runId = await workflowApp.startWorkflow(
-          checkoutWorkflowName,
-          params: {'cartId': cartId},
+        final runId = await checkoutWorkflow.start(
+          workflowApp,
+          params: cartId,
        );
 
-        final result = await workflowApp
-            .waitForCompletion<Map<String, Object?>>(
-              runId,
-              timeout: const Duration(seconds: 6),
-              decode: _toMap,
-            );
+        final result = await checkoutWorkflow.waitFor(
+          workflowApp,
+          runId,
+          timeout: const Duration(seconds: 6),
+        );
 
        if (result == null) {
          return _error(500, 'Checkout workflow run not found.', {
@@ -175,7 +169,7 @@ class EcommerceServer {
        return _json(200, {'order': order});
      })
      ..get('/runs/<runId>', (Request request, String runId) async {
-        final detail = await workflowApp.runtime.viewRunDetail(runId);
+        final detail = await workflowApp.viewRunDetail(runId);
        if (detail == null) {
          return _error(404, 'Workflow run not found.', {'runId': runId});
        }
@@ -242,16 +236,6 @@ Response _error(int status, String message, Object? error) {
   return _json(status, {'error': message, 'details': _normalizeError(error)});
 }
 
-Map<String, Object?> _toMap(Object? value) {
-  if (value is Map<String, Object?>) {
-    return value;
-  }
-  if (value is Map) {
-    return value.cast<String, Object?>();
-  }
-  return <String, Object?>{};
-}
-
 Object? _normalizeError(Object? error) {
   if (error == null) {
     return null;
diff --git a/packages/stem/example/ecommerce/lib/src/workflows/annotated_defs.stem.g.dart b/packages/stem/example/ecommerce/lib/src/workflows/annotated_defs.stem.g.dart
index 65d4e4fa..e8ed64bf 100644
--- a/packages/stem/example/ecommerce/lib/src/workflows/annotated_defs.stem.g.dart
+++ b/packages/stem/example/ecommerce/lib/src/workflows/annotated_defs.stem.g.dart
@@ -37,15 +37,13 @@ final List<WorkflowScript> _stemScripts = [
   WorkflowScript(
     name: "ecommerce.cart.add_item",
     checkpoints: [
-      FlowStep(
+      WorkflowCheckpoint(
         name: "validate-input",
-        handler: _stemScriptManifestStepNoop,
         kind: WorkflowStepKind.task,
         taskNames: [],
       ),
-      FlowStep(
+      WorkflowCheckpoint(
         name: "price-line-item",
-        handler: _stemScriptManifestStepNoop,
         kind: WorkflowStepKind.task,
         taskNames: [],
       ),
@@ -78,8 +76,6 @@ abstract final class StemWorkflowDefinitions {
   );
 }
 
-Future<Object?> _stemScriptManifestStepNoop(FlowContext context) async => null;
-
 Object? _stemRequireArg(Map<String, Object?> args, String name) {
   if (!args.containsKey(name)) {
     throw ArgumentError('Missing required argument "$name".');
@@ -122,43 +118,6 @@ abstract final class StemTaskDefinitions {
   );
 }
 
-extension StemGeneratedTaskEnqueuer on TaskEnqueuer {
-  Future<String> enqueueEcommerceAuditLog({
-    required String event,
-    required String entityId,
-    required String detail,
-    Map<String, String> headers = const {},
-    TaskOptions? options,
-    DateTime? notBefore,
-    Map<String, Object?>? meta,
-    TaskEnqueueOptions? enqueueOptions,
-  }) {
-    return enqueueCall(
-      StemTaskDefinitions.ecommerceAuditLog.call(
-        (event: event, entityId: entityId, detail: detail),
-        headers: headers,
-        options: options,
-        notBefore: notBefore,
-        meta: meta,
-        enqueueOptions: enqueueOptions,
-      ),
-    );
-  }
-}
-
-extension StemGeneratedTaskResults on Stem {
-  Future<TaskResult<Map<String, Object?>>?> waitForEcommerceAuditLog(
-    String taskId, {
-    Duration? timeout,
-  }) {
-    return waitForTaskDefinition(
-      taskId,
-      StemTaskDefinitions.ecommerceAuditLog,
-      timeout: timeout,
-    );
-  }
-}
-
 final List<TaskHandler<Object?>> _stemTasks = <TaskHandler<Object?>>[
   FunctionTaskHandler(
     name: "ecommerce.audit.log",
diff --git a/packages/stem/example/ecommerce/lib/src/workflows/checkout_flow.dart b/packages/stem/example/ecommerce/lib/src/workflows/checkout_flow.dart
index 1c342b4f..cee8c225 100644
--- a/packages/stem/example/ecommerce/lib/src/workflows/checkout_flow.dart
+++ b/packages/stem/example/ecommerce/lib/src/workflows/checkout_flow.dart
@@ -4,6 +4,14 @@ import '../domain/repository.dart';
 
 const checkoutWorkflowName = 'ecommerce.checkout';
 
+WorkflowRef<String, Map<String, Object?>> checkoutWorkflowRef(
+  Flow<Map<String, Object?>> flow,
+) {
+  return flow.ref(
+    encodeParams: (cartId) => {'cartId': cartId},
+  );
+}
+
 Flow<Map<String, Object?>> buildCheckoutFlow(EcommerceRepository repository) {
   return Flow<Map<String, Object?>>(
     name: checkoutWorkflowName,
@@ -11,10 +19,7 @@ Flow<Map<String, Object?>> buildCheckoutFlow(EcommerceRepository repository) {
     metadata: const {'domain': 'commerce', 'surface': 'checkout'},
     build: (flow) {
       flow.step('load-cart', (ctx) async {
-        final cartId = ctx.params['cartId']?.toString() ?? '';
-        if (cartId.isEmpty) {
-          throw ArgumentError('Missing required cartId parameter.');
-        }
+        final cartId = ctx.requiredParam('cartId');
 
         final cart = await repository.getCart(cartId);
         if (cart == null) {
@@ -24,25 +29,24 @@ Flow<Map<String, Object?>> buildCheckoutFlow(EcommerceRepository repository) {
       });
 
       flow.step('capture-payment', (ctx) async {
-        final resume = ctx.takeResumeValue<Map<String, Object?>>();
-        if (resume == null) {
-          ctx.sleep(
-            const Duration(milliseconds: 100),
-            data: {
-              'phase': 'payment-authorization',
-              'cartId': ctx.params['cartId'],
-            },
-          );
+        if (!ctx.sleepUntilResumed(
+          const Duration(milliseconds: 100),
+          data: {
+            'phase': 'payment-authorization',
+            'cartId': ctx.requiredParam('cartId'),
+          },
+        )) {
           return null;
         }
 
-        final cartId = ctx.params['cartId']?.toString() ?? 'unknown-cart';
+        final cartId = ctx.requiredParam('cartId');
         return {'paymentReference': 'pay-$cartId'};
       });
 
       flow.step('create-order', (ctx) async {
-        final cartId = ctx.params['cartId']?.toString() ?? '';
-        final paymentPayload = _mapFromDynamic(ctx.previousResult);
+        final cartId = ctx.requiredParam('cartId');
+        final paymentPayload = ctx
+            .requiredPreviousValue<Map<String, Object?>>();
         final paymentReference =
             paymentPayload['paymentReference']?.toString() ?? 'pay-$cartId';
@@ -54,54 +58,31 @@ Flow<Map<String, Object?>> buildCheckoutFlow(EcommerceRepository repository) {
       });
 
       flow.step('emit-side-effects', (ctx) async {
-        final order = _mapFromDynamic(ctx.previousResult);
-        if (order.isEmpty) {
-          throw StateError(
-            'create-order step did not return an order payload.',
-          );
-        }
+        final order = ctx.requiredPreviousValue<Map<String, Object?>>();
         final orderId = order['id']?.toString() ?? '';
         final cartId = order['cartId']?.toString() ?? '';
 
-        if (ctx.enqueuer != null) {
-          await ctx.enqueuer!.enqueue(
-            'ecommerce.audit.log',
-            args: {
-              'event': 'order.checked_out',
-              'entityId': orderId,
-              'detail': 'cart=$cartId',
-            },
-            options: const TaskOptions(queue: 'default'),
-            meta: {
-              'workflow': checkoutWorkflowName,
-              'step': 'emit-side-effects',
-            },
-          );
+        await ctx.enqueue(
+          'ecommerce.audit.log',
+          args: {
+            'event': 'order.checked_out',
+            'entityId': orderId,
+            'detail': 'cart=$cartId',
+          },
+          options: const TaskOptions(queue: 'default'),
+          meta: {'workflow': checkoutWorkflowName, 'step': 'emit-side-effects'},
+        );
 
-          await ctx.enqueuer!.enqueue(
-            'ecommerce.shipping.reserve',
-            args: {'orderId': orderId, 'carrier': 'acme-post'},
-            options: const TaskOptions(queue: 'default'),
-            meta: {
-              'workflow': checkoutWorkflowName,
-              'step': 'emit-side-effects',
-            },
-          );
-        }
+        await ctx.enqueue(
+          'ecommerce.shipping.reserve',
+          args: {'orderId': orderId, 'carrier': 'acme-post'},
+          options: const TaskOptions(queue: 'default'),
+          meta: {'workflow': checkoutWorkflowName, 'step': 'emit-side-effects'},
+        );
 
         return order;
       });
     },
   );
 }
-
-Map<String, Object?> _mapFromDynamic(Object? value) {
-  if (value is Map<String, Object?>) {
-    return value;
-  }
-  if (value is Map) {
-    return value.cast<String, Object?>();
-  }
-  return <String, Object?>{};
-}
diff --git a/packages/stem/example/email_service/bin/enqueuer.dart b/packages/stem/example/email_service/bin/enqueuer.dart
index a6169920..5fa20e83 100644
--- a/packages/stem/example/email_service/bin/enqueuer.dart
+++ b/packages/stem/example/email_service/bin/enqueuer.dart
@@ -10,19 +10,10 @@ import 'package:stem_redis/stem_redis.dart';
 
 Future<void> main(List<String> args) async {
   final config = StemConfig.fromEnvironment();
-  final broker = await RedisStreamsBroker.connect(
-    config.brokerUrl,
-    tls: config.tls,
-  );
-  final backend = config.resultBackendUrl != null
-      ? await RedisResultBackend.connect(
-          config.resultBackendUrl!,
-          tls: config.tls,
-        )
-      : null;
+  final backendUrl = config.resultBackendUrl;
   final signer = PayloadSigner.maybe(config.signing);
 
-  if (backend == null) {
+  if (backendUrl == null) {
     stderr.writeln(
       'STEM_RESULT_BACKEND_URL must be provided for the email service.',
     );
@@ -34,13 +25,14 @@ Future<void> main(List<String> args) async {
       name: 'email.send',
       entrypoint: _placeholderEntrypoint,
       options: const TaskOptions(queue: 'emails', maxRetries: 3),
-    ),
+    ),
   ];
 
-  final stem = Stem(
-    broker: broker,
+  final client = await StemClient.fromUrl(
+    config.brokerUrl,
+    adapters: [StemRedisAdapter(tls: config.tls)],
+    overrides: StemStoreOverrides(backend: backendUrl),
     tasks: tasks,
-    backend: backend,
     signer: signer,
   );
@@ -62,7 +54,7 @@ Future<void> main(List<String> args) async {
       }),
     );
   }
-  final taskId = await stem.enqueue(
+  final taskId = await client.enqueue(
     'email.send',
     args: {'to': to, 'subject': subject, 'body': emailBody},
     options: const TaskOptions(queue: 'emails'),
@@ -85,8 +77,7 @@ Future<void> main(List<String> args) async {
   Future<void> shutdown(ProcessSignal signal) async {
     stdout.writeln('Shutting down email enqueue service ($signal)...');
     await server.close(force: true);
-    await broker.close();
-    await backend.close();
+    await client.close();
     exit(0);
   }
diff --git a/packages/stem/example/encrypted_payload/docker/main.dart b/packages/stem/example/encrypted_payload/docker/main.dart
index a933fc25..4e5906dc 100644
--- a/packages/stem/example/encrypted_payload/docker/main.dart
+++ b/packages/stem/example/encrypted_payload/docker/main.dart
@@ -10,10 +10,13 @@ Future<void> main(List<String> args) async {
   final config = StemConfig.fromEnvironment();
   final secretKey = SecretKey(base64Decode(_requireEnv('PAYLOAD_SECRET')));
   final cipher = AesGcm.with256bits();
-
-  // Build Stem client
-  final broker = await RedisStreamsBroker.connect(config.brokerUrl);
-  final backend = await RedisResultBackend.connect(config.resultBackendUrl!);
+  final backendUrl = config.resultBackendUrl;
+  if (backendUrl == null) {
+    throw StateError(
+      'STEM_RESULT_BACKEND_URL must be set '
+      'for the encrypted container example.',
+    );
+  }
 
   final tasks = <TaskHandler<Object?>>[
     FunctionTaskHandler(
@@ -23,7 +26,17 @@ Future<void> main(List<String> args) async {
     ),
   ];
 
-  final stem = Stem(broker: broker, tasks: tasks, backend: backend);
+  final client = await StemClient.create(
+    broker: StemBrokerFactory(
+      create: () => RedisStreamsBroker.connect(config.brokerUrl),
+      dispose: (broker) => broker.close(),
+    ),
+    backend: StemBackendFactory(
+      create: () => RedisResultBackend.connect(backendUrl),
+      dispose: (backend) => backend.close(),
+    ),
+    tasks: tasks,
+  );
 
   final jobs = [
     {'customerId': 'cust-1001', 'amount': 1250.75},
@@ -43,7 +56,7 @@ Future<void> main(List<String> args) async {
       'mac': base64Encode(box.mac.bytes),
     };
 
-    final id = await stem.enqueue(
+    final id = await client.enqueue(
       'secure.report',
       args: payload,
       options: TaskOptions(queue: config.defaultQueue),
@@ -51,8 +64,7 @@ Future<void> main(List<String> args) async {
     stdout.writeln('Container task $id for ${job['customerId']}');
   }
 
-  await broker.close();
-  await backend.close();
+  await client.close();
 }
 
 FutureOr<Object?> _noopEntrypoint(
diff --git a/packages/stem/example/encrypted_payload/enqueuer/bin/enqueue.dart b/packages/stem/example/encrypted_payload/enqueuer/bin/enqueue.dart
index 62238454..5869b88b 100644
--- a/packages/stem/example/encrypted_payload/enqueuer/bin/enqueue.dart
+++ b/packages/stem/example/encrypted_payload/enqueuer/bin/enqueue.dart
@@ -10,19 +10,12 @@ Future<void> main(List<String> args) async {
   final config = StemConfig.fromEnvironment();
   final secretKey = SecretKey(base64Decode(_requireEnv('PAYLOAD_SECRET')));
   final cipher = AesGcm.with256bits();
-
-  final broker = await RedisStreamsBroker.connect(
-    config.brokerUrl,
-    tls: config.tls,
-  );
   final backendUrl = config.resultBackendUrl;
   if (backendUrl == null) {
     throw StateError(
       'STEM_RESULT_BACKEND_URL must be set for the encrypted example.',
     );
   }
-  final backend = await
RedisResultBackend.connect(backendUrl, tls: config.tls); - final tasks = >[ FunctionTaskHandler( name: 'secure.report', @@ -31,10 +24,11 @@ Future main(List args) async { ), ]; - final stem = Stem( - broker: broker, + final client = await StemClient.fromUrl( + config.brokerUrl, + adapters: [StemRedisAdapter(tls: config.tls)], + overrides: StemStoreOverrides(backend: backendUrl), tasks: tasks, - backend: backend, signer: PayloadSigner.maybe(config.signing), ); @@ -57,7 +51,7 @@ Future main(List args) async { 'mac': base64Encode(box.mac.bytes), }; - final taskId = await stem.enqueue( + final taskId = await client.enqueue( 'secure.report', args: payload, options: TaskOptions(queue: config.defaultQueue), @@ -65,8 +59,7 @@ Future main(List args) async { stdout.writeln('Enqueued secure task $taskId for ${job['customerId']}'); } - await broker.close(); - await backend.close(); + await client.close(); } FutureOr _noopEntrypoint( diff --git a/packages/stem/example/image_processor/bin/api.dart b/packages/stem/example/image_processor/bin/api.dart index c60b8df5..ab0afb08 100644 --- a/packages/stem/example/image_processor/bin/api.dart +++ b/packages/stem/example/image_processor/bin/api.dart @@ -10,19 +10,10 @@ import 'package:stem_redis/stem_redis.dart'; Future main(List args) async { final config = StemConfig.fromEnvironment(); - final broker = await RedisStreamsBroker.connect( - config.brokerUrl, - tls: config.tls, - ); - final backend = config.resultBackendUrl != null - ? 
await RedisResultBackend.connect( - config.resultBackendUrl!, - tls: config.tls, - ) - : null; + final backendUrl = config.resultBackendUrl; final signer = PayloadSigner.maybe(config.signing); - if (backend == null) { + if (backendUrl == null) { stderr.writeln( 'STEM_RESULT_BACKEND_URL must be provided for the image service.', ); @@ -37,10 +28,11 @@ Future main(List args) async { ), ]; - final stem = Stem( - broker: broker, + final client = await StemClient.fromUrl( + config.brokerUrl, + adapters: [StemRedisAdapter(tls: config.tls)], + overrides: StemStoreOverrides(backend: backendUrl), tasks: tasks, - backend: backend, signer: signer, ); @@ -53,7 +45,7 @@ Future main(List args) async { body: jsonEncode({'error': 'Missing "imageUrl" field'}), ); } - final taskId = await stem.enqueue( + final taskId = await client.enqueue( 'image.generate_thumbnail', args: {'imageUrl': imageUrl}, options: const TaskOptions(queue: 'images'), @@ -76,8 +68,7 @@ Future main(List args) async { Future shutdown(ProcessSignal signal) async { stdout.writeln('Shutting down image service ($signal)...'); await server.close(force: true); - await broker.close(); - await backend.close(); + await client.close(); exit(0); } diff --git a/packages/stem/example/microservice/enqueuer/bin/main.dart b/packages/stem/example/microservice/enqueuer/bin/main.dart index fb868f6d..5f3984df 100644 --- a/packages/stem/example/microservice/enqueuer/bin/main.dart +++ b/packages/stem/example/microservice/enqueuer/bin/main.dart @@ -88,20 +88,12 @@ Future main(List args) async { observability.applyMetricExporters(); observability.applySignalConfiguration(); - final broker = await RedisStreamsBroker.connect( - config.brokerUrl, - tls: config.tls, - ); final backendUrl = config.resultBackendUrl; if (backendUrl == null) { throw StateError( 'STEM_RESULT_BACKEND_URL must be configured for the microservice enqueuer.', ); } - final backend = await RedisResultBackend.connect( - backendUrl, - tls: config.tls, - ); // #region 
signing-producer-signer final signer = PayloadSigner.maybe(config.signing); // #endregion signing-producer-signer @@ -118,20 +110,17 @@ Future main(List args) async { .toList(growable: false); // #region signing-producer-stem - final stem = Stem( - broker: broker, + final client = await StemClient.fromUrl( + config.brokerUrl, + adapters: [StemRedisAdapter(tls: config.tls)], + overrides: StemStoreOverrides(backend: backendUrl), tasks: tasks, - backend: backend, signer: signer, ); // #endregion signing-producer-stem - final canvas = Canvas( - broker: broker, - backend: backend, - tasks: tasks, - ); + final canvas = client.createCanvas(tasks: tasks); final autoFill = _AutoFillController( - stem: stem, + enqueuer: client, enabled: _boolFromEnv( Platform.environment['ENQUEUER_AUTOFILL_ENABLED'], defaultValue: true, @@ -172,7 +161,7 @@ Future main(List args) async { } final name = (body['name'] as String?)?.trim(); final entity = (name == null || name.isEmpty) ? 'friend' : name; - final taskId = await stem.enqueue( + final taskId = await client.enqueue( taskSpec.name, args: { 'name': entity, @@ -221,7 +210,7 @@ Future main(List args) async { ); }) ..get('/group/', (Request request, String groupId) async { - final status = await backend.getGroup(groupId); + final status = await client.getGroupStatus(groupId); if (status == null) { return Response.notFound( jsonEncode({'error': 'Unknown group or expired results'}), @@ -266,8 +255,7 @@ Future main(List args) async { stdout.writeln('Shutting down enqueue service ($signal)...'); autoFill.stop(); await server.close(force: true); - await broker.close(); - await backend.close(); + await client.close(); exit(0); } @@ -300,14 +288,14 @@ SecurityContext? 
_buildHttpSecurityContext() { class _AutoFillController { _AutoFillController({ - required this.stem, + required this.enqueuer, required this.enabled, required this.interval, required this.batchSize, required this.failureEvery, }); - final Stem stem; + final TaskEnqueuer enqueuer; final bool enabled; final Duration interval; final int batchSize; @@ -368,7 +356,7 @@ class _AutoFillController { required bool shouldFail, Map extraMeta = const {}, }) { - return stem.enqueue( + return enqueuer.enqueue( spec.name, args: { 'name': label, diff --git a/packages/stem/example/mixed_cluster/enqueuer/bin/enqueue.dart b/packages/stem/example/mixed_cluster/enqueuer/bin/enqueue.dart index e5c53258..7ede601b 100644 --- a/packages/stem/example/mixed_cluster/enqueuer/bin/enqueue.dart +++ b/packages/stem/example/mixed_cluster/enqueuer/bin/enqueue.dart @@ -3,19 +3,18 @@ import 'dart:io'; import 'package:stem/stem.dart'; import 'package:stem_postgres/stem_postgres.dart'; -import 'package:stem_postgres/stem_postgres.dart'; import 'package:stem_redis/stem_redis.dart'; Future main(List args) async { final redisConfig = _configFromPrefix('REDIS_'); final postgresConfig = _configFromPrefix('POSTGRES_'); - final redisStem = await _buildRedisStem(redisConfig); - final postgresStem = await _buildPostgresStem(postgresConfig); + final redisClient = await _buildRedisClient(redisConfig); + final postgresClient = await _buildPostgresClient(postgresConfig); final redisItems = ['cache-warmup', 'metrics-snapshot']; for (final item in redisItems) { - final id = await redisStem.enqueue( + final id = await redisClient.enqueue( 'redis.only', args: {'task': item}, options: TaskOptions(queue: redisConfig.defaultQueue), @@ -25,7 +24,7 @@ Future main(List args) async { final postgresItems = ['billing-report', 'inventory-rollup']; for (final item in postgresItems) { - final id = await postgresStem.enqueue( + final id = await postgresClient.enqueue( 'postgres.only', args: {'task': item}, options: 
TaskOptions(queue: postgresConfig.defaultQueue), @@ -33,10 +32,8 @@ Future main(List args) async { stdout.writeln('Enqueued Postgres task $id for $item'); } - await (redisStem.broker as RedisStreamsBroker).close(); - await (redisStem.backend as RedisResultBackend).close(); - await (postgresStem.broker as PostgresBroker).close(); - await (postgresStem.backend as PostgresResultBackend).close(); + await redisClient.close(); + await postgresClient.close(); } StemConfig _configFromPrefix(String prefix) { @@ -52,16 +49,11 @@ StemConfig _configFromPrefix(String prefix) { return StemConfig.fromEnvironment(overrides); } -Future _buildRedisStem(StemConfig config) async { - final broker = await RedisStreamsBroker.connect( - config.brokerUrl, - tls: config.tls, - ); +Future _buildRedisClient(StemConfig config) async { final backendUrl = config.resultBackendUrl; if (backendUrl == null) { throw StateError('STEM_RESULT_BACKEND_URL must be set for Redis Stem'); } - final backend = await RedisResultBackend.connect(backendUrl, tls: config.tls); final tasks = >[ FunctionTaskHandler( @@ -71,27 +63,20 @@ Future _buildRedisStem(StemConfig config) async { ), ]; - return Stem( - broker: broker, + return StemClient.fromUrl( + config.brokerUrl, + adapters: [StemRedisAdapter(tls: config.tls)], + overrides: StemStoreOverrides(backend: backendUrl), tasks: tasks, - backend: backend, signer: PayloadSigner.maybe(config.signing), ); } -Future _buildPostgresStem(StemConfig config) async { - final broker = await PostgresBroker.connect( - config.brokerUrl, - applicationName: 'stem-mixed-enqueuer', - tls: config.tls, - ); +Future _buildPostgresClient(StemConfig config) async { final backendUrl = config.resultBackendUrl; if (backendUrl == null) { throw StateError('STEM_RESULT_BACKEND_URL must be set for Postgres Stem'); } - final backend = await PostgresResultBackend.connect( - connectionString: backendUrl, - ); final tasks = >[ FunctionTaskHandler( @@ -101,10 +86,16 @@ Future 
_buildPostgresStem(StemConfig config) async { ), ]; - return Stem( - broker: broker, + return StemClient.fromUrl( + config.brokerUrl, + adapters: [ + StemPostgresAdapter( + applicationName: 'stem-mixed-enqueuer', + tls: config.tls, + ), + ], + overrides: StemStoreOverrides(backend: backendUrl), tasks: tasks, - backend: backend, signer: PayloadSigner.maybe(config.signing), ); } diff --git a/packages/stem/example/ops_health_suite/bin/producer.dart b/packages/stem/example/ops_health_suite/bin/producer.dart index 8de24d63..32a170e2 100644 --- a/packages/stem/example/ops_health_suite/bin/producer.dart +++ b/packages/stem/example/ops_health_suite/bin/producer.dart @@ -1,14 +1,18 @@ import 'dart:io'; import 'package:stem/stem.dart'; +import 'package:stem_redis/stem_redis.dart'; import 'package:stem_ops_health_suite/shared.dart'; Future main() async { final config = StemConfig.fromEnvironment(); - final broker = await connectBroker(config.brokerUrl, tls: config.tls); final backendUrl = config.resultBackendUrl ?? 
      config.brokerUrl;
-  final backend = await connectBackend(backendUrl, tls: config.tls);
-  final tasks = buildTasks();
+  final client = await StemClient.fromUrl(
+    config.brokerUrl,
+    adapters: [StemRedisAdapter(tls: config.tls)],
+    overrides: StemStoreOverrides(backend: backendUrl),
+    tasks: buildTasks(),
+  );
 
   final taskCount = _parseInt('TASKS', fallback: 6, min: 1);
   final delayMs = _parseInt('DELAY_MS', fallback: 400, min: 0);
@@ -17,12 +21,11 @@ Future<void> main() async {
     '[producer] broker=${config.brokerUrl} backend=$backendUrl tasks=$taskCount',
   );
 
-  final stem = Stem(broker: broker, tasks: tasks, backend: backend);
   const options = TaskOptions(queue: opsQueue);
   for (var i = 0; i < taskCount; i += 1) {
     final label = 'health-${i + 1}';
-    final id = await stem.enqueue(
+    final id = await client.enqueue(
       'ops.ping',
       options: options,
       args: {'label': label, 'delayMs': delayMs},
@@ -30,8 +33,7 @@ Future<void> main() async {
     stdout.writeln('[producer] enqueued $label id=$id');
   }
 
-  await broker.close();
-  await backend.close();
+  await client.close();
 }
 
 int _parseInt(String key, {required int fallback, int min = 0}) {
diff --git a/packages/stem/example/otel_metrics/bin/worker.dart b/packages/stem/example/otel_metrics/bin/worker.dart
index 36974825..e351c9b5 100644
--- a/packages/stem/example/otel_metrics/bin/worker.dart
+++ b/packages/stem/example/otel_metrics/bin/worker.dart
@@ -3,10 +3,12 @@
 import 'dart:io';
 
 import 'package:stem/stem.dart';
 
+final pingDefinition = TaskDefinition.noArgs(name: 'metrics.ping');
+
 Future<void> main() async {
   final tasks = <TaskHandler<Object?>>[
     FunctionTaskHandler(
-      name: 'metrics.ping',
+      name: pingDefinition.name,
       entrypoint: (context, _) async {
         // Simulate a bit of work.
         await Future.delayed(const Duration(milliseconds: 150));
@@ -16,9 +18,6 @@
     ),
   ];
 
-  final broker = InMemoryBroker();
-  final backend = InMemoryResultBackend();
-
   final otlpEndpoint = Platform.environment['STEM_OTLP_ENDPOINT'] ??
      'http://localhost:4318/v1/metrics';
@@ -28,16 +27,17 @@
     metricExporters: ['otlp:$otlpEndpoint'],
   );
 
-  final worker = Worker(
-    broker: broker,
+  final client = await StemClient.inMemory(
     tasks: tasks,
-    backend: backend,
-    consumerName: 'otel-demo-worker',
-    observability: observability,
-    heartbeatTransport: const NoopHeartbeatTransport(),
   );
-
-  final stem = Stem(broker: broker, tasks: tasks, backend: backend);
+  final worker = await client.createWorker(
+    workerConfig: const StemWorkerConfig(
+      consumerName: 'otel-demo-worker',
+      heartbeatTransport: NoopHeartbeatTransport(),
+    ).copyWith(
+      observability: observability,
+    ),
+  );
 
   await worker.start();
   print(
@@ -45,6 +45,6 @@ Future<void> main() async {
   );
 
   Timer.periodic(const Duration(seconds: 1), (_) async {
-    await stem.enqueue('metrics.ping');
+    await pingDefinition.enqueue(client);
   });
 }
diff --git a/packages/stem/example/persistent_sleep.dart b/packages/stem/example/persistent_sleep.dart
index cf5555d2..340a4b06 100644
--- a/packages/stem/example/persistent_sleep.dart
+++ b/packages/stem/example/persistent_sleep.dart
@@ -7,39 +7,36 @@ import 'package:stem/stem.dart';
 /// completes on the next invocation.
 Future<void> main() async {
   var iterations = 0;
+  final sleepLoop = Flow(
+    name: 'sleep.loop.workflow',
+    build: (flow) {
+      flow.step('loop', (ctx) async {
+        iterations += 1;
+        if (iterations == 1) {
+          ctx.sleep(const Duration(milliseconds: 100));
+          return 'waiting';
+        }
+        return 'resumed';
+      });
+    },
+  );
   final app = await StemWorkflowApp.inMemory(
-    flows: [
-      Flow(
-        name: 'sleep.loop.workflow',
-        build: (flow) {
-          flow.step('loop', (ctx) async {
-            iterations += 1;
-            if (iterations == 1) {
-              ctx.sleep(const Duration(milliseconds: 100));
-              return 'waiting';
-            }
-            return 'resumed';
-          });
-        },
-      ),
-    ],
+    flows: [sleepLoop],
   );
 
-  final runId = await app.startWorkflow('sleep.loop.workflow');
-  await app.runtime.executeRun(runId);
+  final runId = await sleepLoop.start(app);
+  await app.executeRun(runId);
 
   // After the delay elapses, the runtime should resume without the step
   // manually inspecting resume data.
   await Future.delayed(const Duration(milliseconds: 150));
-  final due = await app.store.dueRuns(DateTime.now());
+  final due = await app.resumeDueRuns(DateTime.now());
   for (final id in due) {
-    final state = await app.store.get(id);
-    await app.store.markResumed(id, data: state?.suspensionData);
-    await app.runtime.executeRun(id);
+    await app.executeRun(id);
   }
 
-  final completed = await app.store.get(runId);
-  print('Workflow completed with result: ${completed?.result}');
+  final completed = await sleepLoop.waitFor(app, runId);
+  print('Workflow completed with result: ${completed?.value}');
 
   await app.close();
 }
diff --git a/packages/stem/example/postgres_tls/bin/enqueue.dart b/packages/stem/example/postgres_tls/bin/enqueue.dart
index ac4284f1..7065877c 100644
--- a/packages/stem/example/postgres_tls/bin/enqueue.dart
+++ b/packages/stem/example/postgres_tls/bin/enqueue.dart
@@ -8,11 +8,6 @@ import 'package:stem_redis/stem_redis.dart';
 
 Future<void> main(List<String> args) async {
   final config = StemConfig.fromEnvironment();
-  final broker = await RedisStreamsBroker.connect(
-
config.brokerUrl, - tls: config.tls, - ); - final backendUrl = config.resultBackendUrl; if (backendUrl == null) { throw StateError( @@ -20,10 +15,6 @@ Future main(List args) async { ); } - final backend = await PostgresResultBackend.connect( - connectionString: backendUrl, - ); - final tasks = >[ FunctionTaskHandler( name: 'reports.generate', @@ -32,16 +23,27 @@ Future main(List args) async { ), ]; - final stem = Stem( - broker: broker, + final client = await StemClient.create( + broker: StemBrokerFactory( + create: () => RedisStreamsBroker.connect( + config.brokerUrl, + tls: config.tls, + ), + dispose: (broker) => broker.close(), + ), + backend: StemBackendFactory( + create: () => PostgresResultBackend.connect( + connectionString: backendUrl, + ), + dispose: (backend) => backend.close(), + ), tasks: tasks, - backend: backend, signer: PayloadSigner.maybe(config.signing), ); final regions = ['emea', 'amer', 'apac']; for (final region in regions) { - final id = await stem.enqueue( + final id = await client.enqueue( 'reports.generate', args: {'region': region}, options: TaskOptions(queue: config.defaultQueue), @@ -49,8 +51,7 @@ Future main(List args) async { stdout.writeln('Enqueued TLS demo task $id for $region'); } - await broker.close(); - await backend.close(); + await client.close(); } FutureOr _noop( diff --git a/packages/stem/example/postgres_worker/enqueuer/bin/enqueue.dart b/packages/stem/example/postgres_worker/enqueuer/bin/enqueue.dart index a15e9169..d9e3bb6c 100644 --- a/packages/stem/example/postgres_worker/enqueuer/bin/enqueue.dart +++ b/packages/stem/example/postgres_worker/enqueuer/bin/enqueue.dart @@ -7,12 +7,6 @@ import 'package:stem_postgres/stem_postgres.dart'; Future main(List args) async { final config = StemConfig.fromEnvironment(); - final broker = await PostgresBroker.connect( - config.brokerUrl, - applicationName: 'stem-postgres-enqueuer', - tls: config.tls, - ); - final backendUrl = config.resultBackendUrl; if (backendUrl == null) { throw 
StateError( @@ -20,10 +14,6 @@ Future main(List args) async { ); } - final backend = await PostgresResultBackend.connect( - connectionString: backendUrl, - ); - final tasks = >[ FunctionTaskHandler( name: 'report.generate', @@ -32,16 +22,22 @@ Future main(List args) async { ), ]; - final stem = Stem( - broker: broker, + final client = await StemClient.fromUrl( + config.brokerUrl, + adapters: [ + StemPostgresAdapter( + applicationName: 'stem-postgres-enqueuer', + tls: config.tls, + ), + ], + overrides: StemStoreOverrides(backend: backendUrl), tasks: tasks, - backend: backend, signer: PayloadSigner.maybe(config.signing), ); final regions = ['us-east', 'eu-west', 'ap-south']; for (final region in regions) { - final taskId = await stem.enqueue( + final taskId = await client.enqueue( 'report.generate', args: {'region': region}, options: TaskOptions(queue: config.defaultQueue), @@ -49,8 +45,7 @@ Future main(List args) async { stdout.writeln('Enqueued report job $taskId for $region'); } - await broker.close(); - await backend.close(); + await client.close(); } FutureOr _noopEntrypoint( diff --git a/packages/stem/example/progress_heartbeat/bin/producer.dart b/packages/stem/example/progress_heartbeat/bin/producer.dart index 7a235f09..37ad95e7 100644 --- a/packages/stem/example/progress_heartbeat/bin/producer.dart +++ b/packages/stem/example/progress_heartbeat/bin/producer.dart @@ -1,6 +1,7 @@ import 'dart:io'; import 'package:stem/stem.dart'; +import 'package:stem_redis/stem_redis.dart'; import 'package:stem_progress_heartbeat/shared.dart'; Future main() async { @@ -17,19 +18,16 @@ Future main() async { '[producer] broker=$brokerUrl backend=$backendUrl tasks=$taskCount', ); - final broker = await connectBroker(brokerUrl); - final backend = await connectBackend(backendUrl); - final tasks = buildTasks(); - - final stem = Stem( - broker: broker, - tasks: tasks, - backend: backend, + final client = await StemClient.fromUrl( + brokerUrl, + adapters: const [StemRedisAdapter()], + 
    overrides: StemStoreOverrides(backend: backendUrl),
+    tasks: buildTasks(),
   );
 
   const taskOptions = TaskOptions(queue: progressQueue);
   for (var i = 0; i < taskCount; i += 1) {
-    final id = await stem.enqueue(
+    final id = await client.enqueue(
       'progress.demo',
       options: taskOptions,
       args: {'steps': steps, 'delayMs': delayMs},
@@ -37,6 +35,5 @@
     stdout.writeln('[producer] enqueued progress.demo id=$id');
   }
 
-  await broker.close();
-  await backend.close();
+  await client.close();
 }
diff --git a/packages/stem/example/rate_limit_delay/bin/producer.dart b/packages/stem/example/rate_limit_delay/bin/producer.dart
index 0262338f..fc6e3404 100644
--- a/packages/stem/example/rate_limit_delay/bin/producer.dart
+++ b/packages/stem/example/rate_limit_delay/bin/producer.dart
@@ -1,6 +1,7 @@
 import 'dart:io';
 
 import 'package:stem/stem.dart';
+import 'package:stem_redis/stem_redis.dart';
 import 'package:stem_rate_limit_delay_demo/shared.dart';
 
 Future<void> main() async {
@@ -11,14 +12,13 @@ Future<void> main() async {
   stdout.writeln('[producer] connecting broker=$brokerUrl backend=$backendUrl');
 
-  final broker = await connectBroker(brokerUrl);
-  final backend = await connectBackend(backendUrl);
   final tasks = buildTasks();
   final routing = buildRoutingRegistry();
-  final stem = buildStem(
-    broker: broker,
+  final client = await StemClient.fromUrl(
+    brokerUrl,
+    adapters: const [StemRedisAdapter()],
+    overrides: StemStoreOverrides(backend: backendUrl),
     tasks: tasks,
-    backend: backend,
     routing: routing,
   );
@@ -45,7 +45,7 @@
     final appliedPriority = route.effectivePriority(priority);
-    final id = await stem.enqueue(
+    final id = await client.enqueue(
       taskName(),
       args: {
         'job': i + 1,
@@ -75,7 +75,6 @@
   stdout.writeln('[producer] all jobs queued.
Waiting 5s before shutdown...'); await Future.delayed(const Duration(seconds: 5)); - await broker.close(); - await backend.close(); + await client.close(); stdout.writeln('[producer] done.'); } diff --git a/packages/stem/example/rate_limit_delay/lib/shared.dart b/packages/stem/example/rate_limit_delay/lib/shared.dart index c7cfa765..849b838f 100644 --- a/packages/stem/example/rate_limit_delay/lib/shared.dart +++ b/packages/stem/example/rate_limit_delay/lib/shared.dart @@ -39,20 +39,6 @@ RoutingRegistry buildRoutingRegistry() { return RoutingRegistry(config); } -Stem buildStem({ - required Broker broker, - required Iterable> tasks, - ResultBackend? backend, - RoutingRegistry? routing, -}) { - return Stem( - broker: broker, - tasks: tasks, - backend: backend, - routing: routing, - ); -} - Future connectBroker(String uri) => RedisStreamsBroker.connect(uri); diff --git a/packages/stem/example/redis_postgres_worker/enqueuer/bin/enqueue.dart b/packages/stem/example/redis_postgres_worker/enqueuer/bin/enqueue.dart index ef8ea10c..021e751c 100644 --- a/packages/stem/example/redis_postgres_worker/enqueuer/bin/enqueue.dart +++ b/packages/stem/example/redis_postgres_worker/enqueuer/bin/enqueue.dart @@ -8,11 +8,6 @@ import 'package:stem_redis/stem_redis.dart'; Future main(List args) async { final config = StemConfig.fromEnvironment(); - final broker = await RedisStreamsBroker.connect( - config.brokerUrl, - tls: config.tls, - ); - final backendUrl = config.resultBackendUrl; if (backendUrl == null) { throw StateError( @@ -20,10 +15,6 @@ Future main(List args) async { ); } - final backend = await PostgresResultBackend.connect( - connectionString: backendUrl, - ); - final tasks = >[ FunctionTaskHandler( name: 'hybrid.process', @@ -32,16 +23,20 @@ Future main(List args) async { ), ]; - final stem = Stem( - broker: broker, + final client = await StemClient.fromUrl( + config.brokerUrl, + adapters: [ + StemRedisAdapter(tls: config.tls), + StemPostgresAdapter(), + ], + overrides: 
      StemStoreOverrides(backend: backendUrl),
     tasks: tasks,
-    backend: backend,
     signer: PayloadSigner.maybe(config.signing),
   );
 
   final items = ['alpha', 'beta', 'gamma'];
   for (final item in items) {
-    final taskId = await stem.enqueue(
+    final taskId = await client.enqueue(
       'hybrid.process',
       args: {'item': item},
       options: TaskOptions(queue: config.defaultQueue),
@@ -49,8 +44,7 @@
     stdout.writeln('Enqueued hybrid job $taskId for $item');
   }
 
-  await broker.close();
-  await backend.close();
+  await client.close();
 }
 
 FutureOr _noop(
diff --git a/packages/stem/example/retry_task/bin/producer.dart b/packages/stem/example/retry_task/bin/producer.dart
index f05e6549..09378fbb 100644
--- a/packages/stem/example/retry_task/bin/producer.dart
+++ b/packages/stem/example/retry_task/bin/producer.dart
@@ -9,12 +9,14 @@ Future<void> main() async {
   final brokerUrl =
       Platform.environment['STEM_BROKER_URL'] ?? 'redis://redis:6379/0';
-  final broker = await RedisStreamsBroker.connect(brokerUrl);
-  final tasks = buildTasks();
   final subscriptions = attachLogging('producer');
-  final stem = Stem(broker: broker, tasks: tasks);
+  final client = await StemClient.fromUrl(
+    brokerUrl,
+    adapters: const [StemRedisAdapter()],
+    tasks: buildTasks(),
+  );
 
-  final taskId = await stem.enqueue(
+  final taskId = await client.enqueue(
     'tasks.always_fail',
     options: const TaskOptions(maxRetries: 3, queue: 'retry-demo'),
     meta: const {'maxRetries': 3},
@@ -28,5 +30,5 @@
   }
 
   await Future.delayed(const Duration(seconds: 1));
-  await broker.close();
+  await client.close();
 }
diff --git a/packages/stem/example/routing_parity/bin/publisher.dart b/packages/stem/example/routing_parity/bin/publisher.dart
index d41f78aa..4b5e411d 100644
--- a/packages/stem/example/routing_parity/bin/publisher.dart
+++ b/packages/stem/example/routing_parity/bin/publisher.dart
@@ -10,14 +10,10 @@ Future<void> main() async {
   final routing = buildRoutingRegistry();
   final tasks = buildDemoTasks();
-
-
final broker = await RedisStreamsBroker.connect( + final client = await StemClient.fromUrl( redisUrl, - namespace: 'stem-routing-demo', - ); - - final stem = Stem( - broker: broker, + adapters: const [StemRedisAdapter(namespace: 'stem-routing-demo')], + overrides: const StemStoreOverrides(backend: 'memory://'), tasks: tasks, routing: routing, ); @@ -25,27 +21,27 @@ Future main() async { stdout.writeln('Publishing demo tasks using routing parity features...'); await _enqueueWithRouting( - stem, + client, routing, 'billing.invoice', args: const {'invoiceId': 101}, ); await _enqueueWithRouting( - stem, + client, routing, 'billing.invoice', args: const {'invoiceId': 102}, ); await _enqueueWithRouting( - stem, + client, routing, 'reports.generate', args: const {'subject': 'Quarterly summary', 'priority': 'low'}, options: const TaskOptions(priority: 1), ); await _enqueueWithRouting( - stem, + client, routing, 'reports.generate', args: const {'subject': 'Incident post-mortem', 'priority': 'high'}, @@ -53,7 +49,7 @@ Future main() async { ); await _enqueueWithRouting( - stem, + client, routing, 'ops.status', args: const {'message': 'Maintenance window begins at 02:00 UTC.'}, @@ -61,11 +57,11 @@ Future main() async { stdout .writeln('All demo tasks enqueued. 
Watch the worker output for results.'); - await broker.close(); + await client.close(); } Future _enqueueWithRouting( - Stem stem, + TaskEnqueuer enqueuer, RoutingRegistry routing, String name, { Map args = const {}, @@ -88,7 +84,7 @@ Future _enqueueWithRouting( ...meta, }; - final id = await stem.enqueue( + final id = await enqueuer.enqueue( name, args: args, headers: headers, diff --git a/packages/stem/example/signals_demo/bin/producer.dart b/packages/stem/example/signals_demo/bin/producer.dart index abd7ce9a..e3c108a8 100644 --- a/packages/stem/example/signals_demo/bin/producer.dart +++ b/packages/stem/example/signals_demo/bin/producer.dart @@ -12,28 +12,26 @@ Future main() async { registerSignalLogging('producer'); - final broker = await RedisStreamsBroker.connect(brokerUrl); - final tasks = buildTasks(); - final stem = Stem( - broker: broker, - tasks: tasks, - backend: InMemoryResultBackend(), + final client = await StemClient.fromUrl( + brokerUrl, + adapters: const [StemRedisAdapter()], + tasks: buildTasks(), ); final timer = Timer.periodic(const Duration(seconds: 5), (_) async { - await stem.enqueue( + await client.enqueue( 'tasks.hello', args: {'name': 'from-producer'}, ); - await stem.enqueue('tasks.flaky'); - await stem.enqueue('tasks.always_fail'); + await flakyTaskDefinition.enqueue(client); + await alwaysFailTaskDefinition.enqueue(client); }); void scheduleShutdown(ProcessSignal signal) async { // ignore: avoid_print print('[signals][producer] received $signal, shutting down'); timer.cancel(); - await broker.close(); + await client.close(); exit(0); } diff --git a/packages/stem/example/signals_demo/lib/shared.dart b/packages/stem/example/signals_demo/lib/shared.dart index 39b3c4db..333d5f7b 100644 --- a/packages/stem/example/signals_demo/lib/shared.dart +++ b/packages/stem/example/signals_demo/lib/shared.dart @@ -3,6 +3,11 @@ import 'dart:convert'; import 'package:stem/stem.dart'; +final flakyTaskDefinition = TaskDefinition.noArgs(name: 'tasks.flaky'); 
+final alwaysFailTaskDefinition = TaskDefinition.noArgs(
+  name: 'tasks.always_fail',
+);
+
 List<TaskHandler<Object?>> buildTasks() => [
   FunctionTaskHandler(
     name: 'tasks.hello',
@@ -10,12 +15,12 @@ List<TaskHandler<Object?>> buildTasks() => [
     options: const TaskOptions(maxRetries: 0),
   ),
   FunctionTaskHandler(
-    name: 'tasks.flaky',
+    name: flakyTaskDefinition.name,
     entrypoint: _flakyEntrypoint,
     options: const TaskOptions(maxRetries: 2),
   ),
   FunctionTaskHandler(
-    name: 'tasks.always_fail',
+    name: alwaysFailTaskDefinition.name,
     entrypoint: _alwaysFailEntrypoint,
     options: const TaskOptions(maxRetries: 1),
   ),
diff --git a/packages/stem/example/signing_key_rotation/bin/producer.dart b/packages/stem/example/signing_key_rotation/bin/producer.dart
index 4f95af13..c5bef9f7 100644
--- a/packages/stem/example/signing_key_rotation/bin/producer.dart
+++ b/packages/stem/example/signing_key_rotation/bin/producer.dart
@@ -1,25 +1,25 @@
 import 'dart:io';
 
 import 'package:stem/stem.dart';
+import 'package:stem_redis/stem_redis.dart';
 import 'package:stem_signing_key_rotation/shared.dart';
 
 Future<void> main() async {
   // #region signing-rotation-producer-config
   final config = StemConfig.fromEnvironment();
   // #endregion signing-rotation-producer-config
-  final broker = await connectBroker(config.brokerUrl, tls: config.tls);
   final backendUrl = config.resultBackendUrl ??
config.brokerUrl; - final backend = await connectBackend(backendUrl, tls: config.tls); // #region signing-rotation-producer-signer final signer = PayloadSigner.maybe(config.signing); // #endregion signing-rotation-producer-signer final tasks = buildTasks(); // #region signing-rotation-producer-stem - final stem = Stem( - broker: broker, + final client = await StemClient.fromUrl( + config.brokerUrl, + adapters: [StemRedisAdapter(tls: config.tls)], + overrides: StemStoreOverrides(backend: backendUrl), tasks: tasks, - backend: backend, signer: signer, ); // #endregion signing-rotation-producer-stem @@ -40,7 +40,7 @@ Future main() async { const options = TaskOptions(queue: rotationQueue); for (var i = 0; i < taskCount; i += 1) { final label = 'rotation-${i + 1}'; - final id = await stem.enqueue( + final id = await client.enqueue( 'rotation.demo', options: options, args: {'label': label, 'key': keyId}, @@ -49,8 +49,7 @@ Future main() async { } // #endregion signing-rotation-producer-enqueue - await broker.close(); - await backend.close(); + await client.close(); } int _parseInt(String key, {required int fallback, int min = 0}) { diff --git a/packages/stem/example/stack_autowire.dart b/packages/stem/example/stack_autowire.dart index f1a3a4e2..489de415 100644 --- a/packages/stem/example/stack_autowire.dart +++ b/packages/stem/example/stack_autowire.dart @@ -4,11 +4,13 @@ import 'package:stem/stem.dart'; import 'package:stem_redis/stem_redis.dart'; class PingTask implements TaskHandler { + static final definition = TaskDefinition.noArgs(name: 'demo.ping'); + @override - String get name => 'demo.ping'; + String get name => definition.name; @override - TaskMetadata get metadata => const TaskMetadata(); + TaskMetadata get metadata => definition.metadata; @override TaskOptions get options => const TaskOptions(maxRetries: 0); @@ -61,12 +63,13 @@ Future main() async { ); try { - await app.start(); await workflowApp.start(); await beat.start(); - await 
app.stem.enqueue('demo.ping'); - await Future.delayed(const Duration(seconds: 1)); + await PingTask.definition.enqueueAndWait( + app, + timeout: const Duration(seconds: 1), + ); } finally { await beat.stop(); await workflowApp.shutdown(); diff --git a/packages/stem/example/stem_example.dart b/packages/stem/example/stem_example.dart index 1edd6285..a5d464e8 100644 --- a/packages/stem/example/stem_example.dart +++ b/packages/stem/example/stem_example.dart @@ -1,4 +1,3 @@ -import 'dart:async'; import 'package:stem/stem.dart'; import 'package:stem_redis/stem_redis.dart'; @@ -47,27 +46,25 @@ class HelloArgs { Future main() async { // #region getting-started-runtime-setup - final broker = await RedisStreamsBroker.connect('redis://localhost:6379'); - final backend = await RedisResultBackend.connect('redis://localhost:6379/1'); - - final stem = Stem(broker: broker, backend: backend, tasks: [HelloTask()]); - final worker = Worker( - broker: broker, - backend: backend, + final app = await StemApp.fromUrl( + 'redis://localhost:6379', tasks: [HelloTask()], + adapters: const [StemRedisAdapter()], + overrides: const StemStoreOverrides(backend: 'redis://localhost:6379/1'), ); // #endregion getting-started-runtime-setup // #region getting-started-enqueue - unawaited(worker.start()); // Map-based enqueue for quick scripts or one-off calls. - await stem.enqueue('demo.hello', args: {'name': 'Stem'}); + final taskId = await app.enqueue('demo.hello', args: {'name': 'Stem'}); + await app.waitForTask(taskId, timeout: const Duration(seconds: 2)); // Typed helper with TaskDefinition for compile-time safety. 
- await stem.enqueueCall(HelloTask.definition(const HelloArgs(name: 'Stem'))); - await Future.delayed(const Duration(seconds: 1)); - await worker.shutdown(); - await broker.close(); - await backend.close(); + await HelloTask.definition.enqueueAndWait( + app, + const HelloArgs(name: 'Stem'), + timeout: const Duration(seconds: 2), + ); + await app.close(); // #endregion getting-started-enqueue } diff --git a/packages/stem/example/task_context_mixed/README.md b/packages/stem/example/task_context_mixed/README.md index f1c94323..a80fc637 100644 --- a/packages/stem/example/task_context_mixed/README.md +++ b/packages/stem/example/task_context_mixed/README.md @@ -5,6 +5,9 @@ handlers) and `TaskInvocationContext` (inline + isolate entrypoints). It also shows the full `TaskEnqueueOptions` / `TaskRetryPolicy` surface that mirrors Celery-style `apply_async` controls. +Both concrete contexts share the same `TaskExecutionContext` surface for +enqueueing, progress reporting, workflow/event calls, and retry requests. + ## Requirements - Dart 3.3+ diff --git a/packages/stem/example/task_context_mixed/bin/enqueue.dart b/packages/stem/example/task_context_mixed/bin/enqueue.dart index db625577..4becff59 100644 --- a/packages/stem/example/task_context_mixed/bin/enqueue.dart +++ b/packages/stem/example/task_context_mixed/bin/enqueue.dart @@ -4,16 +4,19 @@ import 'package:stem/stem.dart'; import 'package:stem_task_context_mixed_example/shared.dart'; Future main(List args) async { - final broker = await connectBroker(); final tasks = buildTasks(); - final stem = Stem(broker: broker, tasks: tasks); + final client = await StemClient.create( + broker: StemBrokerFactory(create: connectBroker), + backend: StemBackendFactory.inMemory(), + tasks: tasks, + ); final forceFail = args.contains('--fail'); final overwrite = args.contains('--overwrite'); final runId = DateTime.now().millisecondsSinceEpoch.toString(); final taskId = overwrite ? 
'task-context-mixed' : null; - final firstId = await stem.enqueue( + final firstId = await client.enqueue( 'demo.inline_parent', args: {'runId': runId, 'forceFail': forceFail}, enqueueOptions: TaskEnqueueOptions(taskId: taskId, queue: mixedQueue), @@ -23,7 +26,7 @@ Future main(List args) async { ); if (overwrite) { - final overwriteId = await stem.enqueue( + final overwriteId = await client.enqueue( 'demo.inline_parent', args: {'runId': '${runId}_overwrite', 'forceFail': forceFail}, enqueueOptions: TaskEnqueueOptions(taskId: taskId, queue: mixedQueue), @@ -31,5 +34,5 @@ Future main(List args) async { stdout.writeln('Overwrote task id=$overwriteId'); } - await broker.close(); + await client.close(); } diff --git a/packages/stem/example/task_context_mixed/lib/shared.dart b/packages/stem/example/task_context_mixed/lib/shared.dart index 55d28352..aff34176 100644 --- a/packages/stem/example/task_context_mixed/lib/shared.dart +++ b/packages/stem/example/task_context_mixed/lib/shared.dart @@ -218,12 +218,11 @@ class InlineCoordinatorTask extends TaskHandler { ), ); - final auditId = await context.enqueueCall( - auditDefinition.call( - AuditArgs( - runId: runId, - message: 'inline parent scheduled child tasks', - ), + final auditId = await auditDefinition.enqueue( + context, + AuditArgs( + runId: runId, + message: 'inline parent scheduled child tasks', ), ); @@ -251,12 +250,12 @@ class InlineCoordinatorTask extends TaskHandler { publishConnection: const {'adapter': 'sqlite'}, producer: const {'app': 'task-context-mixed'}, link: [ - linkSuccessDefinition.call( + linkSuccessDefinition.buildCall( {'runId': runId, 'source': 'link'}, ), ], linkError: [ - linkErrorDefinition.call( + linkErrorDefinition.buildCall( {'runId': runId, 'source': 'link_error'}, ), ], @@ -286,14 +285,13 @@ FutureOr inlineEntrypoint( '[inline_entrypoint] id=${context.id} attempt=${context.attempt} runId=$runId meta=${context.meta}', ); - await context.enqueueCall( - auditDefinition.call( - AuditArgs( - 
runId: runId, - message: 'inline entrypoint completed', - ), - enqueueOptions: const TaskEnqueueOptions(priority: 4), + await auditDefinition.enqueue( + context, + AuditArgs( + runId: runId, + message: 'inline entrypoint completed', ), + enqueueOptions: const TaskEnqueueOptions(priority: 4), ); return 'inline-ok'; @@ -308,21 +306,17 @@ FutureOr isolateChildEntrypoint( '[isolate_child] id=${context.id} attempt=${context.attempt} runId=$runId', ); - final call = context - .enqueueBuilder( - definition: auditDefinition, - args: AuditArgs( - runId: runId, - message: 'isolate child used enqueueBuilder', - ), - ) - .header('x-child', 'isolate') - .meta('origin', 'isolate-child') - .delay(const Duration(milliseconds: 200)) - .enqueueOptions(const TaskEnqueueOptions(shadow: 'audit-shadow')) - .build(); - - await context.enqueueCall(call); + await auditDefinition.enqueue( + context, + AuditArgs( + runId: runId, + message: 'isolate child used direct enqueue', + ), + headers: const {'x-child': 'isolate'}, + meta: const {'origin': 'isolate-child'}, + notBefore: stemNow().add(const Duration(milliseconds: 200)), + enqueueOptions: const TaskEnqueueOptions(shadow: 'audit-shadow'), + ); return 'isolate-ok'; } diff --git a/packages/stem/example/task_usage_patterns.dart b/packages/stem/example/task_usage_patterns.dart index b7879ebc..18a4d382 100644 --- a/packages/stem/example/task_usage_patterns.dart +++ b/packages/stem/example/task_usage_patterns.dart @@ -14,30 +14,39 @@ final childDefinition = TaskDefinition( metadata: const TaskMetadata(description: 'Typed child task example'), ); +final invocationParentDefinition = TaskDefinition.noArgs( + name: 'tasks.invocation_parent', +); + class ParentTask extends TaskHandler { + static final definition = TaskDefinition.noArgs( + name: 'tasks.parent', + metadata: TaskMetadata( + description: 'Parent task that enqueues follow-up work.', + ), + ); + @override - String get name => 'tasks.parent'; + String get name => definition.name; @override 
TaskOptions get options => const TaskOptions(queue: 'default'); @override - TaskMetadata get metadata => const TaskMetadata( - description: 'Parent task that enqueues follow-up work.', - ); + TaskMetadata get metadata => definition.metadata; @override Future call(TaskContext context, Map args) async { - await context.enqueue( - 'tasks.child', - args: {'value': 'from-parent'}, + await childDefinition.enqueue( + context, + const ChildArgs('from-parent'), enqueueOptions: TaskEnqueueOptions( countdown: const Duration(milliseconds: 200), queue: 'default', ), ); - await context.enqueueCall(childDefinition.call(const ChildArgs('typed'))); + await childDefinition.enqueue(context, const ChildArgs('typed')); } } @@ -45,7 +54,8 @@ FutureOr childEntrypoint( TaskInvocationContext context, Map args, ) { - final value = args['value'] as String? ?? 'unknown'; + final value = context.argOr('value', 'unknown'); + // Example output keeps the script runnable without adding logging setup. // ignore: avoid_print print('[child] value=$value attempt=${context.attempt}'); return 'ok'; @@ -55,16 +65,12 @@ FutureOr invocationParentEntrypoint( TaskInvocationContext context, Map args, ) async { - final call = context - .enqueueBuilder( - definition: childDefinition, - args: const ChildArgs('from-invocation-builder'), - ) - .priority(5) - .delay(const Duration(milliseconds: 100)) - .build(); - - await context.enqueueCall(call); + await childDefinition.enqueue( + context, + const ChildArgs('from-invocation-builder'), + options: const TaskOptions(priority: 5), + notBefore: stemNow().add(const Duration(milliseconds: 100)), + ); return null; } @@ -78,30 +84,41 @@ Future main() async { metadata: childDefinition.metadata, ), FunctionTaskHandler.inline( - name: 'tasks.invocation_parent', + name: invocationParentDefinition.name, entrypoint: invocationParentEntrypoint, options: const TaskOptions(queue: 'default'), + metadata: invocationParentDefinition.metadata, ), ]; - final broker = InMemoryBroker(); 
- final backend = InMemoryResultBackend(); - final worker = Worker( - broker: broker, - backend: backend, + final app = await StemApp.inMemory( tasks: tasks, - consumerName: 'example-worker', + workerConfig: const StemWorkerConfig(consumerName: 'example-worker'), ); - final stem = Stem(broker: broker, backend: backend, tasks: tasks); - unawaited(worker.start()); + await ParentTask.definition.enqueue(app); + await invocationParentDefinition.enqueue(app); + final directTaskId = await childDefinition.enqueue( + app, + const ChildArgs('direct-call'), + ); + final directResult = await childDefinition.waitFor( + app, + directTaskId, + timeout: const Duration(seconds: 1), + ); + // Example output keeps the script runnable without adding logging setup. + // ignore: avoid_print + print('[direct] result=${directResult?.value}'); - await stem.enqueue('tasks.parent', args: const {}); - await stem.enqueue('tasks.invocation_parent', args: const {}); - await stem.enqueueCall(childDefinition.call(const ChildArgs('direct-call'))); + final inlineResult = await childDefinition.enqueueAndWait( + app, + const ChildArgs('inline-wait'), + timeout: const Duration(seconds: 1), + ); + // Example output keeps the script runnable without adding logging setup. 
+ // ignore: avoid_print + print('[inline] result=${inlineResult?.value}'); - await Future.delayed(const Duration(seconds: 1)); - await worker.shutdown(); - await backend.close(); - await broker.close(); + await app.close(); } diff --git a/packages/stem/example/unique_tasks/unique_task_example.dart b/packages/stem/example/unique_tasks/unique_task_example.dart index d42012ac..d440bb12 100644 --- a/packages/stem/example/unique_tasks/unique_task_example.dart +++ b/packages/stem/example/unique_tasks/unique_task_example.dart @@ -43,12 +43,6 @@ Future main() async { dbFile.createSync(recursive: true); } - final broker = InMemoryBroker(); - final backend = await SqliteResultBackend.open( - dbFile, - defaultTtl: const Duration(hours: 1), - groupDefaultTtl: const Duration(hours: 1), - ); final lockStore = InMemoryLockStore(); final coordinator = UniqueTaskCoordinator( lockStore: lockStore, @@ -59,26 +53,32 @@ Future main() async { final tasks = [SendDigestTask()]; // #region unique-task-stem-worker - final stem = Stem( - broker: broker, - backend: backend, + final client = await StemClient.create( + broker: StemBrokerFactory.inMemory(), + backend: StemBackendFactory( + create: () => SqliteResultBackend.open( + dbFile, + defaultTtl: const Duration(hours: 1), + groupDefaultTtl: const Duration(hours: 1), + ), + dispose: (backend) => backend.close(), + ), tasks: tasks, uniqueTaskCoordinator: coordinator, ); - final worker = Worker( - broker: broker, - backend: backend, - tasks: tasks, - uniqueTaskCoordinator: coordinator, - queue: 'email', - consumerName: 'unique-worker', + final worker = await client.createWorker( + workerConfig: StemWorkerConfig( + uniqueTaskCoordinator: coordinator, + queue: 'email', + consumerName: 'unique-worker', + ), ); // #endregion unique-task-stem-worker unawaited(worker.start()); // #region unique-task-enqueue - final firstId = await stem.enqueue( + final firstId = await client.enqueue( 'email.sendDigest', args: const {'userId': 42}, options: const 
TaskOptions( @@ -88,7 +88,7 @@ Future main() async { ), ); // #endregion unique-task-enqueue - final secondId = await stem.enqueue( + final secondId = await client.enqueue( 'email.sendDigest', args: const {'userId': 42}, options: const TaskOptions( @@ -104,6 +104,5 @@ Future main() async { await Future.delayed(const Duration(seconds: 2)); await worker.shutdown(); - await broker.close(); - await backend.close(); + await client.close(); } diff --git a/packages/stem/example/worker_control_lab/bin/producer.dart b/packages/stem/example/worker_control_lab/bin/producer.dart index 2f64a78c..b5ab3b3e 100644 --- a/packages/stem/example/worker_control_lab/bin/producer.dart +++ b/packages/stem/example/worker_control_lab/bin/producer.dart @@ -1,6 +1,7 @@ import 'dart:io'; import 'package:stem/stem.dart'; +import 'package:stem_redis/stem_redis.dart'; import 'package:stem_worker_control_lab/shared.dart'; Future main() async { @@ -19,14 +20,11 @@ Future main() async { '[producer] broker=$brokerUrl backend=$backendUrl long=$longCount quick=$quickCount', ); - final broker = await connectBroker(brokerUrl); - final backend = await connectBackend(backendUrl); - final tasks = buildTasks(); - - final stem = Stem( - broker: broker, - tasks: tasks, - backend: backend, + final client = await StemClient.fromUrl( + brokerUrl, + adapters: const [StemRedisAdapter()], + overrides: StemStoreOverrides(backend: backendUrl), + tasks: buildTasks(), ); final ids = []; @@ -34,7 +32,7 @@ Future main() async { for (var i = 0; i < longCount; i += 1) { final label = 'long-${i + 1}'; - final id = await stem.enqueue( + final id = await client.enqueue( 'control.long', options: taskOptions, args: {'label': label, 'steps': steps}, @@ -45,7 +43,7 @@ Future main() async { for (var i = 0; i < quickCount; i += 1) { final label = 'quick-${i + 1}'; - final id = await stem.enqueue( + final id = await client.enqueue( 'control.quick', options: taskOptions, args: {'label': label}, @@ -60,6 +58,5 @@ Future main() async { 
stdout.writeln('[producer] wrote ${ids.length} task ids to ${file.path}'); } - await broker.close(); - await backend.close(); + await client.close(); } diff --git a/packages/stem/example/workflows/basic_in_memory.dart b/packages/stem/example/workflows/basic_in_memory.dart index d81bf08b..08477098 100644 --- a/packages/stem/example/workflows/basic_in_memory.dart +++ b/packages/stem/example/workflows/basic_in_memory.dart @@ -4,19 +4,19 @@ import 'package:stem/stem.dart'; Future main() async { + final basicHello = Flow( + name: 'basic.hello', + build: (flow) { + flow.step('greet', (ctx) async => 'Hello Stem'); + }, + ); + final app = await StemWorkflowApp.inMemory( - flows: [ - Flow( - name: 'basic.hello', - build: (flow) { - flow.step('greet', (ctx) async => 'Hello Stem'); - }, - ), - ], + flows: [basicHello], ); - final runId = await app.startWorkflow('basic.hello'); - final result = await app.waitForCompletion(runId); + final runId = await basicHello.start(app); + final result = await basicHello.waitFor(app, runId); print('Workflow $runId finished with result: ${result?.value}'); await app.close(); diff --git a/packages/stem/example/workflows/cancellation_policy.dart b/packages/stem/example/workflows/cancellation_policy.dart index fe2a1267..10476274 100644 --- a/packages/stem/example/workflows/cancellation_policy.dart +++ b/packages/stem/example/workflows/cancellation_policy.dart @@ -9,30 +9,28 @@ import 'package:stem/stem.dart'; /// seconds, the runtime automatically cancels the run once the policy is /// exceeded. Operators can introspect the reason via `StemWorkflowApp`. Future main() async { + final reportsGenerate = Flow( + name: 'reports.generate', + build: (flow) { + flow.step('poll-status', (ctx) async { + if (!ctx.sleepUntilResumed(const Duration(seconds: 5))) { + print('[workflow] polling external system…'); + // Simulate a slow external service; the cancellation policy will + // cap this suspension to 2 seconds. 
+ return null; + } + print('[workflow] resumed after sleep'); + return 'finished'; + }); + }, + ); + final app = await StemWorkflowApp.inMemory( - flows: [ - Flow( - name: 'reports.generate', - build: (flow) { - flow.step('poll-status', (ctx) async { - final resume = ctx.takeResumeValue(); - if (resume != true) { - print('[workflow] polling external system…'); - // Simulate a slow external service; the cancellation policy will - // cap this suspension to 2 seconds. - ctx.sleep(const Duration(seconds: 5)); - return null; - } - print('[workflow] resumed with payload: $resume'); - return 'finished'; - }); - }, - ), - ], + flows: [reportsGenerate], ); - final runId = await app.startWorkflow( - 'reports.generate', + final runId = await reportsGenerate.start( + app, cancellationPolicy: const WorkflowCancellationPolicy( maxRunDuration: Duration(minutes: 10), maxSuspendDuration: Duration(seconds: 2), diff --git a/packages/stem/example/workflows/custom_factories.dart b/packages/stem/example/workflows/custom_factories.dart index ee7a47b8..33e9238a 100644 --- a/packages/stem/example/workflows/custom_factories.dart +++ b/packages/stem/example/workflows/custom_factories.dart @@ -5,6 +5,12 @@ import 'package:stem/stem.dart'; import 'package:stem_redis/stem_redis.dart'; Future main() async { + final redisWorkflow = Flow( + name: 'redis.workflow', + build: (flow) { + flow.step('greet', (ctx) async => 'Redis-backed workflow'); + }, + ); final app = await StemWorkflowApp.fromUrl( 'redis://localhost:6379', adapters: const [StemRedisAdapter()], @@ -12,19 +18,12 @@ Future main() async { backend: 'redis://localhost:6379/1', workflow: 'redis://localhost:6379/2', ), - flows: [ - Flow( - name: 'redis.workflow', - build: (flow) { - flow.step('greet', (ctx) async => 'Redis-backed workflow'); - }, - ), - ], + flows: [redisWorkflow], ); try { - final runId = await app.startWorkflow('redis.workflow'); - final result = await app.waitForCompletion(runId); + final runId = await 
redisWorkflow.start(app);
+    final result = await redisWorkflow.waitFor(app, runId);
     print('Workflow $runId finished with result: ${result?.value}');
   } finally {
     await app.close();
diff --git a/packages/stem/example/workflows/multiple_workers.dart b/packages/stem/example/workflows/multiple_workers.dart
new file mode 100644
index 00000000..63a18d0c
--- /dev/null
+++ b/packages/stem/example/workflows/multiple_workers.dart
@@ -0,0 +1,112 @@
+// Demonstrates one workflow routing tasks to multiple dedicated worker queues.
+// Run with: dart run example/workflows/multiple_workers.dart
+
+import 'package:stem/stem.dart';
+
+const String _workflowQueue = 'workflow';
+const String _notificationsQueue = 'notifications';
+const String _analyticsQueue = 'analytics';
+
+final accountOnboardingFlow = Flow<Map<String, String>>(
+  name: 'workflow.multi_workers',
+  build: (flow) {
+    flow.step('dispatch-to-workers', (ctx) async {
+      final notifyTaskId = await ctx.enqueue(
+        'notify.send',
+        args: const {'email': 'alex@example.com'},
+        enqueueOptions: const TaskEnqueueOptions(queue: _notificationsQueue),
+      );
+      final trackTaskId = await ctx.enqueue(
+        'analytics.track',
+        args: const {'userId': 'alex', 'event': 'account.created'},
+        enqueueOptions: const TaskEnqueueOptions(queue: _analyticsQueue),
+      );
+
+      return {
+        'notifyTaskId': notifyTaskId,
+        'trackTaskId': trackTaskId,
+      };
+    });
+  },
+);
+
+class NotifyTask extends TaskHandler {
+  @override
+  String get name => 'notify.send';
+
+  @override
+  TaskOptions get options => const TaskOptions(queue: _notificationsQueue);
+
+  @override
+  Future call(TaskContext context, Map args) async {
+    final email = args['email'] as String? ?? 'unknown';
+    print('[notifications worker] send notification -> $email');
+    return 'notified:$email';
+  }
+}
+
+class AnalyticsTask extends TaskHandler {
+  @override
+  String get name => 'analytics.track';
+
+  @override
+  TaskOptions get options => const TaskOptions(queue: _analyticsQueue);
+
+  @override
+  Future call(TaskContext context, Map args) async {
+    final userId = args['userId'] as String? ?? 'unknown';
+    final event = args['event'] as String? ?? 'unknown';
+    print('[analytics worker] track event "$event" for user "$userId"');
+    return 'tracked:$event:$userId';
+  }
+}
+
+Future<void> main() async {
+  final client = await StemClient.inMemory();
+  final workflowApp = await client.createWorkflowApp(
+    flows: [accountOnboardingFlow],
+    workerConfig: const StemWorkerConfig(queue: _workflowQueue),
+  );
+  await workflowApp.start();
+
+  final notificationsWorker = await client.createWorker(
+    workerConfig: StemWorkerConfig(
+      queue: 'notifications-worker',
+      consumerName: 'notifications-worker',
+      subscription: RoutingSubscription.singleQueue(_notificationsQueue),
+    ),
+    tasks: [NotifyTask()],
+  );
+  final analyticsWorker = await client.createWorker(
+    workerConfig: StemWorkerConfig(
+      queue: 'analytics-worker',
+      consumerName: 'analytics-worker',
+      subscription: RoutingSubscription.singleQueue(_analyticsQueue),
+    ),
+    tasks: [AnalyticsTask()],
+  );
+
+  Future.wait([notificationsWorker.start(), analyticsWorker.start()]);
+
+  final workflowResult = await accountOnboardingFlow.startAndWait(workflowApp);
+  final taskIds = workflowResult?.value ??
const {}; + final notifyResult = await workflowApp.waitForTask( + taskIds['notifyTaskId']!, + timeout: const Duration(seconds: 5), + ); + final trackResult = await workflowApp.waitForTask( + taskIds['trackTaskId']!, + timeout: const Duration(seconds: 5), + ); + + print('workflow ${workflowResult?.runId} complete'); + print('notifier: ${notifyResult?.value}'); + print('analytics: ${trackResult?.value}'); + + await Future.wait([ + notificationsWorker.shutdown(), + analyticsWorker.shutdown(), + workflowApp.close(), + client.close(), + ]); +} diff --git a/packages/stem/example/workflows/runtime_metadata_views.dart b/packages/stem/example/workflows/runtime_metadata_views.dart index f08ae774..f729da42 100644 --- a/packages/stem/example/workflows/runtime_metadata_views.dart +++ b/packages/stem/example/workflows/runtime_metadata_views.dart @@ -34,7 +34,7 @@ Future main() async { name: 'example.runtime.features', build: (flow) { flow.step('dispatch-task', (ctx) async { - await ctx.enqueuer!.enqueue( + await ctx.enqueue( 'example.noop', args: const {'payload': true}, meta: const {'origin': 'runtime_metadata_views'}, diff --git a/packages/stem/example/workflows/sleep_and_event.dart b/packages/stem/example/workflows/sleep_and_event.dart index a78ff94f..7fd9fae7 100644 --- a/packages/stem/example/workflows/sleep_and_event.dart +++ b/packages/stem/example/workflows/sleep_and_event.dart @@ -1,53 +1,50 @@ // Demonstrates sleep and external event resumption. 
 // Run with: dart run example/workflows/sleep_and_event.dart
+// ignore_for_file: avoid_print
 
 import 'dart:async';
 
 import 'package:stem/stem.dart';
 
+const demoEvent = WorkflowEventRef<Map<String, Object?>>(
+  topic: 'demo.event',
+);
+
 Future<void> main() async {
+  final sleepAndEvent = Flow(
+    name: 'durable.sleep.event',
+    build: (flow) {
+      flow
+        ..step('initial', (ctx) async {
+          await ctx.sleepFor(duration: const Duration(milliseconds: 200));
+          return 'awake';
+        })
+        ..step('await-event', (ctx) async {
+          final payload = await demoEvent.wait(ctx);
+          return payload['message'];
+        });
+    },
+  );
+
   final app = await StemWorkflowApp.inMemory(
-    flows: [
-      Flow(
-        name: 'durable.sleep.event',
-        build: (flow) {
-          flow.step('initial', (ctx) async {
-            final resumePayload = ctx.takeResumeValue();
-            if (resumePayload != true) {
-              ctx.sleep(const Duration(milliseconds: 200));
-              return null;
-            }
-            return 'awake';
-          });
-
-          flow.step('await-event', (ctx) async {
-            final payload = ctx.takeResumeValue<Map<String, Object?>>();
-            if (payload == null) {
-              ctx.awaitEvent('demo.event');
-              return null;
-            }
-            return payload['message'];
-          });
-        },
-      ),
-    ],
+    flows: [sleepAndEvent],
   );
 
-  final runId = await app.startWorkflow('durable.sleep.event');
+  final runId = await sleepAndEvent.start(app);
 
   // Wait until the workflow is suspended before emitting the event to avoid
   // losing the signal.
   while (true) {
     final state = await app.getRun(runId);
-    if (state?.waitTopic == 'demo.event') {
+    if (state?.waitTopic == demoEvent.topic) {
       break;
     }
     await Future.delayed(const Duration(milliseconds: 50));
   }
 
-  await app.runtime.emit('demo.event', {'message': 'event received'});
+  await demoEvent.emit(app, const {'message': 'event received'});
 
-  final result = await app.waitForCompletion(runId);
+  final result = await sleepAndEvent.waitFor(app, runId);
   print('Workflow $runId resumed and completed with: ${result?.value}');
 
   await app.close();
diff --git a/packages/stem/example/workflows/sqlite_store.dart b/packages/stem/example/workflows/sqlite_store.dart
index f3cda604..e549468d 100644
--- a/packages/stem/example/workflows/sqlite_store.dart
+++ b/packages/stem/example/workflows/sqlite_store.dart
@@ -8,22 +8,21 @@ import 'package:stem_sqlite/stem_sqlite.dart';
 
 Future main() async {
   final databaseFile = File('workflow.sqlite');
+  final sqliteExample = Flow(
+    name: 'sqlite.example',
+    build: (flow) {
+      flow.step('greet', (ctx) async => 'Persisted to SQLite');
+    },
+  );
   final app = await StemWorkflowApp.fromUrl(
     'sqlite://${databaseFile.path}',
     adapters: const [StemSqliteAdapter()],
-    flows: [
-      Flow(
-        name: 'sqlite.example',
-        build: (flow) {
-          flow.step('greet', (ctx) async => 'Persisted to SQLite');
-        },
-      ),
-    ],
+    flows: [sqliteExample],
   );
 
   try {
-    final runId = await app.startWorkflow('sqlite.example');
-    final result = await app.waitForCompletion(runId);
+    final runId = await sqliteExample.start(app);
+    final result = await sqliteExample.waitFor(app, runId);
     print('Workflow $runId finished with result: ${result?.value}');
   } finally {
     await app.close();
diff --git a/packages/stem/example/workflows/versioned_rewind.dart b/packages/stem/example/workflows/versioned_rewind.dart
index 2f5c3e40..174a7115 100644
--- a/packages/stem/example/workflows/versioned_rewind.dart
+++ b/packages/stem/example/workflows/versioned_rewind.dart
@@ -2,36 +2,36 @@ import 'package:stem/stem.dart';
 
 Future main() async {
   final iterations = [];
+  final versionedWorkflow = Flow(
+    name: 'demo.versioned',
+    build: (flow) {
+      flow.step('repeat', (ctx) async {
+        iterations.add(ctx.iteration);
+        return 'iteration-${ctx.iteration}';
+      }, autoVersion: true);
-  final app = await StemWorkflowApp.inMemory(
-    flows: [
-      Flow(
-        name: 'demo.versioned',
-        build: (flow) {
-          flow.step('repeat', (ctx) async {
-            iterations.add(ctx.iteration);
-            return 'iteration-${ctx.iteration}';
-          }, autoVersion: true);
+      flow.step('tail', (ctx) async => ctx.previousResult);
+    },
+  );
-          flow.step('tail', (ctx) async => ctx.previousResult);
-        },
-      ),
-    ],
+  final app = await StemWorkflowApp.inMemory(
+    flows: [versionedWorkflow],
   );
 
-  final runId = await app.startWorkflow('demo.versioned');
-  await app.runtime.executeRun(runId);
+  final runId = await versionedWorkflow.start(app);
+  await app.executeRun(runId);
 
   // Rewind and execute again to append a new iteration checkpoint.
-  await app.store.rewindToStep(runId, 'repeat');
-  await app.store.markRunning(runId);
-  await app.runtime.executeRun(runId);
+  await app.rewindToCheckpoint(runId, 'repeat');
+  await app.executeRun(runId);
 
-  final entries = await app.store.listSteps(runId);
-  for (final entry in entries) {
-    print('${entry.name}: ${entry.value}');
+  final checkpoints = await app.viewCheckpoints(runId);
+  for (final checkpoint in checkpoints) {
+    print('${checkpoint.checkpointName}: ${checkpoint.value}');
   }
 
   print('Iterations executed: $iterations');
+  final completed = await versionedWorkflow.waitFor(app, runId);
+  print('Final result: ${completed?.value}');
 
   await app.close();
 }
diff --git a/packages/stem/lib/src/bootstrap/factories.dart b/packages/stem/lib/src/bootstrap/factories.dart
index 6ad9e45d..259d969f 100644
--- a/packages/stem/lib/src/bootstrap/factories.dart
+++ b/packages/stem/lib/src/bootstrap/factories.dart
@@ -228,4 +228,51 @@ class StemWorkerConfig {
 
   /// Optional payload signer used to verify envelopes.
   final PayloadSigner? signer;
+
+  /// Returns a copy of this worker configuration with the provided overrides.
+  StemWorkerConfig copyWith({
+    String? queue,
+    String? consumerName,
+    int? concurrency,
+    int? prefetchMultiplier,
+    int? prefetch,
+    RateLimiter? rateLimiter,
+    List? middleware,
+    RevokeStore? revokeStore,
+    UniqueTaskCoordinator? uniqueTaskCoordinator,
+    RetryStrategy? retryStrategy,
+    RoutingSubscription? subscription,
+    Duration? heartbeatInterval,
+    Duration? workerHeartbeatInterval,
+    HeartbeatTransport? heartbeatTransport,
+    String? heartbeatNamespace,
+    WorkerAutoscaleConfig? autoscale,
+    WorkerLifecycleConfig? lifecycle,
+    ObservabilityConfig? observability,
+    PayloadSigner? signer,
+  }) {
+    return StemWorkerConfig(
+      queue: queue ?? this.queue,
+      consumerName: consumerName ?? this.consumerName,
+      concurrency: concurrency ?? this.concurrency,
+      prefetchMultiplier: prefetchMultiplier ?? this.prefetchMultiplier,
+      prefetch: prefetch ?? this.prefetch,
+      rateLimiter: rateLimiter ?? this.rateLimiter,
+      middleware: middleware ?? this.middleware,
+      revokeStore: revokeStore ?? this.revokeStore,
+      uniqueTaskCoordinator:
+          uniqueTaskCoordinator ?? this.uniqueTaskCoordinator,
+      retryStrategy: retryStrategy ?? this.retryStrategy,
+      subscription: subscription ?? this.subscription,
+      heartbeatInterval: heartbeatInterval ?? this.heartbeatInterval,
+      workerHeartbeatInterval:
+          workerHeartbeatInterval ?? this.workerHeartbeatInterval,
+      heartbeatTransport: heartbeatTransport ?? this.heartbeatTransport,
+      heartbeatNamespace: heartbeatNamespace ?? this.heartbeatNamespace,
+      autoscale: autoscale ?? this.autoscale,
+      lifecycle: lifecycle ?? this.lifecycle,
+      observability: observability ?? this.observability,
+      signer: signer ?? this.signer,
+    );
+  }
 }
diff --git a/packages/stem/lib/src/bootstrap/stem_app.dart b/packages/stem/lib/src/bootstrap/stem_app.dart
index ec76be59..cdec188c 100644
--- a/packages/stem/lib/src/bootstrap/stem_app.dart
+++ b/packages/stem/lib/src/bootstrap/stem_app.dart
@@ -1,12 +1,17 @@
+import 'dart:async';
+
 import 'package:stem/src/backend/encoding_result_backend.dart';
 import 'package:stem/src/bootstrap/factories.dart';
 import 'package:stem/src/bootstrap/stem_client.dart';
+import 'package:stem/src/bootstrap/stem_module.dart';
 import 'package:stem/src/bootstrap/stem_stack.dart';
 import 'package:stem/src/canvas/canvas.dart';
 import 'package:stem/src/control/revoke_store.dart';
 import 'package:stem/src/core/contracts.dart';
+import 'package:stem/src/core/payload_codec.dart';
 import 'package:stem/src/core/stem.dart';
 import 'package:stem/src/core/task_payload_encoder.dart';
+import 'package:stem/src/core/task_result.dart';
 import 'package:stem/src/core/unique_task_coordinator.dart';
 import 'package:stem/src/routing/routing_config.dart';
 import 'package:stem/src/routing/routing_registry.dart';
@@ -15,26 +20,35 @@ import 'package:stem/src/worker/worker.dart';
 import 'package:stem_memory/stem_memory.dart' show InMemoryRevokeStore;
 
 /// Convenience bootstrap for setting up a Stem runtime with sensible defaults.
-class StemApp {
+abstract interface class StemTaskApp implements TaskResultCaller {}
+
+/// Convenience bootstrap for setting up a Stem runtime with sensible defaults.
+class StemApp implements StemTaskApp {
   StemApp._({
+    required this.module,
     required this.registry,
     required this.broker,
     required this.backend,
     required this.stem,
     required this.worker,
+    required this.allowWorkerAutoStart,
     required List<Future<void> Function()> disposers,
   }) : _disposers = disposers {
-    canvas = Canvas(
+    canvas = _ManagedCanvas(
       broker: broker,
       backend: backend,
       registry: registry,
       encoderRegistry: stem.payloadEncoders,
+      onBeforeDispatch: _maybeAutoStart,
     );
   }
 
   /// Task registry containing all registered handlers.
   final TaskRegistry registry;
 
+  /// Optional default bundle registered into this app.
+  final StemModule? module;
+
   /// Active broker instance used by the helper.
   final Broker broker;
 
@@ -47,16 +61,138 @@ class StemApp {
   /// Worker managed by the helper.
   final Worker worker;
 
+  /// Whether shortcut operations may lazily start the managed worker.
+  final bool allowWorkerAutoStart;
+
   /// Canvas facade used for chains, groups, and chords.
   late final Canvas canvas;
 
   final List<Future<void> Function()> _disposers;
   bool _started = false;
+  Future? _startFuture;
+
+  /// Whether the managed worker has been started.
+  bool get isStarted => _started;
+
+  Future _maybeAutoStart() {
+    if (_started || !allowWorkerAutoStart) {
+      return Future.value();
+    }
+    return start();
+  }
 
   /// Registers an additional task handler with the underlying registry.
   void register(TaskHandler handler) => registry.register(handler);
 
+  /// Registers [handler] with the underlying registry.
+  void registerTask(TaskHandler handler) => register(handler);
+
+  /// Registers [handlers] with the underlying registry.
+  void registerTasks(Iterable<TaskHandler> handlers) {
+    handlers.forEach(register);
+  }
+
+  /// Registers all task handlers from [module] into this app.
+  void registerModule(StemModule module) {
+    registerTasks(module.tasks);
+  }
+
+  /// Registers all task handlers from [modules] into this app.
+  void registerModules(Iterable modules) {
+    final merged = StemModule.combine(modules: modules);
+    if (merged == null) {
+      return;
+    }
+    registerModule(merged);
+  }
+
+  @override
+  Future enqueue(
+    String name, {
+    Map args = const {},
+    Map headers = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    Map meta = const {},
+    TaskEnqueueOptions? enqueueOptions,
+  }) async {
+    await _maybeAutoStart();
+    return stem.enqueue(
+      name,
+      args: args,
+      headers: headers,
+      options: options,
+      notBefore: notBefore,
+      meta: meta,
+      enqueueOptions: enqueueOptions,
+    );
+  }
+
+  @override
+  Future enqueueValue<T>(
+    String name,
+    T value, {
+    PayloadCodec? codec,
+    Map headers = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    Map meta = const {},
+    TaskEnqueueOptions? enqueueOptions,
+  }) async {
+    await _maybeAutoStart();
+    return stem.enqueueValue(
+      name,
+      value,
+      codec: codec,
+      headers: headers,
+      options: options,
+      notBefore: notBefore,
+      meta: meta,
+      enqueueOptions: enqueueOptions,
+    );
+  }
+
+  @override
+  Future enqueueCall(
+    TaskCall call, {
+    TaskEnqueueOptions? enqueueOptions,
+  }) async {
+    await _maybeAutoStart();
+    return stem.enqueueCall(call, enqueueOptions: enqueueOptions);
+  }
+
+  @override
+  Future getTaskStatus(String taskId) async {
+    await _maybeAutoStart();
+    return stem.getTaskStatus(taskId);
+  }
+
+  @override
+  Future getGroupStatus(String groupId) async {
+    await _maybeAutoStart();
+    return stem.getGroupStatus(groupId);
+  }
+
+  @override
+  Future<TaskResult<TResult>?> waitForTask<TResult>(
+    String taskId, {
+    Duration? timeout,
+    TResult Function(Object? payload)? decode,
+    TResult Function(Map payload)? decodeJson,
+    TResult Function(Map payload, int version)?
+        decodeVersionedJson,
+  }) async {
+    await _maybeAutoStart();
+    return stem.waitForTask(
+      taskId,
+      timeout: timeout,
+      decode: decode,
+      decodeJson: decodeJson,
+      decodeVersionedJson: decodeVersionedJson,
+    );
+  }
+
   void _insertAutoDisposers(
     List<Future<void> Function()> autoDisposers,
   ) {
@@ -70,8 +206,24 @@ class StemApp {
   /// Starts the managed worker if it is not already running.
   Future start() async {
     if (_started) return;
-    _started = true;
-    await worker.start();
+    final existing = _startFuture;
+    if (existing != null) {
+      await existing;
+      return;
+    }
+
+    final completer = Completer();
+    _startFuture = completer.future;
+    try {
+      await worker.start();
+      _started = true;
+      completer.complete();
+    } catch (error, stackTrace) {
+      _startFuture = null;
+      _started = false;
+      completer.completeError(error, stackTrace);
+      rethrow;
+    }
   }
 
   /// Shuts down the worker and disposes any managed resources.
@@ -80,6 +232,7 @@ class StemApp {
       await disposer();
     }
     _started = false;
+    _startFuture = null;
   }
 
   /// Alias for [shutdown].
@@ -87,6 +240,8 @@ class StemApp {
 
   /// Creates a new Stem application with the provided configuration.
   static Future create({
+    StemModule? module,
+    Iterable modules = const [],
     Iterable<TaskHandler> tasks = const [],
     TaskRegistry? registry,
     StemBrokerFactory? broker,
@@ -102,9 +257,17 @@ class StemApp {
     TaskPayloadEncoder resultEncoder = const JsonTaskPayloadEncoder(),
     TaskPayloadEncoder argsEncoder = const JsonTaskPayloadEncoder(),
     Iterable additionalEncoders = const [],
+    bool allowWorkerAutoStart = true,
   }) async {
+    final effectiveModule = StemModule.combine(
+      module: module,
+      modules: modules,
+    );
+    final bundledTasks =
+        effectiveModule?.tasks ?? const <TaskHandler>[];
+    final allTasks = [...bundledTasks, ...tasks];
     final taskRegistry = registry ?? InMemoryTaskRegistry();
-    tasks.forEach(taskRegistry.register);
+    registerModuleTaskHandlers(taskRegistry, allTasks);
     final brokerFactory = broker ?? StemBrokerFactory.inMemory();
     final backendFactory = backend ?? StemBackendFactory.inMemory();
@@ -143,6 +306,18 @@ class StemApp {
     final workerUniqueTaskCoordinator =
         workerConfig.uniqueTaskCoordinator ?? uniqueTaskCoordinator;
     final workerSigner = workerConfig.signer ?? signer;
+    final inferredSubscription =
+        workerConfig.subscription ??
+        effectiveModule?.inferTaskWorkerSubscription(
+          defaultQueue: workerConfig.queue,
+          additionalTasks: tasks,
+        ) ??
+        (() {
+          final tempModule = StemModule(tasks: tasks);
+          return tempModule.inferTaskWorkerSubscription(
+            defaultQueue: workerConfig.queue,
+          );
+        })();
 
     final worker = Worker(
       broker: brokerInstance,
@@ -154,7 +329,7 @@ class StemApp {
       uniqueTaskCoordinator: workerUniqueTaskCoordinator,
       retryStrategy: workerRetryStrategy,
       queue: workerConfig.queue,
-      subscription: workerConfig.subscription,
+      subscription: inferredSubscription,
       consumerName: workerConfig.consumerName,
       concurrency: workerConfig.concurrency,
       prefetchMultiplier: workerConfig.prefetchMultiplier,
@@ -183,25 +358,32 @@ class StemApp {
     ];
 
     return StemApp._(
+      module: effectiveModule,
       registry: taskRegistry,
       broker: brokerInstance,
      backend: encodedBackend,
       stem: stem,
       worker: worker,
+      allowWorkerAutoStart: allowWorkerAutoStart,
       disposers: disposers,
     );
   }
 
   /// Creates an in-memory Stem application (broker + result backend).
   static Future inMemory({
+    StemModule? module,
+    Iterable modules = const [],
     Iterable<TaskHandler> tasks = const [],
     StemWorkerConfig workerConfig = const StemWorkerConfig(),
     TaskPayloadEncoderRegistry? encoderRegistry,
     TaskPayloadEncoder resultEncoder = const JsonTaskPayloadEncoder(),
     TaskPayloadEncoder argsEncoder = const JsonTaskPayloadEncoder(),
     Iterable additionalEncoders = const [],
+    bool allowWorkerAutoStart = true,
   }) {
     return StemApp.create(
+      module: module,
+      modules: modules,
      tasks: tasks,
       broker: StemBrokerFactory.inMemory(),
       backend: StemBackendFactory.inMemory(),
@@ -210,6 +392,7 @@ class StemApp {
       resultEncoder: resultEncoder,
       argsEncoder: argsEncoder,
       additionalEncoders: additionalEncoders,
+      allowWorkerAutoStart: allowWorkerAutoStart,
     );
   }
 
@@ -219,6 +402,8 @@ class StemApp {
   /// can optionally auto-wire revoke and unique-task coordination stores.
   static Future fromUrl(
     String url, {
+    StemModule? module,
+    Iterable modules = const [],
     Iterable<TaskHandler> tasks = const [],
     TaskRegistry? registry,
     Iterable adapters = const [],
@@ -239,6 +424,7 @@ class StemApp {
     TaskPayloadEncoder argsEncoder = const JsonTaskPayloadEncoder(),
     Iterable additionalEncoders = const [],
     StemStack? stack,
+    bool allowWorkerAutoStart = true,
   }) async {
     final needsUniqueLockStore =
         uniqueTasks &&
@@ -292,6 +478,8 @@ class StemApp {
     try {
       final app = await create(
+        module: module,
+        modules: modules,
         tasks: tasks,
         registry: registry,
         broker: resolvedStack.broker,
@@ -307,6 +495,7 @@ class StemApp {
         resultEncoder: resultEncoder,
         argsEncoder: argsEncoder,
         additionalEncoders: additionalEncoders,
+        allowWorkerAutoStart: allowWorkerAutoStart,
       );
 
       // Dispose auto-provisioned lock/revoke stores after worker shutdown and
@@ -331,14 +520,35 @@ class StemApp {
   /// Creates a Stem app using a shared [StemClient].
   static Future fromClient(
     StemClient client, {
+    StemModule? module,
+    Iterable modules = const [],
     Iterable<TaskHandler> tasks = const [],
     StemWorkerConfig workerConfig = const StemWorkerConfig(),
+    bool allowWorkerAutoStart = true,
   }) async {
-    tasks.forEach(client.taskRegistry.register);
+    final effectiveModule =
+        StemModule.combine(module: module, modules: modules) ?? client.module;
+    final bundledTasks =
+        effectiveModule?.tasks ?? const <TaskHandler>[];
+    final allTasks = [...bundledTasks, ...tasks];
+    final taskRegistry = client.taskRegistry;
+    registerModuleTaskHandlers(taskRegistry, allTasks);
+    final inferredSubscription =
+        workerConfig.subscription ??
+        effectiveModule?.inferTaskWorkerSubscription(
+          defaultQueue: workerConfig.queue,
+          additionalTasks: tasks,
+        ) ??
+        (() {
+          final tempModule = StemModule(tasks: tasks);
+          return tempModule.inferTaskWorkerSubscription(
+            defaultQueue: workerConfig.queue,
+          );
+        })();
 
     final worker = Worker(
       broker: client.broker,
-      registry: client.taskRegistry,
+      registry: taskRegistry,
       backend: client.backend,
       enqueuer: client.stem,
       rateLimiter: workerConfig.rateLimiter,
@@ -348,7 +558,7 @@ class StemApp {
           workerConfig.uniqueTaskCoordinator ?? client.uniqueTaskCoordinator,
       retryStrategy: workerConfig.retryStrategy ?? client.retryStrategy,
       queue: workerConfig.queue,
-      subscription: workerConfig.subscription,
+      subscription: inferredSubscription,
       consumerName: workerConfig.consumerName,
       concurrency: workerConfig.concurrency,
       prefetchMultiplier: workerConfig.prefetchMultiplier,
@@ -365,11 +575,13 @@ class StemApp {
     );
 
     return StemApp._(
-      registry: client.taskRegistry,
+      module: effectiveModule,
+      registry: taskRegistry,
       broker: client.broker,
       backend: client.backend,
       stem: client.stem,
       worker: worker,
+      allowWorkerAutoStart: allowWorkerAutoStart,
       disposers: [
         () async {
           await worker.shutdown();
@@ -378,3 +590,63 @@ class StemApp {
     );
   }
 }
+
+class _ManagedCanvas extends Canvas {
+  _ManagedCanvas({
+    required super.broker,
+    required super.backend,
+    required super.registry,
+    required super.encoderRegistry,
+    required Future Function() onBeforeDispatch,
+  }) : _onBeforeDispatch = onBeforeDispatch;
+
+  final Future Function() _onBeforeDispatch;
+
+  @override
+  Future send(TaskSignature signature) async {
+    await _onBeforeDispatch();
+    return super.send(signature);
+  }
+
+  @override
+  Future<GroupDispatch> group(
+    List<TaskSignature> signatures, {
+    String? groupId,
+  }) async {
+    await _onBeforeDispatch();
+    return super.group(signatures, groupId: groupId);
+  }
+
+  @override
+  Future submitBatch(
+    List<TaskSignature> signatures, {
+    String? batchId,
+    Duration? ttl,
+  }) async {
+    await _onBeforeDispatch();
+    return super.submitBatch(signatures, batchId: batchId, ttl: ttl);
+  }
+
+  @override
+  Future chain<T>(
+    List<TaskSignature> signatures, {
+    void Function(int index, TaskStatus status, T? value)? onStepCompleted,
+  }) async {
+    await _onBeforeDispatch();
+    return super.chain(signatures, onStepCompleted: onStepCompleted);
+  }
+
+  @override
+  Future chord({
+    required List<TaskSignature> body,
+    required TaskSignature callback,
+    Duration pollInterval = const Duration(milliseconds: 100),
+  }) async {
+    await _onBeforeDispatch();
+    return super.chord(
+      body: body,
+      callback: callback,
+      pollInterval: pollInterval,
+    );
+  }
+}
diff --git a/packages/stem/lib/src/bootstrap/stem_client.dart b/packages/stem/lib/src/bootstrap/stem_client.dart
index 6a7fd57d..c94e0e83 100644
--- a/packages/stem/lib/src/bootstrap/stem_client.dart
+++ b/packages/stem/lib/src/bootstrap/stem_client.dart
@@ -3,9 +3,12 @@ import 'package:stem/src/bootstrap/stem_app.dart';
 import 'package:stem/src/bootstrap/stem_module.dart';
 import 'package:stem/src/bootstrap/stem_stack.dart';
 import 'package:stem/src/bootstrap/workflow_app.dart';
+import 'package:stem/src/canvas/canvas.dart';
 import 'package:stem/src/core/contracts.dart';
+import 'package:stem/src/core/payload_codec.dart';
 import 'package:stem/src/core/stem.dart';
 import 'package:stem/src/core/task_payload_encoder.dart';
+import 'package:stem/src/core/task_result.dart';
 import 'package:stem/src/core/unique_task_coordinator.dart';
 import 'package:stem/src/routing/routing_config.dart';
 import 'package:stem/src/routing/routing_registry.dart';
@@ -18,9 +21,11 @@ import 'package:stem/src/workflow/runtime/workflow_introspection.dart';
 import 'package:stem/src/workflow/runtime/workflow_registry.dart';
 
 /// Shared entrypoint that owns broker/backend configuration for Stem runtimes.
-abstract class StemClient {
+abstract class StemClient implements TaskResultCaller {
   /// Creates a client using the provided factories and defaults.
   static Future create({
+    StemModule? module,
+    Iterable modules = const [],
     Iterable<TaskHandler> tasks = const [],
     TaskRegistry? taskRegistry,
     WorkflowRegistry? workflowRegistry,
@@ -39,6 +44,8 @@ abstract class StemClient {
   }) async {
     return _DefaultStemClient.create(
       tasks: tasks,
+      module: module,
+      modules: modules,
       taskRegistry: taskRegistry,
       workflowRegistry: workflowRegistry,
       broker: broker,
@@ -58,6 +65,8 @@ abstract class StemClient {
 
   /// Creates an in-memory client using in-memory broker/backend.
   static Future inMemory({
+    StemModule? module,
+    Iterable modules = const [],
     Iterable<TaskHandler> tasks = const [],
     StemWorkerConfig defaultWorkerConfig = const StemWorkerConfig(),
     TaskPayloadEncoderRegistry? encoderRegistry,
@@ -66,6 +75,8 @@ abstract class StemClient {
     Iterable additionalEncoders = const [],
   }) {
     return create(
+      module: module,
+      modules: modules,
       tasks: tasks,
       broker: StemBrokerFactory.inMemory(),
       backend: StemBackendFactory.inMemory(),
@@ -83,6 +94,8 @@ abstract class StemClient {
   /// can avoid manual factory wiring for common Redis/Postgres/SQLite setups.
   static Future fromUrl(
     String url, {
+    StemModule? module,
+    Iterable modules = const [],
     Iterable<TaskHandler> tasks = const [],
     TaskRegistry? taskRegistry,
     WorkflowRegistry? workflowRegistry,
@@ -104,7 +117,51 @@ abstract class StemClient {
       adapters: adapters,
       overrides: overrides,
     );
+    return fromStack(
+      stack,
+      module: module,
+      modules: modules,
+      tasks: tasks,
+      taskRegistry: taskRegistry,
+      workflowRegistry: workflowRegistry,
+      routing: routing,
+      retryStrategy: retryStrategy,
+      uniqueTaskCoordinator: uniqueTaskCoordinator,
+      middleware: middleware,
+      signer: signer,
+      defaultWorkerConfig: defaultWorkerConfig,
+      encoderRegistry: encoderRegistry,
+      resultEncoder: resultEncoder,
+      argsEncoder: argsEncoder,
+      additionalEncoders: additionalEncoders,
+    );
+  }
+
+  /// Creates a client from a pre-resolved [StemStack].
+  ///
+  /// Use this when adapter resolution is managed elsewhere and the client
+  /// should reuse that broker/backend stack directly.
+  static Future fromStack(
+    StemStack stack, {
+    StemModule? module,
+    Iterable modules = const [],
+    Iterable<TaskHandler> tasks = const [],
+    TaskRegistry? taskRegistry,
+    WorkflowRegistry? workflowRegistry,
+    RoutingRegistry? routing,
+    RetryStrategy? retryStrategy,
+    UniqueTaskCoordinator? uniqueTaskCoordinator,
+    Iterable middleware = const [],
+    PayloadSigner? signer,
+    StemWorkerConfig defaultWorkerConfig = const StemWorkerConfig(),
+    TaskPayloadEncoderRegistry? encoderRegistry,
+    TaskPayloadEncoder resultEncoder = const JsonTaskPayloadEncoder(),
+    TaskPayloadEncoder argsEncoder = const JsonTaskPayloadEncoder(),
+    Iterable additionalEncoders = const [],
+  }) {
     return create(
+      module: module,
+      modules: modules,
       tasks: tasks,
       taskRegistry: taskRegistry,
       workflowRegistry: workflowRegistry,
@@ -135,9 +192,93 @@ abstract class StemClient {
   /// Shared workflow registry for workflow definitions.
   WorkflowRegistry get workflowRegistry;
 
+  /// Optional default bundle registered into this client.
+  StemModule? get module;
+
   /// Enqueue facade for producers.
   Stem get stem;
 
+  @override
+  Future enqueue(
+    String name, {
+    Map args = const {},
+    Map headers = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    Map meta = const {},
+    TaskEnqueueOptions? enqueueOptions,
+  }) {
+    return stem.enqueue(
+      name,
+      args: args,
+      headers: headers,
+      options: options,
+      notBefore: notBefore,
+      meta: meta,
+      enqueueOptions: enqueueOptions,
+    );
+  }
+
+  @override
+  Future enqueueValue<T>(
+    String name,
+    T value, {
+    PayloadCodec? codec,
+    Map headers = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    Map meta = const {},
+    TaskEnqueueOptions? enqueueOptions,
+  }) {
+    return stem.enqueueValue(
+      name,
+      value,
+      codec: codec,
+      headers: headers,
+      options: options,
+      notBefore: notBefore,
+      meta: meta,
+      enqueueOptions: enqueueOptions,
+    );
+  }
+
+  @override
+  Future getTaskStatus(String taskId) {
+    return stem.getTaskStatus(taskId);
+  }
+
+  @override
+  Future getGroupStatus(String groupId) {
+    return stem.getGroupStatus(groupId);
+  }
+
+  @override
+  Future enqueueCall(
+    TaskCall call, {
+    TaskEnqueueOptions? enqueueOptions,
+  }) {
+    return stem.enqueueCall(call, enqueueOptions: enqueueOptions);
+  }
+
+  /// Waits for a task result by task id using the client's shared backend.
+  @override
+  Future<TaskResult<TResult>?> waitForTask<TResult>(
+    String taskId, {
+    Duration? timeout,
+    TResult Function(Object? payload)? decode,
+    TResult Function(Map payload)? decodeJson,
+    TResult Function(Map payload, int version)?
+        decodeVersionedJson,
+  }) {
+    return stem.waitForTask(
+      taskId,
+      timeout: timeout,
+      decode: decode,
+      decodeJson: decodeJson,
+      decodeVersionedJson: decodeVersionedJson,
+    );
+  }
+
   /// Payload encoder registry used for task args/results.
   TaskPayloadEncoderRegistry get encoderRegistry;
 
@@ -165,7 +306,21 @@ abstract class StemClient {
     Iterable<TaskHandler> tasks = const [],
   }) async {
     final config = workerConfig ?? defaultWorkerConfig;
-    tasks.forEach(taskRegistry.register);
+    final bundledTasks = module?.tasks ?? const <TaskHandler>[];
+    final allTasks = [...bundledTasks, ...tasks];
+    registerModuleTaskHandlers(taskRegistry, allTasks);
+    final inferredSubscription =
+        config.subscription ??
+        module?.inferTaskWorkerSubscription(
+          defaultQueue: config.queue,
+          additionalTasks: tasks,
+        ) ??
+        (() {
+          final tempModule = StemModule(tasks: tasks);
+          return tempModule.inferTaskWorkerSubscription(
+            defaultQueue: config.queue,
+          );
+        })();
     return Worker(
       broker: broker,
       registry: taskRegistry,
@@ -178,7 +333,7 @@ abstract class StemClient {
           config.uniqueTaskCoordinator ?? uniqueTaskCoordinator,
       retryStrategy: config.retryStrategy ?? retryStrategy,
       queue: config.queue,
-      subscription: config.subscription,
+      subscription: inferredSubscription,
       consumerName: config.consumerName,
       concurrency: config.concurrency,
       prefetchMultiplier: config.prefetchMultiplier,
@@ -195,43 +350,74 @@ abstract class StemClient {
     );
   }
 
+  /// Creates a canvas using the shared broker/backend/registry.
+  Canvas createCanvas({
+    Iterable<TaskHandler> tasks = const [],
+  }) {
+    final bundledTasks = module?.tasks ?? const <TaskHandler>[];
+    final allTasks = [...bundledTasks, ...tasks];
+    registerModuleTaskHandlers(taskRegistry, allTasks);
+    return Canvas(
+      broker: broker,
+      backend: backend,
+      registry: taskRegistry,
+      encoderRegistry: encoderRegistry,
+    );
+  }
+
   /// Creates a workflow app using the shared client configuration.
   Future createWorkflowApp({
     StemModule? module,
+    Iterable modules = const [],
     Iterable workflows = const [],
     Iterable flows = const [],
     Iterable scripts = const [],
     WorkflowStoreFactory? storeFactory,
     WorkflowEventBusFactory? eventBusFactory,
     StemWorkerConfig workerConfig = const StemWorkerConfig(queue: 'workflow'),
+    String? continuationQueue,
+    String? executionQueue,
     Duration pollInterval = const Duration(milliseconds: 500),
     Duration leaseExtension = const Duration(seconds: 30),
     WorkflowIntrospectionSink? introspectionSink,
+    bool allowWorkerAutoStart = true,
   }) {
+    final effectiveModule =
+        StemModule.combine(module: module, modules: modules) ?? this.module;
     return StemWorkflowApp.fromClient(
       client: this,
-      module: module,
+      module: effectiveModule,
       workflows: workflows,
       flows: flows,
       scripts: scripts,
       storeFactory: storeFactory,
       eventBusFactory: eventBusFactory,
       workerConfig: workerConfig,
+      continuationQueue: continuationQueue,
+      executionQueue: executionQueue,
       pollInterval: pollInterval,
       leaseExtension: leaseExtension,
       introspectionSink: introspectionSink,
+      allowWorkerAutoStart: allowWorkerAutoStart,
     );
   }
 
   /// Creates a StemApp wrapper using the shared client configuration.
   Future createApp({
+    StemModule? module,
+    Iterable modules = const [],
     Iterable<TaskHandler> tasks = const [],
     StemWorkerConfig? workerConfig,
+    bool allowWorkerAutoStart = true,
   }) {
+    final effectiveModule =
+        StemModule.combine(module: module, modules: modules) ?? this.module;
     return StemApp.fromClient(
       this,
+      module: effectiveModule,
       tasks: tasks,
       workerConfig: workerConfig ?? defaultWorkerConfig,
+      allowWorkerAutoStart: allowWorkerAutoStart,
     );
   }
 
@@ -245,6 +431,7 @@ class _DefaultStemClient extends StemClient {
     required this.backend,
     required this.taskRegistry,
     required this.workflowRegistry,
+    required this.module,
     required this.stem,
     required this.encoderRegistry,
     required this.routing,
@@ -258,6 +445,8 @@ class _DefaultStemClient extends StemClient {
   }) : middleware = List.unmodifiable(middleware);
 
   static Future create({
+    StemModule? module,
+    Iterable modules = const [],
     Iterable<TaskHandler> tasks = const [],
     TaskRegistry? taskRegistry,
     WorkflowRegistry? workflowRegistry,
@@ -274,9 +463,17 @@ class _DefaultStemClient extends StemClient {
     TaskPayloadEncoder argsEncoder = const JsonTaskPayloadEncoder(),
     Iterable additionalEncoders = const [],
   }) async {
+    final effectiveModule = StemModule.combine(
+      module: module,
+      modules: modules,
+    );
+    final bundledTasks =
+        effectiveModule?.tasks ?? const <TaskHandler>[];
+    final allTasks = [...bundledTasks, ...tasks];
     final registry = taskRegistry ?? InMemoryTaskRegistry();
-    tasks.forEach(registry.register);
+    registerModuleTaskHandlers(registry, allTasks);
     final workflows = workflowRegistry ?? InMemoryWorkflowRegistry();
+    effectiveModule?.registerInto(workflows: workflows);
     final brokerFactory = broker ?? StemBrokerFactory.inMemory();
     final backendFactory = backend ?? StemBackendFactory.inMemory();
@@ -305,6 +502,7 @@ class _DefaultStemClient extends StemClient {
       backend: backendInstance,
       taskRegistry: registry,
       workflowRegistry: workflows,
+      module: effectiveModule,
       stem: stem,
       encoderRegistry: stem.payloadEncoders,
       routing: stem.routing,
@@ -330,6 +528,9 @@ class _DefaultStemClient extends StemClient {
   @override
   final WorkflowRegistry workflowRegistry;
 
+  @override
+  final StemModule? module;
+
   @override
   final Stem stem;
 
@@ -363,3 +564,44 @@ class _DefaultStemClient extends StemClient {
     await disposeBackend();
   }
 }
+
+/// Convenience helpers for bootstrapping clients from a resolved [StemStack].
+extension StemStackClientBootstrap on StemStack {
+  /// Creates a client using this resolved broker/backend stack.
+  Future createClient({
+    StemModule? module,
+    Iterable modules = const [],
+    Iterable<TaskHandler> tasks = const [],
+    TaskRegistry? taskRegistry,
+    WorkflowRegistry? workflowRegistry,
+    RoutingRegistry? routing,
+    RetryStrategy? retryStrategy,
+    UniqueTaskCoordinator? uniqueTaskCoordinator,
+    Iterable middleware = const [],
+    PayloadSigner? signer,
+    StemWorkerConfig defaultWorkerConfig = const StemWorkerConfig(),
+    TaskPayloadEncoderRegistry? encoderRegistry,
+    TaskPayloadEncoder resultEncoder = const JsonTaskPayloadEncoder(),
+    TaskPayloadEncoder argsEncoder = const JsonTaskPayloadEncoder(),
+    Iterable additionalEncoders = const [],
+  }) {
+    return StemClient.fromStack(
+      this,
+      module: module,
+      modules: modules,
+      tasks: tasks,
+      taskRegistry: taskRegistry,
+      workflowRegistry: workflowRegistry,
+      routing: routing,
+      retryStrategy: retryStrategy,
+      uniqueTaskCoordinator: uniqueTaskCoordinator,
+      middleware: middleware,
+      signer: signer,
+      defaultWorkerConfig: defaultWorkerConfig,
+      encoderRegistry: encoderRegistry,
+      resultEncoder: resultEncoder,
+      argsEncoder: argsEncoder,
+      additionalEncoders: additionalEncoders,
+    );
+  }
+}
diff --git a/packages/stem/lib/src/bootstrap/stem_module.dart b/packages/stem/lib/src/bootstrap/stem_module.dart
index 4c6748cf..9252b833 100644
--- a/packages/stem/lib/src/bootstrap/stem_module.dart
+++ b/packages/stem/lib/src/bootstrap/stem_module.dart
@@ -1,3 +1,6 @@
+import 'dart:collection';
+import 'dart:convert';
+
 import 'package:stem/src/core/contracts.dart';
 import 'package:stem/src/workflow/core/flow.dart';
 import 'package:stem/src/workflow/core/workflow_definition.dart';
@@ -5,6 +8,21 @@ import 'package:stem/src/workflow/core/workflow_script.dart';
 import 'package:stem/src/workflow/runtime/workflow_manifest.dart';
 import 'package:stem/src/workflow/runtime/workflow_registry.dart';
 
+/// Registers task handlers while tolerating re-registration of identical
+/// handler instances.
+void registerModuleTaskHandlers(
+  TaskRegistry registry,
+  Iterable<TaskHandler> handlers,
+) {
+  for (final handler in handlers) {
+    final existing = registry.resolve(handler.name);
+    if (identical(existing, handler)) {
+      continue;
+    }
+    registry.register(handler);
+  }
+}
+
 /// Generated or hand-authored bundle of tasks and workflow definitions.
 ///
 /// The intended use is to pass one module into bootstrap helpers rather than
@@ -30,6 +48,127 @@ class StemModule {
         ),
       );
 
+  /// Merges [modules] into one bundled module.
+  ///
+  /// Duplicate task or workflow names must resolve to the same underlying
+  /// object instance. Distinct definitions with the same public name fail fast
+  /// so module composition never silently overrides behavior.
+  factory StemModule.merge(Iterable modules) {
+    final mergedWorkflows = [];
+    final mergedFlows = [];
+    final mergedScripts = [];
+    final mergedTasks = <TaskHandler>[];
+    final mergedManifest = [];
+    final workflowDefinitionsByName = {};
+    final taskHandlersByName = <String, TaskHandler>{};
+    final manifestEntriesByName = {};
+
+    void addWorkflowDefinition(
+      WorkflowDefinition definition, {
+      required String source,
+      required void Function() onFirstSeen,
+    }) {
+      final existing = workflowDefinitionsByName[definition.name];
+      if (existing == null) {
+        workflowDefinitionsByName[definition.name] = definition;
+        onFirstSeen();
+        return;
+      }
+      if (!identical(existing, definition)) {
+        throw ArgumentError(
+          'Workflow "${definition.name}" is declared by multiple modules '
+          'with different definitions ($source).',
+        );
+      }
+    }
+
+    void addTaskHandler(TaskHandler handler) {
+      final existing = taskHandlersByName[handler.name];
+      if (existing == null) {
+        taskHandlersByName[handler.name] = handler;
+        mergedTasks.add(handler);
+        return;
+      }
+      if (!identical(existing, handler)) {
+        throw ArgumentError(
+          'Task handler "${handler.name}" is declared by multiple modules '
+          'with different handlers.',
+        );
+      }
+    }
+
+    void addManifestEntry(WorkflowManifestEntry entry) {
+      final existing = manifestEntriesByName[entry.name];
+      if (existing == null) {
+        manifestEntriesByName[entry.name] = entry;
+        mergedManifest.add(entry);
+        return;
+      }
+      if (!_sameManifestEntry(existing, entry)) {
+        throw ArgumentError(
+          'Workflow manifest entry "${entry.name}" conflicts across merged '
+          'modules.',
+        );
+      }
+    }
+
+    for (final module in modules) {
+      for (final workflow in module.workflows) {
+        addWorkflowDefinition(
+          workflow,
+          source: 'workflow definition',
+          onFirstSeen: () => mergedWorkflows.add(workflow),
+        );
+      }
+      for (final flow in module.flows) {
+        addWorkflowDefinition(
+          flow.definition,
+          source: 'flow',
+          onFirstSeen: () => mergedFlows.add(flow),
+        );
+      }
+      for (final script in module.scripts) {
+        addWorkflowDefinition(
+          script.definition,
+          source: 'script',
+          onFirstSeen: () => mergedScripts.add(script),
+        );
+      }
+      module.tasks.forEach(addTaskHandler);
+      module.workflowManifest.forEach(addManifestEntry);
+    }
+
+    return StemModule(
+      workflows: mergedWorkflows,
+      flows: mergedFlows,
+      scripts: mergedScripts,
+      tasks: mergedTasks,
+      workflowManifest: mergedManifest,
+    );
+  }
+
+  /// Combines an optional singular [module] and plural [modules] input.
+  ///
+  /// Returns `null` when no modules are supplied. When exactly one module is
+  /// present it is returned unchanged. Otherwise the modules are merged with
+  /// the same conflict detection as [StemModule.merge].
+  static StemModule? combine({
+    StemModule? module,
+    Iterable modules = const [],
+  }) {
+    final combined = [
+      ?module,
+      ...modules,
+    ];
+    if (combined.isEmpty) {
+      return null;
+    }
+    if (combined.length == 1) {
+      return combined.single;
+    }
+    return StemModule.merge(combined);
+  }
+
   /// Raw workflow definitions that are not represented as [Flow] or
   /// [WorkflowScript] instances.
   final List workflows;
@@ -70,6 +209,161 @@ class StemModule {
     }
   }
 
+  /// Returns the default queues implied by the bundled task handlers.
+  ///
+  /// The [workflowQueue] is always included so workflow orchestration remains
+  /// runnable when the inferred queues are used to bootstrap a worker.
+  List inferredWorkerQueues({
+    String workflowQueue = 'workflow',
+    String? continuationQueue,
+    String? executionQueue,
+    Iterable<TaskHandler> additionalTasks = const [],
+  }) {
+    final queues = SplayTreeSet();
+    final normalizedWorkflowQueue = workflowQueue.trim();
+    if (normalizedWorkflowQueue.isNotEmpty) {
+      queues.add(normalizedWorkflowQueue);
+    }
+    final normalizedContinuationQueue = continuationQueue?.trim();
+    if (normalizedContinuationQueue != null &&
+        normalizedContinuationQueue.isNotEmpty) {
+      queues.add(normalizedContinuationQueue);
+    }
+    final normalizedExecutionQueue = executionQueue?.trim();
+    if (normalizedExecutionQueue != null &&
+        normalizedExecutionQueue.isNotEmpty) {
+      queues.add(normalizedExecutionQueue);
+    }
+
+    void addTaskQueue(TaskHandler handler) {
+      final queue = handler.options.queue.trim();
+      if (queue.isNotEmpty) {
+        queues.add(queue);
+      }
+    }
+
+    tasks.forEach(addTaskQueue);
+    additionalTasks.forEach(addTaskQueue);
+    return queues.toList(growable: false);
+  }
+
+  /// Returns the queues required to run workflow orchestration plus bundled
+  /// tasks.
+  ///
+  /// This is the explicit inspection helper for workflow-capable workers.
+  /// Bootstrap helpers use the same queue set when inferring workflow worker
+  /// subscriptions from a module.
+  List requiredWorkflowQueues({
+    String workflowQueue = 'workflow',
+    String? continuationQueue,
+    String? executionQueue,
+    Iterable<TaskHandler> additionalTasks = const [],
+  }) {
+    return inferredWorkerQueues(
+      workflowQueue: workflowQueue,
+      continuationQueue: continuationQueue,
+      executionQueue: executionQueue,
+      additionalTasks: additionalTasks,
+    );
+  }
+
+  /// Returns the explicit subscription required for workflow-capable workers.
+  RoutingSubscription requiredWorkflowSubscription({
+    String workflowQueue = 'workflow',
+    String? continuationQueue,
+    String? executionQueue,
+    Iterable<TaskHandler> additionalTasks = const [],
+  }) {
+    final queues = requiredWorkflowQueues(
+      workflowQueue: workflowQueue,
+      continuationQueue: continuationQueue,
+      executionQueue: executionQueue,
+      additionalTasks: additionalTasks,
+    );
+    if (queues.length == 1) {
+      return RoutingSubscription.singleQueue(queues.single);
+    }
+    return RoutingSubscription(queues: queues);
+  }
+
+  /// Infers a worker subscription from the bundled task handlers.
+  ///
+  /// Returns `null` when only the [workflowQueue] is needed, allowing the
+  /// worker's default queue configuration to remain unchanged.
+  RoutingSubscription? inferWorkerSubscription({
+    String workflowQueue = 'workflow',
+    String? continuationQueue,
+    String? executionQueue,
+    Iterable<TaskHandler> additionalTasks = const [],
+  }) {
+    final queues = inferredWorkerQueues(
+      workflowQueue: workflowQueue,
+      continuationQueue: continuationQueue,
+      executionQueue: executionQueue,
+      additionalTasks: additionalTasks,
+    );
+    if (queues.length <= 1) {
+      return null;
+    }
+    return RoutingSubscription(queues: queues);
+  }
+
+  /// Returns the default queues implied by the bundled task handlers only.
+  List inferredTaskQueues({
+    Iterable<TaskHandler> additionalTasks = const [],
+  }) {
+    final queues = SplayTreeSet();
+
+    void addTaskQueue(TaskHandler handler) {
+      final queue = handler.options.queue.trim();
+      if (queue.isNotEmpty) {
+        queues.add(queue);
+      }
+    }
+
+    tasks.forEach(addTaskQueue);
+    additionalTasks.forEach(addTaskQueue);
+    return queues.toList(growable: false);
+  }
+
+  /// Returns the queues required by bundled task handlers only.
+  ///
+  /// This is the explicit inspection helper for task-only workers. Bootstrap
+  /// helpers use the same queue set when inferring plain worker subscriptions
+  /// from a module.
+  List requiredTaskQueues({
+    Iterable<TaskHandler> additionalTasks = const [],
+  }) {
+    return inferredTaskQueues(additionalTasks: additionalTasks);
+  }
+
+  /// Returns the explicit subscription required for task-only workers.
+  RoutingSubscription requiredTaskSubscription({
+    Iterable<TaskHandler<dynamic>> additionalTasks = const [],
+  }) {
+    final queues = requiredTaskQueues(additionalTasks: additionalTasks);
+    if (queues.length == 1) {
+      return RoutingSubscription.singleQueue(queues.single);
+    }
+    return RoutingSubscription(queues: queues);
+  }
+
+  /// Infers a worker subscription from bundled task handlers only.
+  ///
+  /// Returns `null` when the bundled tasks only target [defaultQueue], allowing
+  /// the worker's default queue configuration to remain unchanged.
+  RoutingSubscription? inferTaskWorkerSubscription({
+    String defaultQueue = 'default',
+    Iterable<TaskHandler<dynamic>> additionalTasks = const [],
+  }) {
+    final queues = inferredTaskQueues(additionalTasks: additionalTasks);
+    if (queues.isEmpty) return null;
+    if (queues.length == 1 && queues.first == defaultQueue.trim()) {
+      return null;
+    }
+    return RoutingSubscription(queues: queues);
+  }
+
   static Iterable<WorkflowManifestEntry> _defaultManifest({
     required Iterable<WorkflowDefinition> workflows,
     required Iterable<Flow> flows,
@@ -86,3 +380,7 @@ class StemModule {
     }
   }
 }
+
+bool _sameManifestEntry(WorkflowManifestEntry a, WorkflowManifestEntry b) {
+  return jsonEncode(a.toJson()) == jsonEncode(b.toJson());
+}
diff --git a/packages/stem/lib/src/bootstrap/workflow_app.dart b/packages/stem/lib/src/bootstrap/workflow_app.dart
index c5254e23..abbe7344 100644
--- a/packages/stem/lib/src/bootstrap/workflow_app.dart
+++ b/packages/stem/lib/src/bootstrap/workflow_app.dart
@@ -1,3 +1,5 @@
+import 'dart:async';
+
 import 'package:stem/src/bootstrap/factories.dart';
 import 'package:stem/src/bootstrap/stem_app.dart';
 import 'package:stem/src/bootstrap/stem_client.dart';
@@ -5,35 +7,50 @@ import 'package:stem/src/bootstrap/stem_module.dart';
 import 'package:stem/src/bootstrap/stem_stack.dart';
 import 'package:stem/src/control/revoke_store.dart';
 import 'package:stem/src/core/clock.dart';
-import 'package:stem/src/core/contracts.dart' show TaskHandler;
+import 'package:stem/src/core/contracts.dart'
+    show
+        GroupStatus,
+        TaskCall,
+        TaskEnqueueOptions,
+        TaskHandler,
+        TaskOptions,
+        TaskStatus;
 import 'package:stem/src/core/payload_codec.dart';
 import 'package:stem/src/core/task_payload_encoder.dart';
+import 'package:stem/src/core/task_result.dart';
 import 'package:stem/src/core/unique_task_coordinator.dart';
 import 'package:stem/src/workflow/core/event_bus.dart';
 import 'package:stem/src/workflow/core/flow.dart';
 import 'package:stem/src/workflow/core/run_state.dart';
 import 'package:stem/src/workflow/core/workflow_cancellation_policy.dart';
 import 'package:stem/src/workflow/core/workflow_definition.dart';
+import 'package:stem/src/workflow/core/workflow_event_ref.dart';
 import 'package:stem/src/workflow/core/workflow_ref.dart';
 import 'package:stem/src/workflow/core/workflow_result.dart';
 import 'package:stem/src/workflow/core/workflow_script.dart';
 import 'package:stem/src/workflow/core/workflow_status.dart';
 import 'package:stem/src/workflow/core/workflow_store.dart';
+import 'package:stem/src/workflow/core/workflow_watcher.dart';
 import 'package:stem/src/workflow/runtime/workflow_introspection.dart';
+import 'package:stem/src/workflow/runtime/workflow_manifest.dart';
 import 'package:stem/src/workflow/runtime/workflow_registry.dart';
 import 'package:stem/src/workflow/runtime/workflow_runtime.dart';
+import 'package:stem/src/workflow/runtime/workflow_views.dart';

 /// Helper that bootstraps a workflow runtime on top of [StemApp].
 ///
 /// This wrapper wires together broker/backend infrastructure, registers flows,
 /// and exposes convenience helpers for scheduling and observing workflow runs
 /// without having to manage [WorkflowRuntime] directly.
-class StemWorkflowApp {
+class StemWorkflowApp
+    implements WorkflowCaller, WorkflowEventEmitter, StemTaskApp {
   StemWorkflowApp._({
     required this.app,
     required this.runtime,
     required this.store,
     required this.eventBus,
+    required this.allowWorkerAutoStart,
+    required this.ownsStemApp,
     required Future<void> Function() disposeStore,
     required Future<void> Function() disposeBus,
   }) : _disposeStore = disposeStore,
@@ -51,10 +68,33 @@ class StemWorkflowApp {
   /// Event bus used to deliver workflow events.
   final EventBus eventBus;

+  /// Whether shortcut operations may lazily start the managed worker.
+  final bool allowWorkerAutoStart;
+
+  /// Whether this wrapper owns the provided [app] and may shut it down.
+  final bool ownsStemApp;
+
   final Future<void> Function() _disposeStore;
   final Future<void> Function() _disposeBus;

-  bool _started = false;
+  bool _runtimeStarted = false;
+  Future<void>? _runtimeStartFuture;
+
+  /// Whether both the runtime and managed worker have been started.
+  bool get isStarted => isRuntimeStarted && isWorkerStarted;
+
+  /// Whether the workflow runtime has been started.
+  bool get isRuntimeStarted => _runtimeStarted;
+
+  /// Whether the managed worker has been started.
+  bool get isWorkerStarted => app.isStarted;
+
+  Future<void> _ensureReadyForWorkflowStart() async {
+    await startRuntime();
+    if (allowWorkerAutoStart) {
+      await startWorker();
+    }
+  }

   /// Starts the workflow runtime and the underlying Stem worker.
   ///
@@ -67,16 +107,123 @@ class StemWorkflowApp {
   /// await app.start();
   /// ```
   Future<void> start() async {
-    if (_started) return;
-    _started = true;
-    await runtime.start();
-    await app.start();
+    await startRuntime();
+    await startWorker();
+  }
+
+  /// Starts the workflow runtime without starting the managed worker.
+  Future<void> startRuntime() async {
+    if (_runtimeStarted) return;
+    final existing = _runtimeStartFuture;
+    if (existing != null) {
+      await existing;
+      return;
+    }
+
+    final completer = Completer<void>();
+    _runtimeStartFuture = completer.future;
+    try {
+      await runtime.start();
+      _runtimeStarted = true;
+      completer.complete();
+    } catch (error, stackTrace) {
+      _runtimeStartFuture = null;
+      _runtimeStarted = false;
+      completer.completeError(error, stackTrace);
+      rethrow;
+    }
+  }
+
+  /// Starts the managed worker used for workflow execution.
+  Future<void> startWorker() {
+    return app.start();
+  }
+
+  @override
+  Future<String> enqueue(
+    String name, {
+    Map<String, Object?> args = const {},
+    Map<String, Object?> headers = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    Map<String, Object?> meta = const {},
+    TaskEnqueueOptions? enqueueOptions,
+  }) {
+    return app.enqueue(
+      name,
+      args: args,
+      headers: headers,
+      options: options,
+      notBefore: notBefore,
+      meta: meta,
+      enqueueOptions: enqueueOptions,
+    );
+  }
+
+  @override
+  Future<String> enqueueValue<T>(
+    String name,
+    T value, {
+    PayloadCodec<T>? codec,
+    Map<String, Object?> headers = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    Map<String, Object?> meta = const {},
+    TaskEnqueueOptions? enqueueOptions,
+  }) {
+    return app.enqueueValue(
+      name,
+      value,
+      codec: codec,
+      headers: headers,
+      options: options,
+      notBefore: notBefore,
+      meta: meta,
+      enqueueOptions: enqueueOptions,
+    );
+  }
+
+  @override
+  Future<String> enqueueCall(
+    TaskCall call, {
+    TaskEnqueueOptions? enqueueOptions,
+  }) {
+    return app.enqueueCall(call, enqueueOptions: enqueueOptions);
+  }
+
+  @override
+  Future<TaskStatus?> getTaskStatus(String taskId) {
+    return app.getTaskStatus(taskId);
+  }
+
+  @override
+  Future<GroupStatus?> getGroupStatus(String groupId) {
+    return app.getGroupStatus(groupId);
+  }
+
+  @override
+  Future<TaskResult<TResult>?> waitForTask<TResult>(
+    String taskId, {
+    Duration? timeout,
+    TResult Function(Object? payload)? decode,
+    TResult Function(Map<String, Object?> payload)? decodeJson,
+    TResult Function(Map<String, Object?> payload, int version)?
+        decodeVersionedJson,
+  }) {
+    return app.waitForTask(
+      taskId,
+      timeout: timeout,
+      decode: decode,
+      decodeJson: decodeJson,
+      decodeVersionedJson: decodeVersionedJson,
+    );
   }

   /// Schedules a workflow run.
   ///
   /// Lazily starts the runtime on the first invocation so simple examples do
-  /// not need to call [start] manually.
+  /// not need to call [start] manually. The managed worker is only auto-started
+  /// when [allowWorkerAutoStart] is `true`.
   ///
   /// Example:
   /// ```dart
@@ -95,18 +242,8 @@ class StemWorkflowApp {

     /// Optional policy that enforces automatic run cancellation.
     WorkflowCancellationPolicy? cancellationPolicy,
-  }) {
-    if (!_started) {
-      return start().then(
-        (_) => runtime.startWorkflow(
-          name,
-          params: params,
-          parentRunId: parentRunId,
-          ttl: ttl,
-          cancellationPolicy: cancellationPolicy,
-        ),
-      );
-    }
+  }) async {
+    await _ensureReadyForWorkflowStart();
     return runtime.startWorkflow(
       name,
       params: params,
@@ -116,25 +253,82 @@ class StemWorkflowApp {
     );
   }

+  /// Starts a workflow from a DTO that already exposes `toJson()`.
+  Future<String> startWorkflowJson<T>(
+    String name,
+    T paramsJson, {
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+    String? typeName,
+  }) async {
+    await _ensureReadyForWorkflowStart();
+    return runtime.startWorkflowJson(
+      name,
+      paramsJson,
+      parentRunId: parentRunId,
+      ttl: ttl,
+      cancellationPolicy: cancellationPolicy,
+      typeName: typeName,
+    );
+  }
+
+  /// Starts a workflow from a typed value plus optional [codec].
+  ///
+  /// When [codec] is omitted, [value] must already be a string-keyed durable
+  /// map payload.
+  Future<String> startWorkflowValue<T>(
+    String name,
+    T value, {
+    PayloadCodec<T>? codec,
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+  }) async {
+    await _ensureReadyForWorkflowStart();
+    return runtime.startWorkflowValue(
+      name,
+      value,
+      codec: codec,
+      parentRunId: parentRunId,
+      ttl: ttl,
+      cancellationPolicy: cancellationPolicy,
+    );
+  }
+
+  /// Starts a workflow from a DTO and stores a schema [version] beside the
+  /// JSON payload.
+  Future<String> startWorkflowVersionedJson<T>(
+    String name,
+    T paramsJson, {
+    required int version,
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+    String? typeName,
+  }) async {
+    await _ensureReadyForWorkflowStart();
+    return runtime.startWorkflowVersionedJson(
+      name,
+      paramsJson,
+      version: version,
+      parentRunId: parentRunId,
+      ttl: ttl,
+      cancellationPolicy: cancellationPolicy,
+      typeName: typeName,
+    );
+  }
+
   /// Schedules a workflow run from a typed [WorkflowRef].
+  @override
   Future<String> startWorkflowRef<TParams, TResult>(
     WorkflowRef<TParams, TResult> definition,
     TParams params, {
     String? parentRunId,
     Duration? ttl,
     WorkflowCancellationPolicy? cancellationPolicy,
-  }) {
-    if (!_started) {
-      return start().then(
-        (_) => runtime.startWorkflowRef(
-          definition,
-          params,
-          parentRunId: parentRunId,
-          ttl: ttl,
-          cancellationPolicy: cancellationPolicy,
-        ),
-      );
-    }
+  }) async {
+    await _ensureReadyForWorkflowStart();
     return runtime.startWorkflowRef(
       definition,
       params,
@@ -144,7 +338,37 @@ class StemWorkflowApp {
     );
   }

+  /// Emits a DTO-backed external event without requiring a manual payload map.
+  Future<void> emitJson<T>(
+    String topic,
+    T payloadJson, {
+    String? typeName,
+  }) {
+    return runtime.emitJson(
+      topic,
+      payloadJson,
+      typeName: typeName,
+    );
+  }
+
+  /// Emits a DTO-backed external event and stores a schema [version] beside
+  /// the JSON payload.
+  Future<void> emitVersionedJson<T>(
+    String topic,
+    T payloadJson, {
+    required int version,
+    String? typeName,
+  }) {
+    return runtime.emitVersionedJson(
+      topic,
+      payloadJson,
+      version: version,
+      typeName: typeName,
+    );
+  }
+
   /// Schedules a workflow run from a prebuilt [WorkflowStartCall].
+  @override
   Future<String> startWorkflowCall(
     WorkflowStartCall call,
   ) {
@@ -160,6 +384,7 @@ class StemWorkflowApp {
   /// Emits a typed event to resume runs waiting on [topic].
   ///
   /// This is a convenience wrapper over [WorkflowRuntime.emitValue].
+  @override
   Future<void> emitValue<T>(
     String topic,
     T value, {
@@ -168,6 +393,12 @@ class StemWorkflowApp {
     return runtime.emitValue(topic, value, codec: codec);
   }

+  /// Emits a typed event through a [WorkflowEventRef].
+  @override
+  Future<void> emitEvent<T>(WorkflowEventRef<T> event, T value) {
+    return runtime.emitEvent(event, value);
+  }
+
   /// Returns the current [RunState] of a workflow run, or `null` if not found.
   ///
   /// Example:
@@ -181,6 +412,127 @@ class StemWorkflowApp {
   /// ```
   Future<RunState?> getRun(String runId) => store.get(runId);

+  /// Registers all tasks and workflows from [module] into this app.
+  ///
+  /// This is a convenience helper for manual registration flows that need to
+  /// attach generated definitions after bootstrap.
+  void registerModule(StemModule module) {
+    registerModuleTaskHandlers(app.registry, module.tasks);
+    module.registerInto(workflows: runtime.registry);
+  }
+
+  /// Registers all tasks and workflows from [modules] into this app.
+  void registerModules(Iterable<StemModule> modules) {
+    final merged = StemModule.combine(modules: modules);
+    if (merged == null) {
+      return;
+    }
+    registerModule(merged);
+  }
+
+  /// Registers [definition] into this app's workflow registry.
+  void registerWorkflow(WorkflowDefinition definition) {
+    runtime.registerWorkflow(definition);
+  }
+
+  /// Registers [definitions] into this app's workflow registry.
+  void registerWorkflows(Iterable<WorkflowDefinition> definitions) {
+    definitions.forEach(registerWorkflow);
+  }
+
+  /// Registers [flow] into this app's workflow registry.
+  void registerFlow(Flow flow) {
+    registerWorkflow(flow.definition);
+  }
+
+  /// Registers [flows] into this app's workflow registry.
+  void registerFlows(Iterable<Flow> flows) {
+    flows.forEach(registerFlow);
+  }
+
+  /// Registers [script] into this app's workflow registry.
+  void registerScript(WorkflowScript script) {
+    registerWorkflow(script.definition);
+  }
+
+  /// Registers [scripts] into this app's workflow registry.
+  void registerScripts(Iterable<WorkflowScript> scripts) {
+    scripts.forEach(registerScript);
+  }
+
+  /// Returns the normalized run view for [runId], or `null` if not found.
+  Future<WorkflowRunView?> viewRun(String runId) {
+    return runtime.viewRun(runId);
+  }
+
+  /// Returns persisted checkpoint views for [runId].
+  Future<List<WorkflowCheckpointView>> viewCheckpoints(String runId) {
+    return runtime.viewCheckpoints(runId);
+  }
+
+  /// Returns the combined run + checkpoint detail view for [runId].
+  ///
+  /// This is a convenience wrapper over [WorkflowRuntime.viewRunDetail] so
+  /// callers do not need to reach through [runtime] for common inspection.
+  Future<WorkflowRunDetailView?> viewRunDetail(String runId) {
+    return runtime.viewRunDetail(runId);
+  }
+
+  /// Returns normalized workflow run views filtered by workflow/status.
+  Future<List<WorkflowRunView>> listRunViews({
+    String? workflow,
+    WorkflowStatus? status,
+    int limit = 50,
+    int offset = 0,
+  }) {
+    return runtime.listRunViews(
+      workflow: workflow,
+      status: status,
+      limit: limit,
+      offset: offset,
+    );
+  }
+
+  /// Returns the manifest entries for workflows registered with this app.
+  List<WorkflowManifestEntry> workflowManifest() {
+    return runtime.workflowManifest();
+  }
+
+  /// Executes the workflow run identified by [runId].
+  ///
+  /// This is a convenience wrapper over [WorkflowRuntime.executeRun] for
+  /// examples and application code that need direct run driving without
+  /// reaching through [runtime].
+  Future<void> executeRun(String runId) {
+    return runtime.executeRun(runId);
+  }
+
+  /// Rewinds [runId] to [checkpointName] and marks it runnable again.
+  ///
+  /// This is a convenience wrapper for replay-oriented workflows that need to
+  /// resume execution from an earlier persisted checkpoint.
+  Future<void> rewindToCheckpoint(String runId, String checkpointName) async {
+    await store.rewindToStep(runId, checkpointName);
+    await store.markRunning(runId);
+  }
+
+  /// Lists event watchers registered for [topic].
+  Future<List<WorkflowWatcher>> listWatchers(String topic) {
+    return store.listWatchers(topic);
+  }
+
+  /// Marks all runs due at [now] as resumed and returns their ids.
+  Future<List<String>> resumeDueRuns([DateTime? now]) async {
+    final due = await store.dueRuns(now ?? DateTime.now());
+    final resumed = <String>[];
+    for (final runId in due) {
+      final state = await store.get(runId);
+      await store.markResumed(runId, data: state?.suspensionData);
+      resumed.add(runId);
+    }
+    return resumed;
+  }
+
   /// Polls the workflow store until the run reaches a terminal state.
   ///
   /// When the workflow completes successfully the persisted result is surfaced
@@ -202,7 +554,16 @@ class StemWorkflowApp {
     Duration pollInterval = const Duration(milliseconds: 100),
     Duration? timeout,
     T Function(Object? payload)? decode,
+    T Function(Map<String, Object?> payload)? decodeJson,
+    T Function(Map<String, Object?> payload, int version)?
+        decodeVersionedJson,
   }) async {
+    assert(
+      [decode, decodeJson, decodeVersionedJson].whereType<Function>().length <=
+          1,
+      'Specify at most one of decode, decodeJson, or decodeVersionedJson.',
+    );
     final startedAt = stemNow();
     while (true) {
       final state = await store.get(runId);
@@ -210,20 +571,31 @@ class StemWorkflowApp {
         return null;
       }
       if (state.isTerminal) {
-        return _buildResult(state, decode, timedOut: false);
+        return _buildResult(
+          state,
+          decode,
+          decodeJson: decodeJson,
+          decodeVersionedJson: decodeVersionedJson,
+          timedOut: false,
+        );
       }
       if (timeout != null && stemNow().difference(startedAt) >= timeout) {
-        return _buildResult(state, decode, timedOut: true);
+        return _buildResult(
+          state,
+          decode,
+          decodeJson: decodeJson,
+          decodeVersionedJson: decodeVersionedJson,
+          timedOut: true,
+        );
       }
       await Future.delayed(pollInterval);
     }
   }

   /// Waits for [runId] using the decoding rules from a [WorkflowRef].
-  Future<WorkflowResult<TResult>?> waitForWorkflowRef<
-    TParams,
-    TResult extends Object?
-  >(
+  @override
+  Future<WorkflowResult<TResult>?>
+  waitForWorkflowRef<TParams, TResult extends Object?>(
     String runId,
     WorkflowRef<TParams, TResult> definition, {
     Duration pollInterval = const Duration(milliseconds: 100),
@@ -241,9 +613,16 @@ class StemWorkflowApp {
     RunState state,
     T Function(Object? payload)? decode, {
     required bool timedOut,
+    T Function(Map<String, Object?> payload)? decodeJson,
+    T Function(Map<String, Object?> payload, int version)? decodeVersionedJson,
   }) {
     final value = state.status == WorkflowStatus.completed
-        ? _decodeResult(state.result, decode)
+        ? _decodeResult(
+            state.result,
+            decode,
+            decodeJson,
+            decodeVersionedJson,
+          )
         : null;
     return WorkflowResult(
       runId: state.id,
@@ -258,10 +637,23 @@ class StemWorkflowApp {
   T? _decodeResult<T>(
     Object? payload,
     T Function(Object? payload)? decode,
+    T Function(Map<String, Object?> payload)? decodeJson,
+    T Function(Map<String, Object?> payload, int version)? decodeVersionedJson,
   ) {
     if (decode != null) {
       return decode(payload);
     }
+    if (decodeVersionedJson != null) {
+      return decodeVersionedJson(
+        PayloadCodec.decodeJsonMap(payload, typeName: 'workflow result'),
+        PayloadCodec.readPayloadVersion(payload),
+      );
+    }
+    if (decodeJson != null) {
+      return decodeJson(
+        PayloadCodec.decodeJsonMap(payload, typeName: 'workflow result'),
+      );
+    }
     return payload as T?;
   }

@@ -276,10 +668,13 @@ class StemWorkflowApp {
   /// ```
   Future<void> shutdown() async {
     await runtime.dispose();
-    await app.shutdown();
+    if (ownsStemApp) {
+      await app.shutdown();
+    }
     await _disposeBus();
     await _disposeStore();
-    _started = false;
+    _runtimeStarted = false;
+    _runtimeStartFuture = null;
   }

   /// Alias for [shutdown].
@@ -288,7 +683,10 @@ class StemWorkflowApp {
   /// Creates a workflow app with custom backends and factories.
   ///
   /// Useful for wiring Redis/Postgres adapters or sharing an existing
-  /// [StemApp] instance with job processors.
+  /// [StemApp] instance with job processors. When [module] or [tasks] are
+  /// provided and [StemWorkerConfig.subscription] is omitted, the helper
+  /// infers a worker subscription that includes the workflow queue plus the
+  /// default queues declared on those task handlers.
   ///
   /// Example:
   /// ```dart
@@ -300,6 +698,7 @@ class StemWorkflowApp {
   /// ```
   static Future<StemWorkflowApp> create({
     StemModule? module,
+    Iterable<StemModule> modules = const [],
     Iterable<WorkflowDefinition> workflows = const [],
     Iterable<Flow> flows = const [],
     Iterable<WorkflowScript> scripts = const [],
@@ -310,6 +709,8 @@ class StemWorkflowApp {
     WorkflowStoreFactory? storeFactory,
     WorkflowEventBusFactory? eventBusFactory,
     StemWorkerConfig workerConfig = const StemWorkerConfig(queue: 'workflow'),
+    String? continuationQueue,
+    String? executionQueue,
     Duration pollInterval = const Duration(milliseconds: 500),
     Duration leaseExtension = const Duration(seconds: 30),
     WorkflowRegistry? workflowRegistry,
@@ -318,21 +719,41 @@ class StemWorkflowApp {
     TaskPayloadEncoder resultEncoder = const JsonTaskPayloadEncoder(),
     TaskPayloadEncoder argsEncoder = const JsonTaskPayloadEncoder(),
     Iterable<TaskPayloadEncoder> additionalEncoders = const [],
+    bool allowWorkerAutoStart = true,
+    bool ownsStemApp = false,
   }) async {
-    final moduleTasks = module?.tasks ?? const <TaskHandler<dynamic>>[];
+    final effectiveModule =
+        StemModule.combine(module: module, modules: modules) ?? stemApp?.module;
+    final moduleTasks =
+        effectiveModule?.tasks ?? const <TaskHandler<dynamic>>[];
     final moduleWorkflowDefinitions =
-        module?.workflowDefinitions ?? const <WorkflowDefinition>[];
+        effectiveModule?.workflowDefinitions ?? const <WorkflowDefinition>[];
+    final resolvedWorkerConfig = _resolveWorkflowWorkerConfig(
+      workerConfig,
+      module: effectiveModule,
+      tasks: tasks,
+      continuationQueue: continuationQueue,
+      executionQueue: executionQueue,
+    );
     final appInstance =
         stemApp ??
         await StemApp.create(
           broker: broker ?? StemBrokerFactory.inMemory(),
           backend: backend ?? StemBackendFactory.inMemory(),
-          workerConfig: workerConfig,
+          workerConfig: resolvedWorkerConfig,
          encoderRegistry: encoderRegistry,
           resultEncoder: resultEncoder,
           argsEncoder: argsEncoder,
           additionalEncoders: additionalEncoders,
+          allowWorkerAutoStart: allowWorkerAutoStart,
         );
+    if (stemApp != null) {
+      _validateReusableStemApp(
+        appInstance,
+        resolvedWorkerConfig,
+        allowWorkerAutoStart: allowWorkerAutoStart,
+      );
+    }
     final storeFactoryInstance = storeFactory ?? WorkflowStoreFactory.inMemory();
@@ -346,12 +767,19 @@ class StemWorkflowApp {
       eventBus: eventBus,
       pollInterval: pollInterval,
       leaseExtension: leaseExtension,
-      queue: workerConfig.queue,
+      queue: resolvedWorkerConfig.queue,
+      continuationQueue: continuationQueue,
+      executionQueue: executionQueue,
       registry: workflowRegistry,
       introspectionSink: introspectionSink,
     );

-    [...moduleTasks, ...tasks].forEach(appInstance.register);
+    registerModuleTaskHandlers(
+      appInstance.registry,
+      [...moduleTasks, ...tasks],
+    );
+
+    appInstance.worker.workflows = runtime;
+    appInstance.worker.workflowEvents = runtime;
     appInstance.register(runtime.workflowRunnerHandler());

     [
@@ -366,6 +794,8 @@ class StemWorkflowApp {
       runtime: runtime,
       store: store,
       eventBus: eventBus,
+      allowWorkerAutoStart: allowWorkerAutoStart,
+      ownsStemApp: stemApp == null || ownsStemApp,
       disposeStore: () async => storeFactoryInstance.dispose(store),
       disposeBus: () async => busFactory.dispose(eventBus),
     );
@@ -374,6 +804,10 @@ class StemWorkflowApp {
   /// Creates an in-memory workflow app (in-memory broker, backend, and store).
   ///
   /// Ideal for unit tests and examples since it requires no external services.
+  /// When [module] or [tasks] are provided and
+  /// [StemWorkerConfig.subscription] is omitted, the helper infers a worker
+  /// subscription that includes the workflow queue plus the default queues
+  /// declared on those task handlers.
   ///
   /// Example:
   /// ```dart
@@ -383,11 +817,14 @@ class StemWorkflowApp {
   /// ```
   static Future<StemWorkflowApp> inMemory({
     StemModule? module,
+    Iterable<StemModule> modules = const [],
     Iterable<WorkflowDefinition> workflows = const [],
     Iterable<Flow> flows = const [],
     Iterable<WorkflowScript> scripts = const [],
     Iterable<TaskHandler<dynamic>> tasks = const [],
     StemWorkerConfig workerConfig = const StemWorkerConfig(queue: 'workflow'),
+    String? continuationQueue,
+    String? executionQueue,
     Duration pollInterval = const Duration(milliseconds: 500),
     Duration leaseExtension = const Duration(seconds: 30),
     WorkflowRegistry? workflowRegistry,
@@ -396,9 +833,11 @@ class StemWorkflowApp {
     TaskPayloadEncoder resultEncoder = const JsonTaskPayloadEncoder(),
     TaskPayloadEncoder argsEncoder = const JsonTaskPayloadEncoder(),
     Iterable<TaskPayloadEncoder> additionalEncoders = const [],
+    bool allowWorkerAutoStart = true,
   }) {
     return StemWorkflowApp.create(
       module: module,
+      modules: modules,
       workflows: workflows,
       flows: flows,
       scripts: scripts,
@@ -408,6 +847,8 @@ class StemWorkflowApp {
       storeFactory: WorkflowStoreFactory.inMemory(),
       eventBusFactory: WorkflowEventBusFactory.inMemory(),
       workerConfig: workerConfig,
+      continuationQueue: continuationQueue,
+      executionQueue: executionQueue,
       pollInterval: pollInterval,
       leaseExtension: leaseExtension,
       workflowRegistry: workflowRegistry,
@@ -416,16 +857,21 @@ class StemWorkflowApp {
       resultEncoder: resultEncoder,
       argsEncoder: argsEncoder,
       additionalEncoders: additionalEncoders,
+      allowWorkerAutoStart: allowWorkerAutoStart,
     );
   }

   /// Creates a workflow app from a single backend URL plus adapter wiring.
   ///
   /// This wires broker/backend and workflow-store factories from one URL and
-  /// optional per-store overrides via [StemStack.fromUrl].
+  /// optional per-store overrides via [StemStack.fromUrl]. When [module] or
+  /// [tasks] are provided and [StemWorkerConfig.subscription] is omitted, the
+  /// helper infers a worker subscription that includes the workflow queue plus
+  /// the default queues declared on those task handlers.
   static Future<StemWorkflowApp> fromUrl(
     String url, {
     StemModule? module,
+    Iterable<StemModule> modules = const [],
     Iterable<WorkflowDefinition> workflows = const [],
     Iterable<Flow> flows = const [],
     Iterable<WorkflowScript> scripts = const [],
@@ -433,6 +879,8 @@ class StemWorkflowApp {
     Iterable adapters = const [],
     StemStoreOverrides overrides = const StemStoreOverrides(),
     StemWorkerConfig workerConfig = const StemWorkerConfig(queue: 'workflow'),
+    String? continuationQueue,
+    String? executionQueue,
     bool uniqueTasks = false,
     Duration uniqueTaskDefaultTtl = const Duration(minutes: 5),
     String uniqueTaskNamespace = 'stem:unique',
@@ -448,7 +896,15 @@ class StemWorkflowApp {
     TaskPayloadEncoder resultEncoder = const JsonTaskPayloadEncoder(),
     TaskPayloadEncoder argsEncoder = const JsonTaskPayloadEncoder(),
     Iterable<TaskPayloadEncoder> additionalEncoders = const [],
+    bool allowWorkerAutoStart = true,
   }) async {
+    final resolvedWorkerConfig = _resolveWorkflowWorkerConfig(
+      workerConfig,
+      module: StemModule.combine(module: module, modules: modules),
+      tasks: tasks,
+      continuationQueue: continuationQueue,
+      executionQueue: executionQueue,
+    );
     final stack = StemStack.fromUrl(
       url,
       adapters: adapters,
@@ -461,7 +917,7 @@ class StemWorkflowApp {
       adapters: adapters,
       overrides: overrides,
       stack: stack,
-      workerConfig: workerConfig,
+      workerConfig: resolvedWorkerConfig,
       uniqueTasks: uniqueTasks,
       uniqueTaskDefaultTtl: uniqueTaskDefaultTtl,
       uniqueTaskNamespace: uniqueTaskNamespace,
@@ -472,11 +928,13 @@ class StemWorkflowApp {
       resultEncoder: resultEncoder,
       argsEncoder: argsEncoder,
       additionalEncoders: additionalEncoders,
+      allowWorkerAutoStart: allowWorkerAutoStart,
     );

     try {
       return await create(
         module: module,
+        modules: modules,
         workflows: workflows,
         flows: flows,
         scripts: scripts,
@@ -484,11 +942,15 @@ class StemWorkflowApp {
         stemApp: app,
         storeFactory: stack.workflowStore,
         eventBusFactory: eventBusFactory,
-        workerConfig: workerConfig,
+        workerConfig: resolvedWorkerConfig,
+        continuationQueue: continuationQueue,
+        executionQueue: executionQueue,
         pollInterval: pollInterval,
         leaseExtension: leaseExtension,
         workflowRegistry: workflowRegistry,
         introspectionSink: introspectionSink,
+        allowWorkerAutoStart: allowWorkerAutoStart,
+        ownsStemApp: true,
       );
     } on Object catch (error, stackTrace) {
       // fromUrl owns the app instance; clean it up when workflow bootstrap
@@ -503,9 +965,15 @@ class StemWorkflowApp {
   }

   /// Creates a workflow app backed by a shared [StemClient].
+ /// + /// When [module] or [tasks] are provided and + /// [StemWorkerConfig.subscription] is omitted, the helper infers a worker + /// subscription that includes the workflow queue plus the default queues + /// declared on those task handlers. static Future fromClient({ required StemClient client, StemModule? module, + Iterable modules = const [], Iterable workflows = const [], Iterable flows = const [], Iterable scripts = const [], @@ -513,75 +981,180 @@ class StemWorkflowApp { WorkflowStoreFactory? storeFactory, WorkflowEventBusFactory? eventBusFactory, StemWorkerConfig workerConfig = const StemWorkerConfig(queue: 'workflow'), + String? continuationQueue, + String? executionQueue, Duration pollInterval = const Duration(milliseconds: 500), Duration leaseExtension = const Duration(seconds: 30), WorkflowIntrospectionSink? introspectionSink, + bool allowWorkerAutoStart = true, }) async { + final effectiveModule = + StemModule.combine(module: module, modules: modules) ?? client.module; + final resolvedWorkerConfig = _resolveWorkflowWorkerConfig( + workerConfig, + module: effectiveModule, + tasks: tasks, + continuationQueue: continuationQueue, + executionQueue: executionQueue, + ); final appInstance = await StemApp.fromClient( client, - workerConfig: workerConfig, + workerConfig: resolvedWorkerConfig, + allowWorkerAutoStart: allowWorkerAutoStart, ); return StemWorkflowApp.create( - module: module, + module: effectiveModule, workflows: workflows, flows: flows, scripts: scripts, stemApp: appInstance, storeFactory: storeFactory, eventBusFactory: eventBusFactory, - workerConfig: workerConfig, + workerConfig: resolvedWorkerConfig, + continuationQueue: continuationQueue, + executionQueue: executionQueue, pollInterval: pollInterval, leaseExtension: leaseExtension, workflowRegistry: client.workflowRegistry, introspectionSink: introspectionSink, + allowWorkerAutoStart: allowWorkerAutoStart, + ownsStemApp: true, ); } } -/// Convenience helpers for typed workflow start calls. 
-extension WorkflowStartCallAppExtension - on WorkflowStartCall { - /// Starts this workflow call with [app]. - Future startWithApp(StemWorkflowApp app) { - return app.startWorkflowCall(this); +/// Convenience helpers for layering workflows onto an existing [StemApp]. +extension StemAppWorkflowExtension on StemApp { + /// Creates a workflow app on top of this shared task app. + /// + /// This reuses the existing broker/backend/worker wiring, so the current + /// worker must already subscribe to the workflow queue and any task queues + /// required by the supplied module or tasks. + Future createWorkflowApp({ + StemModule? module, + Iterable modules = const [], + Iterable workflows = const [], + Iterable flows = const [], + Iterable scripts = const [], + Iterable> tasks = const [], + WorkflowStoreFactory? storeFactory, + WorkflowEventBusFactory? eventBusFactory, + StemWorkerConfig workerConfig = const StemWorkerConfig(queue: 'workflow'), + String? continuationQueue, + String? executionQueue, + Duration pollInterval = const Duration(milliseconds: 500), + Duration leaseExtension = const Duration(seconds: 30), + WorkflowRegistry? workflowRegistry, + WorkflowIntrospectionSink? introspectionSink, + bool allowWorkerAutoStart = true, + }) { + return StemWorkflowApp.create( + module: + StemModule.combine(module: module, modules: modules) ?? this.module, + workflows: workflows, + flows: flows, + scripts: scripts, + tasks: tasks, + stemApp: this, + storeFactory: storeFactory, + eventBusFactory: eventBusFactory, + workerConfig: workerConfig, + continuationQueue: continuationQueue, + executionQueue: executionQueue, + pollInterval: pollInterval, + leaseExtension: leaseExtension, + workflowRegistry: workflowRegistry, + introspectionSink: introspectionSink, + allowWorkerAutoStart: allowWorkerAutoStart, + ); } +} - /// Starts this workflow call with [app] and waits for the typed result. 
-  Future?> startAndWaitWithApp(
-    StemWorkflowApp app, {
-    Duration pollInterval = const Duration(milliseconds: 100),
-    Duration? timeout,
-  }) async {
-    final runId = await app.startWorkflowCall(this);
-    return definition.waitFor(
-      app,
-      runId,
-      pollInterval: pollInterval,
-      timeout: timeout,
+void _validateReusableStemApp(
+  StemApp app,
+  StemWorkerConfig workerConfig, {
+  required bool allowWorkerAutoStart,
+}) {
+  if (app.allowWorkerAutoStart != allowWorkerAutoStart) {
+    throw StateError(
+      'StemWorkflowApp.create(stemApp: ...) requires the reused StemApp '
+      'to use the same allowWorkerAutoStart setting. Create the StemApp with '
+      'allowWorkerAutoStart: $allowWorkerAutoStart or omit stemApp so the '
+      'workflow app can create a matching shortcut wrapper.',
     );
   }

-  /// Starts this workflow call with [runtime].
-  Future startWithRuntime(WorkflowRuntime runtime) {
-    return runtime.startWorkflowCall(this);
+  final requiredQueues =
+      workerConfig.subscription?.resolveQueues(
+        workerConfig.queue,
+      ) ??
+      [workerConfig.queue];
+  final workerQueues = app.worker.subscriptionQueues.toSet();
+  final missingQueues = requiredQueues
+      .map((queue) => queue.trim())
+      .where((queue) => queue.isNotEmpty)
+      .where((queue) => !workerQueues.contains(queue))
+      .toList(growable: false);
+
+  final requiredBroadcasts =
+      workerConfig.subscription?.broadcastChannels ?? const [];
+  final workerBroadcasts = app.worker.subscriptionBroadcasts.toSet();
+  final missingBroadcasts = requiredBroadcasts
+      .map((channel) => channel.trim())
+      .where((channel) => channel.isNotEmpty)
+      .where((channel) => !workerBroadcasts.contains(channel))
+      .toList(growable: false);
+
+  if (missingQueues.isEmpty && missingBroadcasts.isEmpty) {
+    return;
   }
+
+  final details = [
+    if (missingQueues.isNotEmpty) 'queues=${missingQueues.join(",")}',
+    if (missingBroadcasts.isNotEmpty)
+      'broadcasts=${missingBroadcasts.join(",")}',
+  ].join(' ');
+
+  throw StateError(
+    'StemWorkflowApp.create(stemApp: ...) requires the reused StemApp worker '
+    'to already subscribe to the workflow/runtime queues it needs ($details). '
+    'Create the StemApp with a matching workerConfig.subscription, or use '
+    'StemClient.createWorkflowApp(...) / StemWorkflowApp.inMemory(...) so '
+    'subscriptions can be inferred automatically.',
+  );
 }

-/// Convenience helpers for waiting on workflow results using a typed reference.
-extension WorkflowRefAppExtension
-    on WorkflowRef {
-  /// Waits for [runId] using this workflow reference's decode rules.
-  Future?> waitFor(
-    StemWorkflowApp app,
-    String runId, {
-    Duration pollInterval = const Duration(milliseconds: 100),
-    Duration? timeout,
-  }) {
-    return app.waitForWorkflowRef(
-      runId,
-      this,
-      pollInterval: pollInterval,
-      timeout: timeout,
-    );
+StemWorkerConfig _resolveWorkflowWorkerConfig(
+  StemWorkerConfig workerConfig, {
+  StemModule? module,
+  Iterable> tasks = const [],
+  String? continuationQueue,
+  String? executionQueue,
+}) {
+  if (workerConfig.subscription != null) {
+    return workerConfig;
+  }
+
+  final inferredSubscription =
+      module?.inferWorkerSubscription(
+        workflowQueue: workerConfig.queue,
+        continuationQueue: continuationQueue,
+        executionQueue: executionQueue,
+        additionalTasks: tasks,
+      ) ??
+      (() {
+        final tempModule = StemModule(tasks: tasks);
+        return tempModule.inferWorkerSubscription(
+          workflowQueue: workerConfig.queue,
+          continuationQueue: continuationQueue,
+          executionQueue: executionQueue,
+        );
+      })();
+
+  if (inferredSubscription == null) {
+    return workerConfig;
   }
+  return workerConfig.copyWith(subscription: inferredSubscription);
 }
diff --git a/packages/stem/lib/src/canvas/canvas.dart b/packages/stem/lib/src/canvas/canvas.dart
index c9dac1ce..6ee5c1e4 100644
--- a/packages/stem/lib/src/canvas/canvas.dart
+++ b/packages/stem/lib/src/canvas/canvas.dart
@@ -331,7 +331,7 @@ class Canvas {
     String? groupId,
   }) async {
     final id = groupId ?? 
_generateId('grp'); - if (groupId == null) { + if (groupId == null || await backend.getGroup(id) == null) { await backend.initGroup( GroupDescriptor(id: id, expected: signatures.length), ); @@ -934,7 +934,7 @@ extension TaskDefinitionCanvasX Map? meta, TResult Function(Object? payload)? decode, }) { - final call = this.call( + final call = buildCall( args, headers: headers, options: options, diff --git a/packages/stem/lib/src/control/control_messages.dart b/packages/stem/lib/src/control/control_messages.dart index cc01f3b2..80b33eef 100644 --- a/packages/stem/lib/src/control/control_messages.dart +++ b/packages/stem/lib/src/control/control_messages.dart @@ -1,4 +1,6 @@ import 'package:stem/src/core/envelope.dart'; +import 'package:stem/src/core/payload_codec.dart'; +import 'package:stem/src/core/payload_map.dart'; /// Control-plane command dispatched to worker control queues. class ControlCommandMessage { @@ -34,6 +36,53 @@ class ControlCommandMessage { /// Arbitrary command payload. final Map payload; + /// Returns the decoded payload value for [key], or `null` when absent. + T? payloadValue(String key, {PayloadCodec? codec}) { + return payload.value(key, codec: codec); + } + + /// Returns the decoded payload value for [key], or [fallback] when absent. + T payloadValueOr(String key, T fallback, {PayloadCodec? codec}) { + return payload.valueOr(key, fallback, codec: codec); + } + + /// Returns the decoded payload value for [key], throwing when absent. + T requiredPayloadValue(String key, {PayloadCodec? codec}) { + return payload.requiredValue(key, codec: codec); + } + + /// Decodes the full payload as a typed DTO with [codec]. + T payloadAs({required PayloadCodec codec}) { + return codec.decode(payload); + } + + /// Decodes the full payload as a typed DTO with a JSON decoder. + T payloadJson({ + required T Function(Map payload) decode, + String? 
typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(payload); + } + + /// Decodes the full payload as a typed DTO with a version-aware JSON + /// decoder. + T payloadVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(payload); + } + /// Optional timeout for the command, in milliseconds. final int? timeoutMs; @@ -81,9 +130,117 @@ class ControlReplyMessage { /// Arbitrary reply payload. final Map payload; + /// Returns the decoded payload value for [key], or `null` when absent. + T? payloadValue(String key, {PayloadCodec? codec}) { + return payload.value(key, codec: codec); + } + + /// Returns the decoded payload value for [key], or [fallback] when absent. + T payloadValueOr(String key, T fallback, {PayloadCodec? codec}) { + return payload.valueOr(key, fallback, codec: codec); + } + + /// Returns the decoded payload value for [key], throwing when absent. + T requiredPayloadValue(String key, {PayloadCodec? codec}) { + return payload.requiredValue(key, codec: codec); + } + + /// Decodes the full payload as a typed DTO with [codec]. + T payloadAs({required PayloadCodec codec}) { + return codec.decode(payload); + } + + /// Decodes the full payload as a typed DTO with a JSON decoder. + T payloadJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(payload); + } + + /// Decodes the full payload as a typed DTO with a version-aware JSON + /// decoder. + T payloadVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? 
typeName, + }) { + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(payload); + } + /// Optional error payload. final Map? error; + /// Returns the decoded error value for [key], or `null` when absent. + T? errorValue(String key, {PayloadCodec? codec}) { + final payload = error; + if (payload == null) return null; + return payload.value(key, codec: codec); + } + + /// Returns the decoded error value for [key], or [fallback] when absent. + T errorValueOr(String key, T fallback, {PayloadCodec? codec}) { + final payload = error; + if (payload == null) return fallback; + return payload.valueOr(key, fallback, codec: codec); + } + + /// Returns the decoded error value for [key], throwing when absent. + T requiredErrorValue(String key, {PayloadCodec? codec}) { + final payload = error; + if (payload == null) { + throw StateError('ControlReplyMessage.error does not contain "$key".'); + } + return payload.requiredValue(key, codec: codec); + } + + /// Decodes the full error payload as a typed DTO with [codec]. + T? errorAs({required PayloadCodec codec}) { + final payload = error; + if (payload == null) return null; + return codec.decode(payload); + } + + /// Decodes the full error payload as a typed DTO with a JSON decoder. + T? errorJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + final payload = error; + if (payload == null) return null; + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(payload); + } + + /// Decodes the full error payload as a typed DTO with a version-aware JSON + /// decoder. + T? errorVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? 
typeName, + }) { + final payload = error; + if (payload == null) return null; + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(payload); + } + /// Serializes the reply into a map payload. Map toMap() => { 'requestId': requestId, diff --git a/packages/stem/lib/src/core/clock.dart b/packages/stem/lib/src/core/clock.dart index c8e90e92..30e681ea 100644 --- a/packages/stem/lib/src/core/clock.dart +++ b/packages/stem/lib/src/core/clock.dart @@ -1,6 +1,7 @@ import 'dart:async'; /// Shared clock abstraction used across the Stem ecosystem. +// ignore: one_member_abstracts abstract class StemClock { /// Creates a clock implementation. const StemClock(); diff --git a/packages/stem/lib/src/core/contracts.dart b/packages/stem/lib/src/core/contracts.dart index 4975f2a4..0ea0ef94 100644 --- a/packages/stem/lib/src/core/contracts.dart +++ b/packages/stem/lib/src/core/contracts.dart @@ -33,12 +33,17 @@ library; import 'dart:async'; import 'dart:collection'; -import 'package:stem/src/core/clock.dart'; import 'package:stem/src/core/envelope.dart'; +import 'package:stem/src/core/payload_codec.dart'; +import 'package:stem/src/core/payload_map.dart'; import 'package:stem/src/core/task_invocation.dart'; import 'package:stem/src/core/task_payload_encoder.dart'; import 'package:stem/src/observability/heartbeat.dart'; import 'package:stem/src/scheduler/schedule_spec.dart'; +import 'package:stem/src/workflow/core/workflow_cancellation_policy.dart'; +import 'package:stem/src/workflow/core/workflow_event_ref.dart'; +import 'package:stem/src/workflow/core/workflow_ref.dart'; +import 'package:stem/src/workflow/core/workflow_result.dart'; /// Subscription describing the queues and broadcast channels a worker should /// consume from. @@ -251,12 +256,107 @@ class TaskStatus { /// The payload associated with this task, if any. final Object? 
payload; + /// Returns the decoded payload value, or `null` when no payload is present. + /// + /// When [codec] is supplied, the stored durable payload is decoded through + /// that codec before being returned. + T? payloadValue({PayloadCodec? codec}) { + final stored = payload; + if (stored == null) return null; + if (codec != null) { + return codec.decode(stored); + } + return stored as T; + } + + /// Decodes the entire payload as a typed DTO with [codec]. + T? payloadAs({required PayloadCodec codec}) { + final stored = payload; + if (stored == null) return null; + return codec.decode(stored); + } + + /// Decodes the entire payload as a typed DTO with a JSON decoder. + T? payloadJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + final stored = payload; + if (stored == null) return null; + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(stored); + } + + /// Decodes the entire payload as a typed DTO with a version-aware JSON + /// decoder. + T? payloadVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + final stored = payload; + if (stored == null) return null; + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(stored); + } + + /// Returns the decoded payload value, or [fallback] when it is absent. + T payloadValueOr(T fallback, {PayloadCodec? codec}) { + return payloadValue(codec: codec) ?? fallback; + } + + /// Returns the decoded payload value, throwing when it is absent. + T requiredPayloadValue({PayloadCodec? codec}) { + if (payload == null) { + throw StateError("Task '$id' does not have a payload."); + } + return payloadValue(codec: codec) as T; + } + /// The error that occurred during task execution, if any. final TaskError? error; /// Additional metadata for this task status. 
final Map meta; + /// Decodes the full task metadata payload with [codec]. + T metaAs({required PayloadCodec codec}) { + return codec.decode(meta); + } + + /// Decodes the full task metadata payload with a JSON decoder. + T metaJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(meta); + } + + /// Decodes the full task metadata payload with a version-aware JSON decoder. + T metaVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(meta); + } + /// The attempt number for this task execution. final int attempt; @@ -554,6 +654,39 @@ class TaskError { /// Additional metadata for this error. final Map meta; + /// Decodes the full error metadata payload as a typed DTO with [codec]. + T metaAs({required PayloadCodec codec}) { + return codec.decode(meta); + } + + /// Decodes the full error metadata payload as a typed DTO with a JSON + /// decoder. + T metaJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(meta); + } + + /// Decodes the full error metadata payload as a typed DTO with a + /// version-aware JSON decoder. + T metaVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(meta); + } + /// Serializes the error metadata to JSON. Map toJson() => { 'type': type, @@ -626,6 +759,38 @@ class DeadLetterEntry { /// Additional metadata captured at failure time. 
final Map meta; + /// Decodes the full metadata payload as a typed DTO with [codec]. + T metaAs({required PayloadCodec codec}) { + return codec.decode(meta); + } + + /// Decodes the full metadata payload as a typed DTO with a JSON decoder. + T metaJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(meta); + } + + /// Decodes the full metadata payload as a typed DTO with a version-aware + /// JSON decoder. + T metaVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(meta); + } + /// Timestamp when the task was dead-lettered. final DateTime deadAt; @@ -819,9 +984,73 @@ class ScheduleEntry { /// Positional arguments to pass to the task. final Map args; + /// Decodes the full args payload as a typed DTO with [codec]. + T argsAs({required PayloadCodec codec}) { + return codec.decode(args); + } + + /// Decodes the full args payload as a typed DTO with a JSON decoder. + T argsJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(args); + } + + /// Decodes the full args payload as a typed DTO with a version-aware JSON + /// decoder. + T argsVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(args); + } + /// Keyword-style arguments passed to the task. final Map kwargs; + /// Decodes the full kwargs payload as a typed DTO with [codec]. 
+ T kwargsAs({required PayloadCodec codec}) { + return codec.decode(kwargs); + } + + /// Decodes the full kwargs payload as a typed DTO with a JSON decoder. + T kwargsJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(kwargs); + } + + /// Decodes the full kwargs payload as a typed DTO with a version-aware JSON + /// decoder. + T kwargsVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(kwargs); + } + /// Whether this schedule entry is enabled. final bool enabled; @@ -867,6 +1096,38 @@ class ScheduleEntry { /// Additional metadata for this schedule entry. final Map meta; + /// Decodes the full metadata payload as a typed DTO with [codec]. + T metaAs({required PayloadCodec codec}) { + return codec.decode(meta); + } + + /// Decodes the full metadata payload as a typed DTO with a JSON decoder. + T metaJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(meta); + } + + /// Decodes the full metadata payload as a typed DTO with a version-aware + /// JSON decoder. + T metaVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(meta); + } + /// Optimistic locking version assigned by the underlying store. final int version; @@ -1644,6 +1905,22 @@ abstract class TaskEnqueuer { Map args, Map headers, TaskOptions options, + DateTime? notBefore, + Map meta, + TaskEnqueueOptions? 
enqueueOptions, + }); + + /// Enqueue a dynamic-name task using a typed value plus optional [codec]. + /// + /// When [codec] is omitted, [value] must already be a string-keyed durable + /// map payload. + Future enqueueValue( + String name, + T value, { + PayloadCodec? codec, + Map headers, + TaskOptions options, + DateTime? notBefore, Map meta, TaskEnqueueOptions? enqueueOptions, }); @@ -1655,6 +1932,35 @@ abstract class TaskEnqueuer { }); } +Map _encodeEnqueuedValue( + String taskName, + T value, { + PayloadCodec? codec, +}) { + final payload = codec == null ? value : codec.encode(value); + if (payload is Map) { + return Map.from(payload); + } + if (payload is Map) { + final normalized = {}; + for (final entry in payload.entries) { + final key = entry.key; + if (key is! String) { + throw StateError( + 'Task payload for $taskName must use string keys, got ' + '${key.runtimeType}.', + ); + } + normalized[key] = entry.value; + } + return normalized; + } + throw StateError( + 'Task payload for $taskName must encode to Map, got ' + '${payload.runtimeType}.', + ); +} + /// Provides ambient metadata for task enqueue operations. /// /// Use [run] to scope workflow or tracing metadata so `Stem.enqueue` can @@ -1680,48 +1986,407 @@ class TaskEnqueueScope { } } +/// Shared input surface for task execution contexts that retain invocation +/// args. +abstract interface class TaskInputContext { + /// Arguments supplied to the current task invocation. + Map get args; +} + +/// Typed read helpers for task invocation args. +extension TaskInputContextArgs on TaskInputContext { + /// Decodes the full task-argument payload through [codec]. + T argsAs({required PayloadCodec codec}) { + return codec.decode(args); + } + + /// Decodes the full task-argument payload as a DTO. + T argsJson({ + required T Function(Map payload) decode, + String? 
typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(args); + } + + /// Decodes the full task-argument payload as a version-aware DTO. + T argsVersionedJson({ + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion ?? defaultVersion, + typeName: typeName, + ).decode(args); + } + + /// Returns the decoded task arg for [key], or `null`. + T? arg(String key, {PayloadCodec? codec}) { + return args.value(key, codec: codec); + } + + /// Returns the decoded task arg for [key], or [fallback]. + T argOr(String key, T fallback, {PayloadCodec? codec}) { + return args.valueOr(key, fallback, codec: codec); + } + + /// Returns the decoded task arg for [key], throwing when absent. + T requiredArg(String key, {PayloadCodec? codec}) { + return args.requiredValue(key, codec: codec); + } + + /// Returns the decoded task arg DTO for [key], or `null`. + T? argJson( + String key, { + required T Function(Map payload) decode, + String? typeName, + }) { + return args.valueJson( + key, + decode: decode, + typeName: typeName, + ); + } + + /// Returns the decoded task arg DTO for [key], or [fallback]. + T argJsonOr( + String key, + T fallback, { + required T Function(Map payload) decode, + String? typeName, + }) { + return args.valueJsonOr( + key, + fallback, + decode: decode, + typeName: typeName, + ); + } + + /// Returns the decoded task arg DTO for [key], throwing when absent. + T requiredArgJson( + String key, { + required T Function(Map payload) decode, + String? typeName, + }) { + return args.requiredValueJson( + key, + decode: decode, + typeName: typeName, + ); + } + + /// Returns the decoded version-aware task arg DTO for [key], or `null`. + T? 
argVersionedJson( + String key, { + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + return args.valueVersionedJson( + key, + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + } + + /// Returns the decoded version-aware task arg DTO for [key], or [fallback]. + T argVersionedJsonOr( + String key, + T fallback, { + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + return args.valueVersionedJsonOr( + key, + fallback, + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + } + + /// Returns the decoded version-aware task arg DTO for [key], throwing when + /// absent. + T requiredArgVersionedJson( + String key, { + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + return args.requiredValueVersionedJson( + key, + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + } + + /// Returns the decoded task arg DTO list for [key], or `null`. + List? argListJson( + String key, { + required T Function(Map payload) decode, + String? typeName, + }) { + return args.valueListJson( + key, + decode: decode, + typeName: typeName, + ); + } + + /// Returns the decoded task arg DTO list for [key], or [fallback]. + List argListJsonOr( + String key, + List fallback, { + required T Function(Map payload) decode, + String? typeName, + }) { + return args.valueListJsonOr( + key, + fallback, + decode: decode, + typeName: typeName, + ); + } + + /// Returns the decoded task arg DTO list for [key], throwing when absent. + List requiredArgListJson( + String key, { + required T Function(Map payload) decode, + String? 
typeName, + }) { + return args.requiredValueListJson( + key, + decode: decode, + typeName: typeName, + ); + } + + /// Returns the decoded version-aware task arg DTO list for [key], or `null`. + List? argListVersionedJson( + String key, { + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + return args.valueListVersionedJson( + key, + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + } + + /// Returns the decoded version-aware task arg DTO list for [key], or + /// [fallback]. + List argListVersionedJsonOr( + String key, + List fallback, { + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + return args.valueListVersionedJsonOr( + key, + fallback, + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + } + + /// Returns the decoded version-aware task arg DTO list for [key], throwing + /// when absent. + List requiredArgListVersionedJson( + String key, { + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + return args.requiredValueListVersionedJson( + key, + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + } +} + +/// Shared execution surface for task handlers and isolate entrypoints. +abstract interface class TaskExecutionContext + implements + TaskEnqueuer, + WorkflowCaller, + WorkflowEventEmitter, + TaskInputContext { + /// The unique identifier of the task. + String get id; + + /// The current attempt number. + int get attempt; + + /// Headers associated with the task. + Map get headers; + + /// Metadata for the task invocation. 
+ Map get meta; + + /// Notify the worker that the task is still running. + void heartbeat(); + + /// Request an extension of the current lease by [by]. + Future extendLease(Duration by); + + /// Report progress back to the worker. + Future progress(double percentComplete, {Map? data}); + + /// Request a retry of the current task. + Future retry({ + Duration? countdown, + DateTime? eta, + TaskRetryPolicy? retryPolicy, + int? maxRetries, + Duration? timeLimit, + Duration? softTimeLimit, + }); + + /// Alias for [enqueue] when spawning follow-up work from the current task. + Future spawn( + String name, { + Map args, + Map headers, + TaskOptions options, + DateTime? notBefore, + Map meta, + TaskEnqueueOptions? enqueueOptions, + }); +} + +/// Shared task-progress helpers for execution contexts. +extension TaskExecutionContextProgressX on TaskExecutionContext { + /// Report progress with a JSON-serializable DTO payload. + Future progressJson( + double percentComplete, + T value, { + String? typeName, + }) { + return progress( + percentComplete, + data: Map.from( + PayloadCodec.encodeJsonMap(value, typeName: typeName), + ), + ); + } + + /// Report progress with a versioned JSON-serializable DTO payload. + Future progressVersionedJson( + double percentComplete, + T value, { + required int version, + String? typeName, + }) { + return progress( + percentComplete, + data: Map.from( + PayloadCodec.encodeVersionedJsonMap( + value, + version: version, + typeName: typeName, + ), + ), + ); + } +} + /// Context passed to handler implementations during execution. -class TaskContext implements TaskEnqueuer { +class TaskContext implements TaskExecutionContext { /// Creates a task execution context for a handler invocation. 
TaskContext({ required this.id, required this.attempt, required this.headers, required this.meta, - required this.heartbeat, - required this.extendLease, - required this.progress, + required void Function() heartbeat, + required Future Function(Duration) extendLease, + required Future Function( + double percentComplete, { + Map? data, + }) + progress, + this.args = const {}, this.enqueuer, - }); + this.workflows, + this.workflowEvents, + }) : _heartbeat = heartbeat, + _extendLease = extendLease, + _progress = progress; /// The unique identifier of the task. + @override final String id; + @override + final Map args; + /// The current attempt number. + @override final int attempt; /// Headers associated with the task. + @override final Map headers; /// Metadata for the task. + @override final Map meta; - - /// Function to send a heartbeat. - final void Function() heartbeat; - - /// Function to extend the lease by a given duration. - final Future Function(Duration) extendLease; - - /// Function to report progress. + final void Function() _heartbeat; + final Future Function(Duration) _extendLease; final Future Function( double percentComplete, { Map? data, }) - progress; + _progress; /// Optional enqueuer for scheduling additional tasks. final TaskEnqueuer? enqueuer; + /// Optional workflow caller for starting child workflows. + final WorkflowCaller? workflows; + + /// Optional workflow event emitter for resuming waiting workflows. + final WorkflowEventEmitter? workflowEvents; + + @override + void heartbeat() => _heartbeat(); + + @override + Future extendLease(Duration by) => _extendLease(by); + + @override + Future progress(double percentComplete, {Map? data}) => + _progress(percentComplete, data: data); + /// Enqueue a task with default context propagation. 
   ///
   /// Headers and metadata from this context are merged into the enqueue
@@ -1734,6 +2399,7 @@ class TaskContext implements TaskEnqueuer {
     Map<String, String> headers = const {},
     Map<String, Object?> meta = const {},
     TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
     TaskEnqueueOptions? enqueueOptions,
   }) async {
     final delegate = enqueuer;
@@ -1761,11 +2427,34 @@ class TaskContext implements TaskEnqueuer {
       args: args,
       headers: mergedHeaders,
       options: options,
+      notBefore: notBefore,
       meta: mergedMeta,
       enqueueOptions: enqueueOptions,
     );
   }

+  @override
+  Future<String> enqueueValue<T>(
+    String name,
+    T value, {
+    PayloadCodec<T>? codec,
+    Map<String, String> headers = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    Map<String, Object?> meta = const {},
+    TaskEnqueueOptions? enqueueOptions,
+  }) {
+    return enqueue(
+      name,
+      args: _encodeEnqueuedValue(name, value, codec: codec),
+      headers: headers,
+      options: options,
+      notBefore: notBefore,
+      meta: meta,
+      enqueueOptions: enqueueOptions,
+    );
+  }
+
   /// Enqueue a typed call with default context propagation.
   ///
   /// This merges headers/meta from the task call and applies lineage metadata
@@ -1796,9 +2485,13 @@ class TaskContext implements TaskEnqueuer {
       mergedMeta.putIfAbsent('stem.rootTaskId', () => id);
     }

-    final mergedCall = call.copyWith(
+    final mergedCall = call.definition.buildCall(
+      call.args,
       headers: Map<String, String>.unmodifiable(mergedHeaders),
+      options: call.options,
+      notBefore: call.notBefore,
       meta: Map<String, Object?>.unmodifiable(mergedMeta),
+      enqueueOptions: call.enqueueOptions,
     );

     return delegate.enqueueCall(
@@ -1807,13 +2500,89 @@ class TaskContext implements TaskEnqueuer {
   }

+  @override
+  Future<String> startWorkflowRef<TParams, TResult>(
+    WorkflowRef<TParams, TResult> definition,
+    TParams params, {
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+  }) {
+    final delegate = workflows;
+    if (delegate == null) {
+      throw StateError('TaskContext has no workflow caller configured');
+    }
+    return delegate.startWorkflowRef(
+      definition,
+      params,
+      parentRunId: parentRunId,
+      ttl: ttl,
+      cancellationPolicy: cancellationPolicy,
+    );
+  }
+
+  @override
+  Future<String> startWorkflowCall<TParams, TResult>(
+    WorkflowStartCall<TParams, TResult> call,
+  ) {
+    final delegate = workflows;
+    if (delegate == null) {
+      throw StateError('TaskContext has no workflow caller configured');
+    }
+    return delegate.startWorkflowCall(call);
+  }
+
+  @override
+  Future<TResult?> waitForWorkflowRef<TParams, TResult>(
+    String runId,
+    WorkflowRef<TParams, TResult> definition, {
+    Duration pollInterval = const Duration(milliseconds: 100),
+    Duration? timeout,
+  }) {
+    final delegate = workflows;
+    if (delegate == null) {
+      throw StateError('TaskContext has no workflow caller configured');
+    }
+    return delegate.waitForWorkflowRef(
+      runId,
+      definition,
+      pollInterval: pollInterval,
+      timeout: timeout,
+    );
+  }
+
+  @override
+  Future<void> emitValue<T>(
+    String topic,
+    T value, {
+    PayloadCodec<T>? codec,
+  }) {
+    final delegate = workflowEvents;
+    if (delegate == null) {
+      throw StateError('TaskContext has no workflow event emitter configured');
+    }
+    return delegate.emitValue(topic, value, codec: codec);
+  }
+
+  @override
+  Future<void> emitEvent<T>(WorkflowEventRef<T> event, T value) {
+    final delegate = workflowEvents;
+    if (delegate == null) {
+      throw StateError('TaskContext has no workflow event emitter configured');
+    }
+    return delegate.emitEvent(event, value);
+  }
+
   /// Alias for [enqueue].
+  @override
   Future<String> spawn(
     String name, {
     Map<String, Object?> args = const {},
     Map<String, String> headers = const {},
     Map<String, Object?> meta = const {},
     TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
     TaskEnqueueOptions? enqueueOptions,
   }) {
     return enqueue(
@@ -1822,6 +2591,7 @@ class TaskContext implements TaskEnqueuer {
       headers: headers,
       meta: meta,
       options: options,
+      notBefore: notBefore,
       enqueueOptions: enqueueOptions,
     );
   }
@@ -1831,6 +2601,7 @@ class TaskContext implements TaskEnqueuer {
   /// Throws a [TaskRetryRequest] which is intercepted by the worker to
   /// schedule the retry. Override retry policies/time limits per invocation
   /// by passing the optional parameters.
+  @override
   Future<Never> retry({
     Duration? countdown,
     DateTime? eta,
@@ -1922,10 +2693,6 @@ class InMemoryTaskRegistry implements TaskRegistry {
   Stream get onRegister => _registerController.stream;
 }

-/// Backwards-compatible alias for the default in-memory registry.
-@Deprecated('Use InMemoryTaskRegistry instead.')
-typedef SimpleTaskRegistry = InMemoryTaskRegistry;
-
 /// Optional task metadata for documentation and tooling.
 class TaskMetadata {
   /// Creates task metadata for documentation and tooling.
@@ -2000,6 +2767,364 @@ class TaskDefinition {
   }) : _encodeArgs = encodeArgs,
        _encodeMeta = encodeMeta;

+  /// Creates a typed task definition backed by payload codecs.
+  factory TaskDefinition.codec({
+    required String name,
+    required PayloadCodec<TArgs> argsCodec,
+    TaskMetaBuilder? encodeMeta,
+    TaskOptions defaultOptions = const TaskOptions(),
+    TaskMetadata metadata = const TaskMetadata(),
+    PayloadCodec<TResult>? resultCodec,
+  }) {
+    return TaskDefinition(
+      name: name,
+      encodeArgs: (args) => _encodeCodecArgs(name, argsCodec, args),
+      encodeMeta: encodeMeta,
+      defaultOptions: defaultOptions,
+      metadata: _metadataWithResultCodec(name, metadata, resultCodec),
+      decodeResult: resultCodec?.decode,
+    );
+  }
+
+  /// Creates a typed task definition for DTO args that already expose
+  /// `toJson()`.
+  factory TaskDefinition.json({
+    required String name,
+    TaskMetaBuilder? encodeMeta,
+    TaskOptions defaultOptions = const TaskOptions(),
+    TaskMetadata metadata = const TaskMetadata(),
+    TResult Function(Map<String, Object?> payload)? decodeResultJson,
+    TResult Function(Map<String, Object?> payload, int version)?
+    decodeResultVersionedJson,
+    int? defaultDecodeVersion,
+    String? argsTypeName,
+    String? resultTypeName,
+  }) {
+    assert(
+      decodeResultJson == null || decodeResultVersionedJson == null,
+      'Specify either decodeResultJson or decodeResultVersionedJson, not both.',
+    );
+    final resultCodec =
+        decodeResultVersionedJson != null
+            ? PayloadCodec<TResult>.versionedJson(
+                version: defaultDecodeVersion ?? 1,
+                decode: decodeResultVersionedJson,
+                defaultDecodeVersion: defaultDecodeVersion,
+                typeName: resultTypeName ?? '$TResult',
+              )
+            : (decodeResultJson == null
+                ? null
+                : PayloadCodec<TResult>.json(
+                    decode: decodeResultJson,
+                    typeName: resultTypeName ?? '$TResult',
+                  ));
+    return TaskDefinition(
+      name: name,
+      encodeArgs: (args) => _encodeJsonArgs(args, argsTypeName ?? '$TArgs'),
+      encodeMeta: encodeMeta,
+      defaultOptions: defaultOptions,
+      metadata: _metadataWithResultCodec(name, metadata, resultCodec),
+      decodeResult: resultCodec?.decode,
+    );
+  }
+
+  /// Creates a typed task definition for DTO args that already expose
+  /// `toJson()` and persist a schema [version] beside the payload.
+  factory TaskDefinition.versionedJson({
+    required String name,
+    required int version,
+    TaskMetaBuilder? encodeMeta,
+    TaskOptions defaultOptions = const TaskOptions(),
+    TaskMetadata metadata = const TaskMetadata(),
+    TResult Function(Map<String, Object?> payload)? decodeResultJson,
+    TResult Function(Map<String, Object?> payload, int version)?
+    decodeResultVersionedJson,
+    int? defaultDecodeVersion,
+    String? argsTypeName,
+    String? resultTypeName,
+  }) {
+    assert(
+      decodeResultJson == null || decodeResultVersionedJson == null,
+      'Specify either decodeResultJson or decodeResultVersionedJson, not both.',
+    );
+    final resultCodec =
+        decodeResultVersionedJson != null
+            ? PayloadCodec<TResult>.versionedJson(
+                version: version,
+                decode: decodeResultVersionedJson,
+                defaultDecodeVersion: defaultDecodeVersion,
+                typeName: resultTypeName ?? '$TResult',
+              )
+            : (decodeResultJson == null
+                ? null
+                : PayloadCodec<TResult>.json(
+                    decode: decodeResultJson,
+                    typeName: resultTypeName ?? '$TResult',
+                  ));
+    return TaskDefinition(
+      name: name,
+      encodeArgs: (args) => _encodeVersionedJsonArgs(
+        args,
+        version: version,
+        typeName: argsTypeName ?? '$TArgs',
+      ),
+      encodeMeta: encodeMeta,
+      defaultOptions: defaultOptions,
+      metadata: _metadataWithResultCodec(name, metadata, resultCodec),
+      decodeResult: resultCodec?.decode,
+    );
+  }
+
+  /// Creates a typed task definition for DTO args that already expose
+  /// `toJson()` and decode versioned results through a reusable registry.
+  factory TaskDefinition.versionedJsonRegistry({
+    required String name,
+    required int version,
+    required PayloadVersionRegistry<TResult> resultRegistry,
+    TaskMetaBuilder? encodeMeta,
+    TaskOptions defaultOptions = const TaskOptions(),
+    TaskMetadata metadata = const TaskMetadata(),
+    int? defaultDecodeVersion,
+    String? argsTypeName,
+    String? resultTypeName,
+  }) {
+    return TaskDefinition(
+      name: name,
+      encodeArgs: (args) => _encodeVersionedJsonArgs(
+        args,
+        version: version,
+        typeName: argsTypeName ?? '$TArgs',
+      ),
+      encodeMeta: encodeMeta,
+      defaultOptions: defaultOptions,
+      metadata: _metadataWithResultCodec(
+        name,
+        metadata,
+        PayloadCodec<TResult>.versionedJsonRegistry(
+          version: version,
+          registry: resultRegistry,
+          defaultDecodeVersion: defaultDecodeVersion,
+          typeName: resultTypeName ?? '$TResult',
+        ),
+      ),
+      decodeResult: PayloadCodec<TResult>.versionedJsonRegistry(
+        version: version,
+        registry: resultRegistry,
+        defaultDecodeVersion: defaultDecodeVersion,
+        typeName: resultTypeName ?? '$TResult',
+      ).decode,
+    );
+  }
+
+  /// Creates a typed task definition for custom map args that persist a schema
+  /// [version] beside the payload.
+  factory TaskDefinition.versionedMap({
+    required String name,
+    required Object? Function(TArgs args) encodeArgs,
+    required int version,
+    TaskMetaBuilder? encodeMeta,
+    TaskOptions defaultOptions = const TaskOptions(),
+    TaskMetadata metadata = const TaskMetadata(),
+    TResult Function(Map<String, Object?> payload)? decodeResultJson,
+    TResult Function(Map<String, Object?> payload, int version)?
+    decodeResultVersionedJson,
+    int? defaultDecodeVersion,
+    String? argsTypeName,
+    String? resultTypeName,
+  }) {
+    assert(
+      decodeResultJson == null || decodeResultVersionedJson == null,
+      'Specify either decodeResultJson or decodeResultVersionedJson, not both.',
+    );
+    final argsCodec = PayloadCodec<TArgs>.versionedMap(
+      encode: encodeArgs,
+      version: version,
+      decode: (payload, _) => throw UnsupportedError(
+        'TaskDefinition.versionedMap($name) only uses the args codec for '
+        'encoding. Decoding is not supported at the definition layer.',
+      ),
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: argsTypeName ?? '$TArgs',
+    );
+    final resultCodec =
+        decodeResultVersionedJson != null
+            ? PayloadCodec<TResult>.versionedJson(
+                version: version,
+                decode: decodeResultVersionedJson,
+                defaultDecodeVersion: defaultDecodeVersion,
+                typeName: resultTypeName ?? '$TResult',
+              )
+            : (decodeResultJson == null
+                ? null
+                : PayloadCodec<TResult>.json(
+                    decode: decodeResultJson,
+                    typeName: resultTypeName ?? '$TResult',
+                  ));
+    return TaskDefinition.codec(
+      name: name,
+      argsCodec: argsCodec,
+      encodeMeta: encodeMeta,
+      defaultOptions: defaultOptions,
+      metadata: metadata,
+      resultCodec: resultCodec,
+    );
+  }
+
+  /// Creates a typed task definition for custom map args that persist a schema
+  /// [version] and decode versioned results through a reusable registry.
+  factory TaskDefinition.versionedMapRegistry({
+    required String name,
+    required Object? Function(TArgs args) encodeArgs,
+    required int version,
+    required PayloadVersionRegistry<TResult> resultRegistry,
+    TaskMetaBuilder? encodeMeta,
+    TaskOptions defaultOptions = const TaskOptions(),
+    TaskMetadata metadata = const TaskMetadata(),
+    int? defaultDecodeVersion,
+    String? argsTypeName,
+    String? resultTypeName,
+  }) {
+    final argsCodec = PayloadCodec<TArgs>.versionedMap(
+      encode: encodeArgs,
+      version: version,
+      decode: (payload, _) => throw UnsupportedError(
+        'TaskDefinition.versionedMapRegistry($name) only uses the args codec '
+        'for encoding. Decoding is not supported at the definition layer.',
+      ),
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: argsTypeName ?? '$TArgs',
+    );
+    final resultCodec = PayloadCodec<TResult>.versionedJsonRegistry(
+      version: version,
+      registry: resultRegistry,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: resultTypeName ?? '$TResult',
+    );
+    return TaskDefinition.codec(
+      name: name,
+      argsCodec: argsCodec,
+      encodeMeta: encodeMeta,
+      defaultOptions: defaultOptions,
+      metadata: metadata,
+      resultCodec: resultCodec,
+    );
+  }
+
+  /// Creates a typed task definition for handlers with no producer args.
+  static NoArgsTaskDefinition<TResult> noArgsCodec<TResult>({
+    required String name,
+    required PayloadCodec<TResult> resultCodec,
+    TaskOptions defaultOptions = const TaskOptions(),
+    TaskMetadata metadata = const TaskMetadata(),
+  }) {
+    return noArgs(
+      name: name,
+      defaultOptions: defaultOptions,
+      metadata: metadata,
+      resultCodec: resultCodec,
+    );
+  }
+
+  /// Creates a typed task definition for handlers with no producer args.
+  static NoArgsTaskDefinition<TResult> noArgsJson<TResult>({
+    required String name,
+    required TResult Function(Map<String, Object?> payload) decodeResult,
+    TaskOptions defaultOptions = const TaskOptions(),
+    TaskMetadata metadata = const TaskMetadata(),
+    String? resultTypeName,
+  }) {
+    return noArgs(
+      name: name,
+      defaultOptions: defaultOptions,
+      metadata: metadata,
+      decodeResultJson: decodeResult,
+      resultTypeName: resultTypeName,
+    );
+  }
+
+  /// Creates a typed task definition for handlers with no producer args whose
+  /// result is a versioned DTO-backed JSON value.
+  static NoArgsTaskDefinition<TResult> noArgsVersionedJson<TResult>({
+    required String name,
+    required int version,
+    required TResult Function(Map<String, Object?> payload, int version)
+    decodeResult,
+    TaskOptions defaultOptions = const TaskOptions(),
+    TaskMetadata metadata = const TaskMetadata(),
+    int? defaultDecodeVersion,
+    String? resultTypeName,
+  }) {
+    return noArgs(
+      name: name,
+      defaultOptions: defaultOptions,
+      metadata: metadata,
+      resultCodec: PayloadCodec.versionedJson(
+        version: version,
+        decode: decodeResult,
+        defaultDecodeVersion: defaultDecodeVersion,
+        typeName: resultTypeName ?? '$TResult',
+      ),
+    );
+  }
+
+  /// Creates a typed task definition for handlers with no producer args whose
+  /// result uses a reusable version registry.
+  static NoArgsTaskDefinition<TResult> noArgsVersionedJsonRegistry<TResult>({
+    required String name,
+    required int version,
+    required PayloadVersionRegistry<TResult> resultRegistry,
+    TaskOptions defaultOptions = const TaskOptions(),
+    TaskMetadata metadata = const TaskMetadata(),
+    int? defaultDecodeVersion,
+    String? resultTypeName,
+  }) {
+    return noArgs(
+      name: name,
+      defaultOptions: defaultOptions,
+      metadata: metadata,
+      resultCodec: PayloadCodec.versionedJsonRegistry(
+        version: version,
+        registry: resultRegistry,
+        defaultDecodeVersion: defaultDecodeVersion,
+        typeName: resultTypeName ?? '$TResult',
+      ),
+    );
+  }
+
+  /// Creates a typed task definition for handlers with no producer args.
+  static NoArgsTaskDefinition<TResult> noArgs<TResult>({
+    required String name,
+    TaskOptions defaultOptions = const TaskOptions(),
+    TaskMetadata metadata = const TaskMetadata(),
+    TaskResultDecoder? decodeResult,
+    PayloadCodec<TResult>? resultCodec,
+    TResult Function(Map<String, Object?> payload)? decodeResultJson,
+    String? resultTypeName,
+  }) {
+    assert(
+      resultCodec == null || decodeResultJson == null,
+      'Specify either resultCodec or decodeResultJson, not both.',
+    );
+    final resolvedResultCodec =
+        resultCodec ??
+        (decodeResultJson == null
+            ? null
+            : PayloadCodec<TResult>.json(
+                decode: decodeResultJson,
+                typeName: resultTypeName ?? '$TResult',
+              ));
+    return NoArgsTaskDefinition(
+      name: name,
+      defaultOptions: defaultOptions,
+      metadata: TaskDefinition._metadataWithResultCodec(
+        name,
+        metadata,
+        resolvedResultCodec,
+      ),
+      decodeResult: decodeResult ?? resolvedResultCodec?.decode,
+    );
+  }
+
   /// The logical task name registered in the registry.
   final String name;
@@ -2015,8 +3140,60 @@ class TaskDefinition {
   final TaskArgsEncoder _encodeArgs;
   final TaskMetaBuilder? _encodeMeta;

-  /// Build a typed call which can be passed to `Stem.enqueueCall`.
-  TaskCall<TArgs, TResult> call(
+  static Map<String, Object?> _encodeCodecArgs<T>(
+    String taskName,
+    PayloadCodec<T> codec,
+    T args,
+  ) {
+    return _encodeEnqueuedValue(taskName, args, codec: codec);
+  }
+
+  static Map<String, Object?> _encodeJsonArgs<T>(T args, String typeName) {
+    final payload = PayloadCodec.encodeJsonMap(
+      args,
+      typeName: typeName,
+    );
+    return Map<String, Object?>.from(payload);
+  }
+
+  static Map<String, Object?> _encodeVersionedJsonArgs<T>(
+    T args, {
+    required int version,
+    required String typeName,
+  }) {
+    final payload = PayloadCodec.encodeVersionedJsonMap(
+      args,
+      version: version,
+      typeName: typeName,
+    );
+    return Map<String, Object?>.from(payload);
+  }
+
+  static TaskMetadata _metadataWithResultCodec(
+    String taskName,
+    TaskMetadata metadata,
+    PayloadCodec? resultCodec,
+  ) {
+    if (resultCodec == null) {
+      return metadata;
+    }
+    return TaskMetadata(
+      description: metadata.description,
+      tags: metadata.tags,
+      idempotent: metadata.idempotent,
+      attributes: metadata.attributes,
+      argsEncoder: metadata.argsEncoder,
+      resultEncoder:
+          metadata.resultEncoder ??
+          CodecTaskPayloadEncoder(
+            idValue: '$taskName.result.codec',
+            codec: resultCodec,
+          ),
+    );
+  }
+
+  /// Builds an explicit [TaskCall] from this definition and [args].
+  TaskCall<TArgs, TResult> buildCall(
     TArgs args, {
     Map<String, String> headers = const {},
     TaskOptions? options,
@@ -2058,6 +3235,42 @@ class TaskDefinition {
   }
 }

+/// Typed producer-facing definition for tasks that take no input args.
+class NoArgsTaskDefinition<TResult> {
+  /// Creates a typed task definition for handlers with no producer args.
+  const NoArgsTaskDefinition({
+    required this.name,
+    this.defaultOptions = const TaskOptions(),
+    this.metadata = const TaskMetadata(),
+    this.decodeResult,
+  });
+
+  /// The logical task name registered in the registry.
+  final String name;
+
+  /// Default options applied to every call unless overridden.
+  final TaskOptions defaultOptions;
+
+  /// Metadata associated with this task for documentation/tooling.
+  final TaskMetadata metadata;
+
+  /// Optional decoder for converting persisted payloads into a typed result.
+  final TaskResultDecoder? decodeResult;
+
+  /// The underlying task definition for generic enqueue/wait surfaces.
+  TaskDefinition<(), TResult> get asDefinition => TaskDefinition<(), TResult>(
+    name: name,
+    encodeArgs: (_) => const {},
+    defaultOptions: defaultOptions,
+    metadata: metadata,
+    decodeResult: decodeResult,
+  );
+
+  /// Decodes a persisted payload into a typed result.
+  TResult? decode(Object? payload) => asDefinition.decode(payload);
+}
+
 /// Represents a pending enqueue operation built from a [TaskDefinition].
 class TaskCall<TArgs, TResult> {
   const TaskCall._({
@@ -2100,132 +3313,68 @@ class TaskCall<TArgs, TResult> {
   /// Resolve final options combining call overrides with defaults.
   TaskOptions resolveOptions() => options ?? definition.defaultOptions;

-  /// Returns a copy of this call with updated properties.
-  TaskCall<TArgs, TResult> copyWith({
-    Map<String, String>? headers,
-    TaskOptions? options,
+}
+
+/// Convenience helpers for building typed enqueue requests directly from a task
+/// enqueuer.
+extension TaskEnqueuerBuilderExtension on TaskEnqueuer {
+  /// Enqueues a name-based task from a DTO that already exposes `toJson()`.
+  Future<String> enqueueJson<T>(
+    String name,
+    T argsJson, {
+    Map<String, String> headers = const {},
+    TaskOptions options = const TaskOptions(),
     DateTime? notBefore,
-    Map<String, Object?>? meta,
+    Map<String, Object?> meta = const {},
     TaskEnqueueOptions? enqueueOptions,
+    String? typeName,
   }) {
-    return TaskCall._(
-      definition: definition,
-      args: args,
-      headers: headers ?? this.headers,
-      options: options ?? this.options,
-      notBefore: notBefore ?? this.notBefore,
-      meta: meta ?? this.meta,
-      enqueueOptions: enqueueOptions ?? this.enqueueOptions,
+    return enqueue(
+      name,
+      args: Map<String, Object?>.from(
+        PayloadCodec.encodeJsonMap(
+          argsJson,
+          typeName: typeName ?? '$T',
+        ),
+      ),
+      headers: headers,
+      options: options,
+      notBefore: notBefore,
+      meta: meta,
+      enqueueOptions: enqueueOptions,
     );
   }
-}
-
-/// Fluent builder used to construct rich enqueue requests.
-///
-/// Build a [TaskCall] and dispatch it via `TaskEnqueuer.enqueueCall`.
-class TaskEnqueueBuilder<TArgs, TResult> {
-  /// Creates a fluent builder for enqueue calls.
-  TaskEnqueueBuilder({required this.definition, required this.args});
-
-  /// Task definition used to construct the call.
-  final TaskDefinition<TArgs, TResult> definition;
-
-  /// Typed arguments for the task invocation.
-  final TArgs args;
-
-  Map<String, String>? _headers;
-  TaskOptions? _optionsOverride;
-  DateTime? _notBefore;
-  Map<String, Object?>? _meta;
-  TaskEnqueueOptions? _enqueueOptions;
-
-  /// Replaces headers entirely.
-  TaskEnqueueBuilder<TArgs, TResult> headers(Map<String, String> headers) {
-    _headers = Map<String, String>.from(headers);
-    return this;
-  }
-
-  /// Adds or overrides a single header entry.
-  TaskEnqueueBuilder<TArgs, TResult> header(String key, String value) {
-    final current = Map<String, String>.from(_headers ?? const {});
-    current[key] = value;
-    _headers = current;
-    return this;
-  }
-
-  /// Replaces metadata entirely.
-  TaskEnqueueBuilder<TArgs, TResult> metadata(Map<String, Object?> meta) {
-    _meta = Map<String, Object?>.from(meta);
-    return this;
-  }
-
-  /// Adds or overrides a metadata entry.
-  TaskEnqueueBuilder<TArgs, TResult> meta(String key, Object? value) {
-    final current = Map<String, Object?>.from(_meta ?? const {});
-    current[key] = value;
-    _meta = current;
-    return this;
-  }
-
-  /// Replaces the options for this call.
-  TaskEnqueueBuilder<TArgs, TResult> options(TaskOptions options) {
-    _optionsOverride = options;
-    return this;
-  }
-
-  /// Sets the queue for this enqueue.
-  TaskEnqueueBuilder<TArgs, TResult> queue(String queue) {
-    final base = _optionsOverride ?? definition.defaultOptions;
-    _optionsOverride = base.copyWith(queue: queue);
-    return this;
-  }
-
-  /// Sets the priority for this enqueue.
-  TaskEnqueueBuilder<TArgs, TResult> priority(int priority) {
-    final base = _optionsOverride ?? definition.defaultOptions;
-    _optionsOverride = base.copyWith(priority: priority);
-    return this;
-  }
-
-  /// Sets the earliest execution time.
-  TaskEnqueueBuilder<TArgs, TResult> notBefore(DateTime instant) {
-    _notBefore = instant;
-    return this;
-  }
-
-  /// Sets a relative delay before execution.
-  TaskEnqueueBuilder<TArgs, TResult> delay(Duration duration) {
-    _notBefore = stemNow().add(duration);
-    return this;
-  }
-
-  /// Replaces the enqueue options for this call.
-  TaskEnqueueBuilder<TArgs, TResult> enqueueOptions(
-    TaskEnqueueOptions options,
-  ) {
-    _enqueueOptions = options;
-    return this;
-  }
-
-  /// Builds the [TaskCall] with accumulated overrides.
-  TaskCall<TArgs, TResult> build() {
-    final base = definition(args);
-    final mergedHeaders = Map<String, String>.from(base.headers);
-    if (_headers != null) {
-      mergedHeaders.addAll(_headers!);
-    }
-    final mergedMeta = Map<String, Object?>.from(base.meta);
-    if (_meta != null) {
-      mergedMeta.addAll(_meta!);
-    }
-    return base.copyWith(
-      headers: Map<String, String>.unmodifiable(mergedHeaders),
-      options: _optionsOverride ?? base.options,
-      notBefore: _notBefore ?? base.notBefore,
-      meta: Map<String, Object?>.unmodifiable(mergedMeta),
-      enqueueOptions: _enqueueOptions ?? base.enqueueOptions,
+
+  /// Enqueues a name-based task from a DTO and persists a schema [version]
+  /// beside the JSON payload.
+  Future<String> enqueueVersionedJson<T>(
+    String name,
+    T argsJson, {
+    required int version,
+    Map<String, String> headers = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    Map<String, Object?> meta = const {},
+    TaskEnqueueOptions? enqueueOptions,
+    String? typeName,
+  }) {
+    return enqueue(
+      name,
+      args: Map<String, Object?>.from(
+        PayloadCodec.encodeVersionedJsonMap(
+          argsJson,
+          version: version,
+          typeName: typeName ?? '$T',
+        ),
+      ),
+      headers: headers,
+      options: options,
+      notBefore: notBefore,
+      meta: meta,
+      enqueueOptions: enqueueOptions,
     );
   }
+
 }

 /// Retry strategy used to compute the next backoff delay.
@@ -2388,6 +3537,58 @@ class GroupStatus {
   /// Additional metadata for the group.
   final Map<String, Object?> meta;

+  /// Returns the decoded payload value for each collected child result.
+  ///
+  /// When [codec] is supplied, each stored durable payload is decoded through
+  /// that codec before being returned.
+  Map<String, Object?> resultValues({PayloadCodec? codec}) {
+    return Map.unmodifiable({
+      for (final entry in results.entries)
+        entry.key: entry.value.payloadValue(codec: codec),
+    });
+  }
+
+  /// Decodes each collected child result as a typed DTO with [codec].
+  Map<String, T> resultAs<T>({required PayloadCodec<T> codec}) {
+    return Map.unmodifiable({
+      for (final entry in results.entries)
+        entry.key: entry.value.payloadAs(codec: codec),
+    });
+  }
+
+  /// Decodes each collected child result as a typed DTO with a JSON decoder.
+  Map<String, T> resultJson<T>({
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    return Map.unmodifiable({
+      for (final entry in results.entries)
+        entry.key: entry.value.payloadJson(
+          decode: decode,
+          typeName: typeName,
+        ),
+    });
+  }
+
+  /// Decodes each collected child result as a typed DTO with a version-aware
+  /// JSON decoder.
+  Map<String, T> resultVersionedJson<T>({
+    required int version,
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return Map.unmodifiable({
+      for (final entry in results.entries)
+        entry.key: entry.value.payloadVersionedJson(
+          version: version,
+          decode: decode,
+          defaultDecodeVersion: defaultDecodeVersion,
+          typeName: typeName,
+        ),
+    });
+  }
+
   /// The number of completed results.
   int get completed => results.length;
diff --git a/packages/stem/lib/src/core/envelope.dart b/packages/stem/lib/src/core/envelope.dart
index 053396a3..dd12dd4d 100644
--- a/packages/stem/lib/src/core/envelope.dart
+++ b/packages/stem/lib/src/core/envelope.dart
@@ -33,6 +33,7 @@ library;
 import 'dart:convert';

 import 'package:stem/src/core/clock.dart';
+import 'package:stem/src/core/payload_codec.dart';
 import 'package:uuid/uuid.dart';

 /// Target classification for routing operations.
@@ -227,6 +228,38 @@ class Envelope {
   /// Arguments passed to the task handler.
   final Map<String, Object?> args;

+  /// Decodes the full task args payload as a typed DTO with [codec].
+  T argsAs<T>({required PayloadCodec<T> codec}) {
+    return codec.decode(args);
+  }
+
+  /// Decodes the full task args payload as a typed DTO from JSON.
+  T argsJson<T>({
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(args);
+  }
+
+  /// Decodes the full task args payload as a typed DTO from version-aware
+  /// JSON.
+  T argsVersionedJson<T>({
+    required int version,
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(args);
+  }
+
   /// Arbitrary metadata headers (trace id, tenant, etc).
   final Map<String, String> headers;
@@ -254,6 +287,38 @@ class Envelope {
   /// Additional metadata persisted with the message.
   final Map<String, Object?> meta;

+  /// Decodes the full envelope metadata payload as a typed DTO with [codec].
+  T metaAs<T>({required PayloadCodec<T> codec}) {
+    return codec.decode(meta);
+  }
+
+  /// Decodes the full envelope metadata payload as a typed DTO from JSON.
+  T metaJson<T>({
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(meta);
+  }
+
+  /// Decodes the full envelope metadata payload as a typed DTO from
+  /// version-aware JSON.
+  T metaVersionedJson<T>({
+    required int version,
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(meta);
+  }
+
   /// Returns a copy of this envelope with updated fields.
   Envelope copyWith({
     String? id,
diff --git a/packages/stem/lib/src/core/function_task_handler.dart b/packages/stem/lib/src/core/function_task_handler.dart
index 059b68ee..d47c90d0 100644
--- a/packages/stem/lib/src/core/function_task_handler.dart
+++ b/packages/stem/lib/src/core/function_task_handler.dart
@@ -54,6 +54,7 @@ class FunctionTaskHandler<R> implements TaskHandler {
   Future<R> call(TaskContext context, Map<String, Object?> args) async {
     final invocationContext = TaskInvocationContext.local(
       id: context.id,
+      args: args,
       headers: context.headers,
       meta: context.meta,
       attempt: context.attempt,
@@ -61,6 +62,8 @@ class FunctionTaskHandler<R> implements TaskHandler {
       extendLease: context.extendLease,
       progress: (percent, {data}) => context.progress(percent, data: data),
       enqueuer: context.enqueuer,
+      workflows: context.workflows,
+      workflowEvents: context.workflowEvents,
     );
     final result = await _entrypoint(invocationContext, args);
     return result as R;
diff --git a/packages/stem/lib/src/core/payload_codec.dart b/packages/stem/lib/src/core/payload_codec.dart
index 813f6d64..2a3b91b9 100644
--- a/packages/stem/lib/src/core/payload_codec.dart
+++ b/packages/stem/lib/src/core/payload_codec.dart
@@ -1,18 +1,293 @@
+import 'dart:collection';
+
 import 'package:stem/src/core/task_payload_encoder.dart';

+/// Registry of version-specific payload decoders for a single durable DTO type.
+///
+/// Use this when a payload schema evolves over time and you want one reusable
+/// place to define how each stored version should be decoded.
+class PayloadVersionRegistry<T> {
+  /// Creates a version registry from explicit [decoders].
+  const PayloadVersionRegistry({
+    required Map<int, T Function(Map<String, Object?> payload)> decoders,
+    this.defaultVersion,
+  }) : _decoders = decoders;
+
+  final Map<int, T Function(Map<String, Object?> payload)> _decoders;
+
+  /// Fallback version to use when a stored payload does not persist one.
+  final int? defaultVersion;
+
+  /// Registered decoder versions.
+  Map<int, T Function(Map<String, Object?> payload)> get decoders =>
+      UnmodifiableMapView(_decoders);
+
+  /// Decodes [payload] using the decoder registered for [version].
+  T decode(
+    Map<String, Object?> payload,
+    int version, {
+    String typeName = 'payload',
+  }) {
+    final decoder = _decoders[version];
+    if (decoder == null) {
+      final known = _decoders.keys.toList()..sort();
+      throw StateError(
+        '$typeName has no decoder registered for payload version $version. '
+        'Known versions: ${known.join(', ')}.',
+      );
+    }
+    return decoder(payload);
+  }
+}
+
 /// Encodes and decodes a strongly-typed payload value.
 ///
 /// This author-facing codec layer is used by generated workflow/task helpers to
 /// lower richer Dart DTOs into the existing durable wire format.
 class PayloadCodec<T> {
   /// Creates a payload codec from explicit encode/decode callbacks.
-  const PayloadCodec({required this.encode, required this.decode});
+  const PayloadCodec({
+    required Object? Function(T value) encode,
+    required T Function(Object? payload) decode,
+  }) : _encode = encode,
+       _decode = decode,
+       _decodeMap = null,
+       _decodeVersionedMap = null,
+       _jsonVersion = null,
+       _defaultDecodeVersion = null,
+       _typeName = null;
+
+  /// Creates a payload codec for DTOs that serialize to a durable map payload.
+  ///
+  /// Use this when you need a custom map encoder or a decode function that is
+  /// not the usual `Type.fromJson(...)` shape:
+  ///
+  /// ```dart
+  /// const approvalCodec = PayloadCodec.map(
+  ///   encode: (value) => value.toJson(),
+  ///   decode: Approval.fromJson,
+  /// );
+  /// ```
+  const PayloadCodec.map({
+    required Object? Function(T value) encode,
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) : _encode = encode,
+       _decode = null,
+       _decodeMap = decode,
+       _decodeVersionedMap = null,
+       _jsonVersion = null,
+       _defaultDecodeVersion = null,
+       _typeName = typeName;
+
+  /// Creates a payload codec for map-backed DTO payloads that also persist a
+  /// schema [version].
+  ///
+  /// Use this when a payload shape is expected to evolve over time and the
+  /// decoder needs the stored schema version, but the payload still uses a
+  /// custom map encoder or a nonstandard decode shape:
+  ///
+  /// ```dart
+  /// const approvalCodec = PayloadCodec.versionedMap(
+  ///   encode: (value) => {'legacy_status': value.status},
+  ///   version: 2,
+  ///   defaultDecodeVersion: 1,
+  ///   decode: Approval.fromVersionedMap,
+  /// );
+  /// ```
+  const PayloadCodec.versionedMap({
+    required Object? Function(T value) encode,
+    required int version,
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) : _encode = encode,
+       _decode = null,
+       _decodeMap = null,
+       _decodeVersionedMap = decode,
+       _jsonVersion = version,
+       _defaultDecodeVersion = defaultDecodeVersion,
+       _typeName = typeName;
+
+  /// Creates a payload codec for DTOs that expose `toJson()` and a matching
+  /// typed decoder like `Type.fromJson(...)`.
+  ///
+  /// This is the shortest happy path for common DTO payloads:
+  ///
+  /// ```dart
+  /// const approvalCodec = PayloadCodec.json(
+  ///   decode: Approval.fromJson,
+  /// );
+  /// ```
+  const PayloadCodec.json({
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) : _encode = _encodeJsonPayload,
+       _decode = null,
+       _decodeMap = decode,
+       _decodeVersionedMap = null,
+       _jsonVersion = null,
+       _defaultDecodeVersion = null,
+       _typeName = typeName;
+
+  /// Creates a JSON DTO codec that also persists a schema version.
+  ///
+  /// Use this when a payload shape is expected to evolve over time and the
+  /// decoder needs to know which persisted schema version it is reading.
+  ///
+  /// ```dart
+  /// const approvalCodec = PayloadCodec.versionedJson(
+  ///   version: 2,
+  ///   defaultDecodeVersion: 1,
+  ///   decode: Approval.fromVersionedJson,
+  /// );
+  /// ```
+  const PayloadCodec.versionedJson({
+    required int version,
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) : _encode = _encodeJsonPayload,
+       _decode = null,
+       _decodeMap = null,
+       _decodeVersionedMap = decode,
+       _jsonVersion = version,
+       _defaultDecodeVersion = defaultDecodeVersion,
+       _typeName = typeName;
+
+  /// Creates a JSON DTO codec backed by a reusable version registry.
+  ///
+  /// This keeps payload version evolution in one place instead of repeating the
+  /// same `switch(version)` logic across task, workflow, and event surfaces.
+  factory PayloadCodec.versionedJsonRegistry({
+    required int version,
+    required PayloadVersionRegistry<T> registry,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return PayloadCodec.versionedJson(
+      version: version,
+      defaultDecodeVersion: defaultDecodeVersion ?? registry.defaultVersion,
+      decode: (payload, storedVersion) => registry.decode(
+        payload,
+        storedVersion,
+        typeName: typeName ?? '$T',
+      ),
+      typeName: typeName,
+    );
+  }
+
+  /// Creates a custom map-backed codec backed by a reusable version registry.
+  factory PayloadCodec.versionedMapRegistry({
+    required Object? Function(T value) encode,
+    required int version,
+    required PayloadVersionRegistry<T> registry,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return PayloadCodec.versionedMap(
+      encode: encode,
+      version: version,
+      defaultDecodeVersion: defaultDecodeVersion ?? registry.defaultVersion,
+      decode: (payload, storedVersion) => registry.decode(
+        payload,
+        storedVersion,
+        typeName: typeName ?? '$T',
+      ),
+      typeName: typeName,
+    );
+  }
+
+  /// Reserved key used to persist payload schema versions for versioned codecs.
+  static const String versionKey = '__stemPayloadVersion';
+
+  final Object? Function(T value) _encode;
+  final T Function(Object? payload)? _decode;
+  final T Function(Map<String, Object?> payload)? _decodeMap;
+  final T Function(Map<String, Object?> payload, int version)?
+  _decodeVersionedMap;
+  final int? _jsonVersion;
+  final int? _defaultDecodeVersion;
+  final String? _typeName;
+
+  /// Encodes a DTO to the string-keyed map shape required by task/workflow
+  /// argument payloads.
+  static Map<String, Object?> encodeJsonMap<T>(
+    T value, {
+    String? typeName,
+  }) {
+    final payload = _encodeJsonPayload(value);
+    return _payloadJsonMap(payload, typeName ?? value.runtimeType.toString());
+  }
+
+  /// Encodes a DTO to a string-keyed map and persists a schema [version]
+  /// alongside the payload.
+  static Map<String, Object?> encodeVersionedJsonMap<T>(
+    T value, {
+    required int version,
+    String? typeName,
+  }) {
+    return {
+      versionKey: version,
+      ...encodeJsonMap(value, typeName: typeName),
+    };
+  }
+
+  /// Normalizes a durable payload into the string-keyed JSON map shape used by
+  /// DTO-style decoders.
+  static Map<String, Object?> decodeJsonMap(
+    Object? payload, {
+    String typeName = 'payload',
+  }) {
+    return _payloadJsonMap(payload, typeName);
+  }
+
+  /// Reads the persisted schema version from a durable JSON payload.
+  static int readPayloadVersion(
+    Object? payload, {
+    int defaultVersion = 1,
+    String typeName = 'payload',
+  }) {
+    return _payloadVersion(
+      _payloadJsonMap(payload, typeName),
+      defaultVersion: defaultVersion,
+      typeName: typeName,
+    );
+  }
 
   /// Converts a typed value into a durable payload representation.
-  final Object? Function(T value) encode;
+  Object? encode(T value) {
+    final encoded = _encode(value);
+    final version = _jsonVersion;
+    if (version == null) return encoded;
+    final json = _payloadJsonMap(encoded, _typeName ?? '$T');
+    return {
+      versionKey: version,
+      ...json,
+    };
+  }
 
   /// Reconstructs a typed value from a durable payload representation.
-  final T Function(Object? payload) decode;
+  T decode(Object? payload) {
+    final decode = _decode;
+    if (decode != null) {
+      return decode(payload);
+    }
+    final decodeVersionedMap = _decodeVersionedMap;
+    if (decodeVersionedMap != null) {
+      final json = _payloadJsonMap(payload, _typeName ?? '$T');
+      final version = _payloadVersion(
+        json,
+        defaultVersion: _defaultDecodeVersion ?? _jsonVersion ?? 1,
+        typeName: _typeName ?? '$T',
+      );
+      final normalized = Map<String, Object?>.from(json)..remove(versionKey);
+      return decodeVersionedMap(normalized, version);
+    }
+    final decodeMap = _decodeMap!;
+    return decodeMap(_payloadJsonMap(payload, _typeName ?? '$T'));
+  }
 
   /// Converts an erased author-facing value into a durable payload.
   Object? encodeDynamic(Object? value) {
@@ -27,6 +302,62 @@ class PayloadCodec<T> {
   }
 }
 
+Object? _encodeJsonPayload<T>(T value) {
+  try {
+    final payload = (value as dynamic).toJson();
+    return _payloadJsonMap(payload, value.runtimeType.toString());
+    // Dynamic `toJson()` probing is the purpose of this helper.
+    // ignore: avoid_catching_errors
+  } on NoSuchMethodError {
+    throw StateError(
+      '${value.runtimeType} must expose toJson() to use PayloadCodec.json.',
+    );
+  }
+}
+
+Map<String, Object?> _payloadJsonMap(Object? value, String typeName) {
+  if (value is Map<String, Object?>) {
+    return Map<String, Object?>.from(value);
+  }
+  if (value is Map<String, dynamic>) {
+    return Map<String, Object?>.from(value);
+  }
+  if (value is Map) {
+    final result = <String, Object?>{};
+    for (final entry in value.entries) {
+      final key = entry.key;
+      if (key is! String) {
+        throw StateError('$typeName payload must use string keys.');
+      }
+      result[key] = entry.value;
+    }
+    return result;
+  }
+  throw StateError(
+    '$typeName payload must decode to a string-keyed map, got '
+    '${value.runtimeType}.',
+  );
+}
+
+int _payloadVersion(
+  Map<String, Object?> payload, {
+  required int defaultVersion,
+  required String typeName,
+}) {
+  final rawVersion = payload[PayloadCodec.versionKey];
+  if (rawVersion == null) return defaultVersion;
+  if (rawVersion is int) return rawVersion;
+  if (rawVersion is num) return rawVersion.toInt();
+  if (rawVersion is String) {
+    final parsed = int.tryParse(rawVersion);
+    if (parsed != null) return parsed;
+  }
+  throw StateError(
+    '$typeName payload version must be an int-compatible value, got '
+    '${rawVersion.runtimeType}.',
+  );
+}
+
 /// Bridges a [PayloadCodec] into the existing [TaskPayloadEncoder] contract.
 class CodecTaskPayloadEncoder<T> extends TaskPayloadEncoder {
   /// Creates a task payload encoder backed by a typed [codec].
diff --git a/packages/stem/lib/src/core/payload_map.dart b/packages/stem/lib/src/core/payload_map.dart
new file mode 100644
index 00000000..6513c259
--- /dev/null
+++ b/packages/stem/lib/src/core/payload_map.dart
@@ -0,0 +1,277 @@
+import 'package:stem/src/core/payload_codec.dart';
+
+/// Typed read helpers for durable task-argument and workflow-parameter maps.
+extension PayloadMapX on Map<String, Object?> {
+  /// Returns the decoded value for [key], or `null` when the payload is absent.
+  ///
+  /// When [codec] is supplied, the stored durable payload is decoded through
+  /// that codec before being returned.
+  T? value<T>(String key, {PayloadCodec<T>? codec}) {
+    final payload = this[key];
+    if (payload == null) return null;
+    if (codec != null) {
+      return codec.decode(payload);
+    }
+    return payload as T;
+  }
+
+  /// Returns the decoded value for [key], or [fallback] when it is absent.
+  T valueOr<T>(String key, T fallback, {PayloadCodec<T>? codec}) {
+    return value(key, codec: codec) ?? fallback;
+  }
+
+  /// Returns the decoded value for [key], throwing when it is missing.
+  T requiredValue<T>(String key, {PayloadCodec<T>? codec}) {
+    if (!containsKey(key) || this[key] == null) {
+      throw StateError("Missing required payload key '$key'.");
+    }
+    return value(key, codec: codec) as T;
+  }
+
+  /// Decodes the value for [key] as a typed DTO with a JSON decoder.
+  T? valueJson<T>(
+    String key, {
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    final payload = this[key];
+    if (payload == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(payload);
+  }
+
+  /// Decodes the value for [key] as a typed DTO with a version-aware JSON
+  /// decoder.
+  T? valueVersionedJson<T>(
+    String key, {
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final payload = this[key];
+    if (payload == null) return null;
+    return PayloadCodec.versionedJson(
+      version: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion ?? defaultVersion,
+      typeName: typeName,
+    ).decode(payload);
+  }
+
+  /// Decodes the value for [key] as a typed DTO, or [fallback] when absent.
+  T valueJsonOr<T>(
+    String key,
+    T fallback, {
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    return valueJson(
+          key,
+          decode: decode,
+          typeName: typeName,
+        ) ??
+        fallback;
+  }
+
+  /// Decodes the value for [key] as a typed DTO, throwing when absent.
+  T requiredValueJson<T>(
+    String key, {
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    if (!containsKey(key) || this[key] == null) {
+      throw StateError("Missing required payload key '$key'.");
+    }
+    return valueJson(
+      key,
+      decode: decode,
+      typeName: typeName,
+    ) as T;
+  }
+
+  /// Decodes the value for [key] as a version-aware typed DTO, or [fallback]
+  /// when absent.
+  T valueVersionedJsonOr<T>(
+    String key,
+    T fallback, {
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return valueVersionedJson(
+          key,
+          defaultVersion: defaultVersion,
+          decode: decode,
+          defaultDecodeVersion: defaultDecodeVersion,
+          typeName: typeName,
+        ) ??
+        fallback;
+  }
+
+  /// Decodes the value for [key] as a version-aware typed DTO, throwing when
+  /// absent.
+  T requiredValueVersionedJson<T>(
+    String key, {
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    if (!containsKey(key) || this[key] == null) {
+      throw StateError("Missing required payload key '$key'.");
+    }
+    return valueVersionedJson(
+      key,
+      defaultVersion: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ) as T;
+  }
+
+  /// Returns the decoded list value for [key], or `null` when it is absent.
+  ///
+  /// When [codec] is supplied, each stored durable payload is decoded through
+  /// that codec before being returned.
+  List<T>? valueList<T>(String key, {PayloadCodec<T>? codec}) {
+    final payload = this[key];
+    if (payload == null) return null;
+    final values = payload as List;
+    if (codec != null) {
+      return List.unmodifiable(values.map(codec.decode));
+    }
+    return List.unmodifiable(values.cast<T>());
+  }
+
+  /// Returns the decoded list value for [key], or [fallback] when it is
+  /// absent.
+  List<T> valueListOr<T>(
+    String key,
+    List<T> fallback, {
+    PayloadCodec<T>? codec,
+  }) {
+    return valueList(key, codec: codec) ?? fallback;
+  }
+
+  /// Returns the decoded list value for [key], throwing when it is missing.
+  List<T> requiredValueList<T>(String key, {PayloadCodec<T>? codec}) {
+    if (!containsKey(key) || this[key] == null) {
+      throw StateError("Missing required payload key '$key'.");
+    }
+    return valueList(key, codec: codec)!;
+  }
+
+  /// Returns the decoded DTO list value for [key], or `null` when it is
+  /// absent.
+  List<T>? valueListJson<T>(
+    String key, {
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    final payload = this[key];
+    if (payload == null) return null;
+    final values = payload as List;
+    final codec = PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    );
+    return List.unmodifiable(values.map(codec.decode));
+  }
+
+  /// Returns the decoded version-aware DTO list value for [key], or `null`
+  /// when it is absent.
+  List<T>? valueListVersionedJson<T>(
+    String key, {
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final payload = this[key];
+    if (payload == null) return null;
+    final values = payload as List;
+    final codec = PayloadCodec.versionedJson(
+      version: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion ?? defaultVersion,
+      typeName: typeName,
+    );
+    return List.unmodifiable(values.map(codec.decode));
+  }
+
+  /// Returns the decoded DTO list value for [key], or [fallback] when absent.
+  List<T> valueListJsonOr<T>(
+    String key,
+    List<T> fallback, {
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    return valueListJson(
+          key,
+          decode: decode,
+          typeName: typeName,
+        ) ??
+        fallback;
+  }
+
+  /// Returns the decoded DTO list value for [key], throwing when absent.
+  List<T> requiredValueListJson<T>(
+    String key, {
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    if (!containsKey(key) || this[key] == null) {
+      throw StateError("Missing required payload key '$key'.");
+    }
+    return valueListJson(
+      key,
+      decode: decode,
+      typeName: typeName,
+    )!;
+  }
+
+  /// Returns the decoded version-aware DTO list value for [key], or
+  /// [fallback] when absent.
+  List<T> valueListVersionedJsonOr<T>(
+    String key,
+    List<T> fallback, {
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return valueListVersionedJson(
+          key,
+          defaultVersion: defaultVersion,
+          decode: decode,
+          defaultDecodeVersion: defaultDecodeVersion,
+          typeName: typeName,
+        ) ??
+        fallback;
+  }
+
+  /// Returns the decoded version-aware DTO list value for [key], throwing when
+  /// absent.
+  List<T> requiredValueListVersionedJson<T>(
+    String key, {
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    if (!containsKey(key) || this[key] == null) {
+      throw StateError("Missing required payload key '$key'.");
+    }
+    return valueListVersionedJson(
+      key,
+      defaultVersion: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    )!;
+  }
+}
diff --git a/packages/stem/lib/src/core/queue_events.dart b/packages/stem/lib/src/core/queue_events.dart
index f40e0816..1da56dd7 100644
--- a/packages/stem/lib/src/core/queue_events.dart
+++ b/packages/stem/lib/src/core/queue_events.dart
@@ -3,6 +3,8 @@ import 'dart:async';
 import 'package:stem/src/core/clock.dart';
 import 'package:stem/src/core/contracts.dart';
 import 'package:stem/src/core/envelope.dart';
+import 'package:stem/src/core/payload_codec.dart';
+import 'package:stem/src/core/payload_map.dart';
 import 'package:stem/src/core/stem_event.dart';
 
 const String _queueEventEnvelopeName = '__stem.queue.event__';
@@ -44,6 +46,85 @@ class QueueCustomEvent implements StemEvent {
   /// Additional metadata supplied by the publisher.
   final Map<String, Object?> meta;
 
+  /// Decodes the full event metadata payload with [codec].
+  T metaAs<T>({required PayloadCodec<T> codec}) {
+    return codec.decode(meta);
+  }
+
+  /// Decodes the full event metadata payload with a JSON decoder.
+  T metaJson<T>({
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(meta);
+  }
+
+  /// Decodes the full event metadata payload with a version-aware JSON
+  /// decoder.
+  T metaVersionedJson<T>({
+    required int version,
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(meta);
+  }
+
+  /// Returns the decoded payload value for [key], or `null` when it is absent.
+  T? payloadValue<T>(String key, {PayloadCodec<T>? codec}) {
+    return payload.value(key, codec: codec);
+  }
+
+  /// Decodes the entire payload as a typed DTO with [codec].
+  T payloadAs<T>({required PayloadCodec<T> codec}) {
+    return codec.decode(payload);
+  }
+
+  /// Decodes the entire payload as a typed DTO with a JSON decoder.
+  T payloadJson<T>({
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(payload);
+  }
+
+  /// Decodes the entire payload as a typed DTO with a version-aware JSON
+  /// decoder.
+  T payloadVersionedJson<T>({
+    required int version,
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(payload);
+  }
+
+  /// Returns the decoded payload value for [key], or [fallback] when absent.
+  T payloadValueOr<T>(String key, T fallback, {PayloadCodec<T>? codec}) {
+    return payload.valueOr(key, fallback, codec: codec);
+  }
+
+  /// Returns the decoded payload value for [key], throwing when it is absent.
+  T requiredPayloadValue<T>(String key, {PayloadCodec<T>? codec}) {
+    return payload.requiredValue(key, codec: codec);
+  }
+
   @override
   String get eventName => name;
 
@@ -122,6 +203,106 @@ class QueueEventsProducer {
     );
     return envelope.id;
   }
+
+  /// Emits [eventName] using a DTO payload that exposes `toJson()`.
+  Future<String> emitJson<T>(
+    String queue,
+    String eventName,
+    T payloadJson, {
+    Map<String, Object?> headers = const {},
+    Map<String, Object?> meta = const {},
+    String? typeName,
+  }) {
+    return emit(
+      queue,
+      eventName,
+      payload: Map<String, Object?>.from(
+        PayloadCodec.encodeJsonMap(
+          payloadJson,
+          typeName: typeName ?? '$T',
+        ),
+      ),
+      headers: headers,
+      meta: meta,
+    );
+  }
+
+  /// Emits [eventName] using a typed value plus optional [codec].
+  ///
+  /// When [codec] is omitted, [value] must already be a string-keyed durable
+  /// map payload.
+  Future<String> emitValue<T>(
+    String queue,
+    String eventName,
+    T value, {
+    PayloadCodec<T>? codec,
+    Map<String, Object?> headers = const {},
+    Map<String, Object?> meta = const {},
+  }) {
+    return emit(
+      queue,
+      eventName,
+      payload: _encodeQueueEventValue(queue, eventName, value, codec: codec),
+      headers: headers,
+      meta: meta,
+    );
+  }
+
+  /// Emits [eventName] using a DTO payload and stores a schema [version]
+  /// beside the JSON payload.
+  Future<String> emitVersionedJson<T>(
+    String queue,
+    String eventName,
+    T payloadJson, {
+    required int version,
+    Map<String, Object?> headers = const {},
+    Map<String, Object?> meta = const {},
+    String? typeName,
+  }) {
+    return emit(
+      queue,
+      eventName,
+      payload: Map<String, Object?>.from(
+        PayloadCodec.encodeVersionedJsonMap(
+          payloadJson,
+          version: version,
+          typeName: typeName ?? '$T',
+        ),
+      ),
+      headers: headers,
+      meta: meta,
+    );
+  }
+}
+
+Map<String, Object?> _encodeQueueEventValue<T>(
+  String queue,
+  String eventName,
+  T value, {
+  PayloadCodec<T>? codec,
+}) {
+  final payload = codec == null ? value : codec.encode(value);
+  if (payload is Map<String, Object?>) {
+    return Map<String, Object?>.from(payload);
+  }
+  if (payload is Map) {
+    final normalized = <String, Object?>{};
+    for (final entry in payload.entries) {
+      final key = entry.key;
+      if (key is! String) {
+        throw StateError(
+          'Queue event payload for $queue/$eventName must use string keys, '
+          'got ${key.runtimeType}.',
+        );
+      }
+      normalized[key] = entry.value;
+    }
+    return normalized;
+  }
+  throw StateError(
+    'Queue event payload for $queue/$eventName must encode to '
+    'Map<String, Object?>, got ${payload.runtimeType}.',
+  );
 }
 
 /// Listens for queue-scoped custom events emitted by [QueueEventsProducer].
diff --git a/packages/stem/lib/src/core/stem.dart b/packages/stem/lib/src/core/stem.dart
index d363135a..91fcd010 100644
--- a/packages/stem/lib/src/core/stem.dart
+++ b/packages/stem/lib/src/core/stem.dart
@@ -62,6 +62,7 @@ import 'package:stem/src/core/clock.dart';
 import 'package:stem/src/core/contracts.dart';
 import 'package:stem/src/core/encoder_keys.dart';
 import 'package:stem/src/core/envelope.dart';
+import 'package:stem/src/core/payload_codec.dart';
 import 'package:stem/src/core/retry.dart';
 import 'package:stem/src/core/task_payload_encoder.dart';
 import 'package:stem/src/core/task_result.dart';
@@ -74,8 +75,27 @@ import 'package:stem/src/routing/routing_registry.dart';
 import 'package:stem/src/security/signing.dart';
 import 'package:stem/src/signals/emitter.dart';
 
+/// Shared typed task-dispatch surface used by producers, apps, and contexts.
+abstract interface class TaskResultCaller implements TaskEnqueuer {
+  /// Reads the latest task status by task id.
+  Future<TaskStatus?> getTaskStatus(String taskId);
+
+  /// Reads the latest group status by group id.
+  Future<GroupStatus?> getGroupStatus(String groupId);
+
+  /// Waits for a task result by task id.
+  Future<TaskResult<TResult>?> waitForTask<TResult extends Object?>(
+    String taskId, {
+    Duration? timeout,
+    TResult Function(Object? payload)? decode,
+    TResult Function(Map<String, Object?> payload)? decodeJson,
+    TResult Function(Map<String, Object?> payload, int version)?
+    decodeVersionedJson,
+  });
+}
+
 /// Facade used by producer applications to enqueue tasks.
-class Stem implements TaskEnqueuer {
+class Stem implements TaskResultCaller {
   /// Creates a Stem producer facade with the provided dependencies.
   Stem({
     required this.broker,
@@ -155,21 +175,42 @@ class Stem implements TaskEnqueuer {
     }
   }
 
-  /// Enqueue a typed task using a [TaskCall] wrapper produced by a
-  /// [TaskDefinition].
+  @override
+  Future<TaskStatus?> getTaskStatus(String taskId) async {
+    final resolved = backend;
+    if (resolved == null) return null;
+    return resolved.get(taskId);
+  }
+
+  @override
+  Future<GroupStatus?> getGroupStatus(String groupId) async {
+    final resolved = backend;
+    if (resolved == null) return null;
+    return resolved.getGroup(groupId);
+  }
+
+  /// Enqueue a typed task using an explicit [TaskCall] transport object,
+  /// typically produced by `TaskDefinition.buildCall(...)`.
   @override
   Future<String> enqueueCall(
     TaskCall call, {
     TaskEnqueueOptions? enqueueOptions,
   }) {
-    return enqueue(
-      call.name,
+    final definition = call.definition;
+    final resolvedOptions = call.resolveOptions();
+    final metadata = definition.metadata;
+    return _enqueueResolved(
+      name: call.name,
       args: call.encodeArgs(),
       headers: call.headers,
-      options: call.resolveOptions(),
+      options: resolvedOptions,
+      fallbackOptions: definition.defaultOptions,
       notBefore: call.notBefore,
       meta: call.meta,
       enqueueOptions: enqueueOptions ?? call.enqueueOptions,
+      metadata: metadata,
+      argsEncoder: _resolveArgsEncoderFromMetadata(metadata),
+      resultEncoder: _resolveResultEncoderFromMetadata(metadata),
     );
   }
 
@@ -184,22 +225,72 @@ class Stem implements TaskEnqueuer {
     Map<String, Object?> meta = const {},
     TaskEnqueueOptions? enqueueOptions,
   }) async {
+    final handler = registry.resolve(name);
+    if (handler == null) {
+      throw ArgumentError.value(name, 'name', 'Task is not registered');
+    }
+    return _enqueueResolved(
+      name: name,
+      args: args,
+      headers: headers,
+      options: options,
+      fallbackOptions: handler.options,
+      notBefore: notBefore,
+      meta: meta,
+      enqueueOptions: enqueueOptions,
+      metadata: handler.metadata,
+      argsEncoder: _resolveArgsEncoder(handler),
+      resultEncoder: _resolveResultEncoder(handler),
+    );
+  }
+
+  @override
+  Future<String> enqueueValue<T>(
+    String name,
+    T value, {
+    PayloadCodec<T>? codec,
+    Map<String, Object?> headers = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    Map<String, Object?> meta = const {},
+    TaskEnqueueOptions? enqueueOptions,
+  }) {
+    return enqueue(
+      name,
+      args: _encodeStemTaskValue(name, value, codec: codec),
+      headers: headers,
+      options: options,
+      notBefore: notBefore,
+      meta: meta,
+      enqueueOptions: enqueueOptions,
+    );
+  }
+
+  Future<String> _enqueueResolved({
+    required String name,
+    required Map<String, Object?> args,
+    required Map<String, Object?> headers,
+    required TaskOptions options,
+    required TaskOptions fallbackOptions,
+    required DateTime? notBefore,
+    required Map<String, Object?> meta,
+    required TaskEnqueueOptions? enqueueOptions,
+    required TaskMetadata metadata,
+    required TaskPayloadEncoder argsEncoder,
+    required TaskPayloadEncoder resultEncoder,
+  }) async {
+    final effectiveOptions = _resolveEffectiveTaskOptions(
+      options,
+      fallbackOptions,
+    );
     final tracer = StemTracer.instance;
-    final queueOverride = enqueueOptions?.queue ?? options.queue;
+    final queueOverride = enqueueOptions?.queue ?? effectiveOptions.queue;
     final decision = routing.resolve(
       RouteRequest(task: name, headers: headers, queue: queueOverride),
     );
     final targetName = decision.targetName;
-    final basePriority = enqueueOptions?.priority ?? options.priority;
+    final basePriority = enqueueOptions?.priority ?? effectiveOptions.priority;
     final resolvedPriority = decision.effectivePriority(basePriority);
-
-    final handler = registry.resolve(name);
-    if (handler == null) {
-      throw ArgumentError.value(name, 'name', 'Task is not registered');
-    }
-    final metadata = handler.metadata;
-    final argsEncoder = _resolveArgsEncoder(handler);
-    final resultEncoder = _resolveResultEncoder(handler);
     final scopeMeta = TaskEnqueueScope.currentMeta();
     final mergedMeta = scopeMeta == null
         ? meta
@@ -214,9 +305,9 @@
     if (!enrichedMeta.containsKey('stem.task')) {
       enrichedMeta['stem.task'] = name;
     }
-    if (options.retryPolicy != null &&
+    if (effectiveOptions.retryPolicy != null &&
         !enrichedMeta.containsKey('stem.retryPolicy')) {
-      enrichedMeta['stem.retryPolicy'] = options.retryPolicy!.toJson();
+      enrichedMeta['stem.retryPolicy'] = effectiveOptions.retryPolicy!.toJson();
     }
 
     final scheduledAt = _resolveNotBefore(
@@ -224,8 +315,7 @@
       enqueueOptions,
     );
     final maxRetries = _resolveMaxRetries(
-      options,
-      handler.options,
+      effectiveOptions,
       enqueueOptions,
     );
     final taskId = enqueueOptions?.taskId ?? generateEnvelopeId();
@@ -285,11 +375,11 @@
       notBefore: scheduledAt,
       priority: resolvedPriority,
       maxRetries: maxRetries,
-      visibilityTimeout: options.visibilityTimeout,
+      visibilityTimeout: effectiveOptions.visibilityTimeout,
       meta: encodedMeta,
     );
 
-    if (options.unique) {
+    if (effectiveOptions.unique) {
       final coordinator = uniqueTaskCoordinator;
       if (coordinator == null) {
         throw StateError(
@@ -299,7 +389,7 @@
       }
       final claim = await coordinator.acquire(
         envelope: envelope,
-        options: options,
+        options: effectiveOptions,
       );
       if (!claim.isAcquired) {
        final existingId = claim.existingTaskId;
@@ -397,14 +487,66 @@
     );
   }
 
+  TaskOptions _resolveEffectiveTaskOptions(
+    TaskOptions options,
+    TaskOptions fallbackOptions,
+  ) {
+    const defaults = TaskOptions();
+    return TaskOptions(
+      queue: options.queue != defaults.queue
+          ? options.queue
+          : fallbackOptions.queue,
+      maxRetries: options.maxRetries != defaults.maxRetries
+          ? options.maxRetries
+          : fallbackOptions.maxRetries,
+      softTimeLimit: options.softTimeLimit ?? fallbackOptions.softTimeLimit,
+      hardTimeLimit: options.hardTimeLimit ?? fallbackOptions.hardTimeLimit,
+      rateLimit: options.rateLimit ?? fallbackOptions.rateLimit,
+      groupRateLimit: options.groupRateLimit ?? fallbackOptions.groupRateLimit,
+      groupRateKey: options.groupRateKey ?? fallbackOptions.groupRateKey,
+      groupRateKeyHeader:
+          options.groupRateKeyHeader != defaults.groupRateKeyHeader
+              ? options.groupRateKeyHeader
+              : fallbackOptions.groupRateKeyHeader,
+      groupRateLimiterFailureMode:
+          options.groupRateLimiterFailureMode !=
+                  defaults.groupRateLimiterFailureMode
+              ? options.groupRateLimiterFailureMode
+              : fallbackOptions.groupRateLimiterFailureMode,
+      unique: options.unique != defaults.unique
+          ? options.unique
+          : fallbackOptions.unique,
+      uniqueFor: options.uniqueFor ?? fallbackOptions.uniqueFor,
+      priority: options.priority != defaults.priority
+          ? options.priority
+          : fallbackOptions.priority,
+      acksLate: options.acksLate != defaults.acksLate
+          ? options.acksLate
+          : fallbackOptions.acksLate,
+      visibilityTimeout:
+          options.visibilityTimeout ?? fallbackOptions.visibilityTimeout,
+      retryPolicy: options.retryPolicy ?? fallbackOptions.retryPolicy,
+    );
+  }
+
   /// Waits for [taskId] to reach a terminal state and returns a typed view of
   /// the final [TaskStatus]. Requires [backend] to be configured; otherwise a
   /// [StateError] is thrown.
+  @override
   Future<TaskResult<T>?> waitForTask<T extends Object?>(
     String taskId, {
     Duration? timeout,
     T Function(Object? payload)? decode,
+    T Function(Map<String, Object?> payload)? decodeJson,
+    T Function(Map<String, Object?> payload, int version)? decodeVersionedJson,
   }) async {
+    assert(
+      [decode, decodeJson, decodeVersionedJson]
+              .whereType<Object>()
+              .length <=
+          1,
+      'Specify at most one of decode, decodeJson, or decodeVersionedJson.',
+    );
     final resultBackend = backend;
     if (resultBackend == null) {
       throw StateError(
@@ -417,7 +559,12 @@
         taskId: taskId,
         status: lastStatus,
         value: lastStatus.state == TaskState.succeeded
-            ? _decodeTaskPayload(lastStatus.payload, decode)
+            ? _decodeTaskPayload(
+                lastStatus.payload,
+                decode,
+                decodeJson,
+                decodeVersionedJson,
+              )
             : null,
         rawPayload: lastStatus.payload,
       );
@@ -440,7 +587,12 @@
         taskId: taskId,
         status: status,
         value: status.state == TaskState.succeeded
-            ? _decodeTaskPayload(status.payload, decode)
+            ? _decodeTaskPayload(
+                status.payload,
+                decode,
+                decodeJson,
+                decodeVersionedJson,
+              )
            : null,
         rawPayload: status.payload,
         timedOut: timedOut && !status.state.isTerminal,
@@ -471,40 +623,6 @@
     return completer.future;
   }
 
-  /// Waits for [taskId] using the decoding rules from a [TaskDefinition].
-  Future<TaskResult<TResult>?> waitForTaskDefinition<
-    TArgs,
-    TResult extends Object?
-  >(
-    String taskId,
-    TaskDefinition<TArgs, TResult> definition, {
-    Duration? timeout,
-  }) {
-    return waitForTask(
-      taskId,
-      timeout: timeout,
-      decode: (payload) {
-        TResult? value;
-        try {
-          value = definition.decode(payload);
-        } on Object {
-          if (payload is TResult) {
-            value = payload;
-          } else {
-            rethrow;
-          }
-        }
-        if (value == null && null is! TResult) {
-          throw StateError(
-            'Task definition "${definition.name}" decoded a null result '
-            'for non-nullable type $TResult.',
-          );
-        }
-        return value as TResult;
-      },
-    );
-  }
-
   /// Executes the enqueue middleware chain in order.
   Future _runEnqueueMiddleware(
     Envelope envelope,
@@ -540,7 +658,6 @@
   /// handler defaults.
   int _resolveMaxRetries(
     TaskOptions options,
-    TaskOptions handlerOptions,
    TaskEnqueueOptions? enqueueOptions,
   ) {
     final policyMax = enqueueOptions?.retryPolicy?.maxRetries;
@@ -554,7 +671,7 @@
     if (options.maxRetries != 0) {
       return options.maxRetries;
     }
-    return handlerOptions.maxRetries;
+    return 0;
   }
 
   /// Maps enqueue-only settings into envelope metadata.
@@ -880,14 +997,24 @@
   /// Resolves the args encoder for a handler and registers it if needed.
   TaskPayloadEncoder _resolveArgsEncoder(TaskHandler handler) {
-    final encoder = handler.metadata.argsEncoder;
-    payloadEncoders.register(encoder);
-    return encoder ?? payloadEncoders.defaultArgsEncoder;
+    return _resolveArgsEncoderFromMetadata(handler.metadata);
   }
 
   /// Resolves the result encoder for a handler and registers it if needed.
   TaskPayloadEncoder _resolveResultEncoder(TaskHandler handler) {
-    final encoder = handler.metadata.resultEncoder;
+    return _resolveResultEncoderFromMetadata(handler.metadata);
+  }
+
+  /// Resolves the args encoder for producer-side task metadata.
+  TaskPayloadEncoder _resolveArgsEncoderFromMetadata(TaskMetadata metadata) {
+    final encoder = metadata.argsEncoder;
+    payloadEncoders.register(encoder);
+    return encoder ?? payloadEncoders.defaultArgsEncoder;
+  }
+
+  /// Resolves the result encoder for producer-side task metadata.
+  TaskPayloadEncoder _resolveResultEncoderFromMetadata(TaskMetadata metadata) {
+    final encoder = metadata.resultEncoder;
     payloadEncoders.register(encoder);
     return encoder ?? payloadEncoders.defaultResultEncoder;
   }
@@ -956,28 +1083,234 @@ class Stem implements TaskEnqueuer {
   T? _decodeTaskPayload<T>(
     Object? payload,
     T Function(Object? payload)? decode,
+    T Function(Map<String, Object?> payload)? decodeJson,
+    T Function(Map<String, Object?> payload, int version)? decodeVersionedJson,
   ) {
     if (payload == null) return null;
     if (decode != null) {
       return decode(payload);
     }
+    if (decodeVersionedJson != null) {
+      return decodeVersionedJson(
+        PayloadCodec.decodeJsonMap(payload, typeName: 'task result'),
+        PayloadCodec.readPayloadVersion(payload),
+      );
+    }
+    if (decodeJson != null) {
+      return decodeJson(
+        PayloadCodec.decodeJsonMap(payload, typeName: 'task result'),
+      );
+    }
     return payload as T?;
   }
 }
 
-/// Convenience helpers for enqueuing [TaskEnqueueBuilder] instances.
-extension TaskEnqueueBuilderExtension<TArgs, TResult extends Object?>
-    on TaskEnqueueBuilder<TArgs, TResult> {
-  /// Builds the call and enqueues it with the provided [enqueuer] instance.
-  Future<String> enqueueWith(TaskEnqueuer enqueuer) {
-    final call = build();
-    final scopeMeta = TaskEnqueueScope.currentMeta();
-    if (scopeMeta == null || scopeMeta.isEmpty) {
-      return enqueuer.enqueueCall(call);
+Map<String, Object?> _encodeStemTaskValue<T>(
+  String name,
+  T value, {
+  PayloadCodec<T>? codec,
+}) {
+  final payload = codec == null ? value : codec.encode(value);
+  if (payload is Map<String, Object?>) {
+    return Map<String, Object?>.from(payload);
+  }
+  if (payload is Map) {
+    final normalized = <String, Object?>{};
+    for (final entry in payload.entries) {
+      final key = entry.key;
+      if (key is! String) {
+        throw StateError(
+          'Task payload for $name must use string keys, got '
+          '${key.runtimeType}.',
+        );
+      }
+      normalized[key] = entry.value;
     }
-    final mergedMeta = Map<String, Object?>.from(scopeMeta)..addAll(call.meta);
+    return normalized;
+  }
+  throw StateError(
+    'Task payload for $name must encode to Map<String, Object?>, got '
+    '${payload.runtimeType}.',
+  );
+}
+
+Future<String> _enqueueBuiltTaskCall(
+  TaskEnqueuer enqueuer,
+  TaskCall call, {
+  TaskEnqueueOptions? enqueueOptions,
+}) {
+  final resolvedEnqueueOptions = enqueueOptions ?? call.enqueueOptions;
+  final scopeMeta = TaskEnqueueScope.currentMeta();
+  if (scopeMeta == null || scopeMeta.isEmpty) {
     return enqueuer.enqueueCall(
-      call.copyWith(meta: Map.unmodifiable(mergedMeta)),
+      call,
+      enqueueOptions: resolvedEnqueueOptions,
+    );
+  }
+  final mergedMeta = Map<String, Object?>.from(scopeMeta)..addAll(call.meta);
+  return enqueuer.enqueueCall(
+    call.definition.buildCall(
+      call.args,
+      headers: call.headers,
+      options: call.options,
+      notBefore: call.notBefore,
+      meta: Map.unmodifiable(mergedMeta),
+      enqueueOptions: call.enqueueOptions,
+    ),
+    enqueueOptions: resolvedEnqueueOptions,
+  );
+}
+
+TResult _decodeTaskDefinitionResult<TArgs, TResult extends Object?>(
+  TaskDefinition<TArgs, TResult> definition,
+  Object? payload,
+) {
+  TResult? value;
+  try {
+    value = definition.decode(payload);
+  } on Object {
+    if (payload is TResult) {
+      value = payload;
+    } else {
+      rethrow;
+    }
+  }
+  if (value == null && null is! TResult) {
+    throw StateError(
+      'Task definition "${definition.name}" decoded a null result '
+      'for non-nullable type $TResult.',
+    );
+  }
+  return value as TResult;
+}
+
+/// Convenience helpers for waiting on typed task definitions.
+extension TaskDefinitionExtension<TArgs, TResult extends Object?>
+    on TaskDefinition<TArgs, TResult> {
+  /// Enqueues this typed task definition directly with [enqueuer].
+  Future<String> enqueue(
+    TaskEnqueuer enqueuer,
+    TArgs args, {
+    Map<String, Object?> headers = const {},
+    TaskOptions? options,
+    DateTime? notBefore,
+    Map<String, Object?>? meta,
+    TaskEnqueueOptions? enqueueOptions,
+  }) {
+    return _enqueueBuiltTaskCall(
+      enqueuer,
+      buildCall(
+        args,
+        headers: headers,
+        options: options,
+        notBefore: notBefore,
+        meta: meta,
+        enqueueOptions: enqueueOptions,
+      ),
+      enqueueOptions: enqueueOptions,
+    );
+  }
+
+  /// Enqueues this typed task definition and waits for its typed result.
+  Future<TaskResult<TResult>?> enqueueAndWait(
+    TaskResultCaller caller,
+    TArgs args, {
+    Map<String, Object?> headers = const {},
+    TaskOptions? options,
+    DateTime? notBefore,
+    Map<String, Object?>? meta,
+    TaskEnqueueOptions? enqueueOptions,
+    Duration? timeout,
+  }) {
+    final call = buildCall(
+      args,
+      headers: headers,
+      options: options,
+      notBefore: notBefore,
+      meta: meta,
+      enqueueOptions: enqueueOptions,
+    );
+    return _enqueueBuiltTaskCall(
+      caller,
+      call,
+      enqueueOptions: enqueueOptions,
+    ).then(
+      (taskId) => call.definition.waitFor(
+        caller,
+        taskId,
+        timeout: timeout,
+      ),
+    );
+  }
+
+  /// Waits for [taskId] using this definition's decoding rules.
+  Future<TaskResult<TResult>?> waitFor(
+    TaskResultCaller caller,
+    String taskId, {
+    Duration? timeout,
+  }) {
+    return caller.waitForTask(
+      taskId,
+      timeout: timeout,
+      decode: (payload) => _decodeTaskDefinitionResult(this, payload),
+    );
+  }
+}
+
+/// Convenience helpers for waiting on typed no-arg task definitions.
+extension NoArgsTaskDefinitionExtension<TResult extends Object?>
+    on NoArgsTaskDefinition<TResult> {
+  /// Enqueues this no-arg task definition with [enqueuer].
+  Future<String> enqueue(
+    TaskEnqueuer enqueuer, {
+    Map<String, Object?> headers = const {},
+    TaskOptions? options,
+    DateTime? notBefore,
+    Map<String, Object?>? meta,
+    TaskEnqueueOptions? enqueueOptions,
+  }) {
+    return _enqueueBuiltTaskCall(
+      enqueuer,
+      asDefinition.buildCall(
+        (),
+        headers: headers,
+        options: options,
+        notBefore: notBefore,
+        meta: meta,
+        enqueueOptions: enqueueOptions,
+      ),
+      enqueueOptions: enqueueOptions,
+    );
+  }
+
+  /// Waits for [taskId] using this definition's decoding rules.
+  Future<TaskResult<TResult>?> waitFor(
+    TaskResultCaller caller,
+    String taskId, {
+    Duration?
timeout, + }) { + return asDefinition.waitFor(caller, taskId, timeout: timeout); + } + + /// Enqueues this no-arg task definition and waits for the typed result. + Future?> enqueueAndWait( + TaskResultCaller caller, { + Map headers = const {}, + TaskOptions? options, + DateTime? notBefore, + Map? meta, + TaskEnqueueOptions? enqueueOptions, + Duration? timeout, + }) async { + final taskId = await enqueue( + caller, + headers: headers, + options: options, + notBefore: notBefore, + meta: meta, + enqueueOptions: enqueueOptions, ); + return waitFor(caller, taskId, timeout: timeout); } } diff --git a/packages/stem/lib/src/core/task_invocation.dart b/packages/stem/lib/src/core/task_invocation.dart index e3f09695..834eb4e0 100644 --- a/packages/stem/lib/src/core/task_invocation.dart +++ b/packages/stem/lib/src/core/task_invocation.dart @@ -37,6 +37,12 @@ import 'dart:async'; import 'dart:isolate'; import 'package:stem/src/core/contracts.dart'; +import 'package:stem/src/core/payload_codec.dart'; +import 'package:stem/src/core/payload_map.dart'; +import 'package:stem/src/workflow/core/workflow_cancellation_policy.dart'; +import 'package:stem/src/workflow/core/workflow_event_ref.dart'; +import 'package:stem/src/workflow/core/workflow_ref.dart'; +import 'package:stem/src/workflow/core/workflow_result.dart'; /// Signature for task entrypoints that can run inside isolate executors. typedef TaskEntrypoint = @@ -75,6 +81,111 @@ class ProgressSignal extends TaskInvocationSignal { /// Optional progress metadata. final Map? data; + + /// Returns the decoded progress metadata value for [key], or `null`. + T? dataValue(String key, {PayloadCodec? codec}) { + final payload = data; + if (payload == null) return null; + return payload.value(key, codec: codec); + } + + /// Returns the decoded progress metadata value for [key], or [fallback]. + T dataValueOr(String key, T fallback, {PayloadCodec? 
codec}) { + final payload = data; + if (payload == null) return fallback; + return payload.valueOr(key, fallback, codec: codec); + } + + /// Returns the decoded progress metadata value for [key], throwing if absent. + T requiredDataValue(String key, {PayloadCodec? codec}) { + final payload = data; + if (payload == null) { + throw StateError('Progress signal does not include metadata.'); + } + return payload.requiredValue(key, codec: codec); + } + + /// Decodes the progress metadata value for [key] as a typed DTO with [codec]. + T? dataAs(String key, {required PayloadCodec codec}) { + final payload = data; + if (payload == null) return null; + return payload.value(key, codec: codec); + } + + /// Decodes the full progress payload as a typed DTO with [codec]. + T? payloadAs({required PayloadCodec codec}) { + final payload = data; + if (payload == null) return null; + return codec.decode(payload); + } + + /// Decodes the progress metadata value for [key] as a typed DTO from JSON. + T? dataJson( + String key, { + required T Function(Map payload) decode, + String? typeName, + }) { + final payload = data; + if (payload == null) return null; + return payload.valueJson( + key, + decode: decode, + typeName: typeName, + ); + } + + /// Decodes the full progress payload as a typed DTO from JSON. + T? payloadJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + final payload = data; + if (payload == null) return null; + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(payload); + } + + /// Decodes the progress metadata value for [key] as a typed DTO from + /// version-aware JSON. + T? dataVersionedJson( + String key, { + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? 
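+ // A hedged usage sketch for the ProgressSignal helpers above; the 'stage'
+ // key and the Checkpoint DTO with its fromJson factory are illustrative
+ // assumptions, not part of this API:
+ //
+ //   final stage = signal.dataValueOr('stage', 'unknown');
+ //   final checkpoint = signal.dataJson(
+ //     'checkpoint',
+ //     decode: Checkpoint.fromJson,
+ //   );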
typeName, + }) { + final payload = data; + if (payload == null) return null; + return payload.valueJson( + key, + decode: (json) => PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(json), + typeName: typeName, + ); + } + + /// Decodes the full progress payload as a typed DTO from version-aware JSON. + T? payloadVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + final payload = data; + if (payload == null) return null; + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(payload); + } } /// Request to enqueue a task from an isolate. @@ -89,6 +200,42 @@ class EnqueueTaskSignal extends TaskInvocationSignal { final SendPort replyPort; } +/// Request to start a workflow from an isolate. +class StartWorkflowSignal extends TaskInvocationSignal { + /// Creates a workflow start request signal. + const StartWorkflowSignal(this.request, this.replyPort); + + /// Workflow start request payload. + final StartWorkflowRequest request; + + /// Port to deliver the response. + final SendPort replyPort; +} + +/// Request to wait for a workflow from an isolate. +class WaitForWorkflowSignal extends TaskInvocationSignal { + /// Creates a workflow wait request signal. + const WaitForWorkflowSignal(this.request, this.replyPort); + + /// Workflow wait request payload. + final WaitForWorkflowRequest request; + + /// Port to deliver the response. + final SendPort replyPort; +} + +/// Request to emit a workflow event from an isolate. +class EmitWorkflowEventSignal extends TaskInvocationSignal { + /// Creates a workflow event emit request signal. + const EmitWorkflowEventSignal(this.request, this.replyPort); + + /// Workflow event emit request payload. 
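+ // The signal classes above all share one request/reply shape: the isolate
+ // sends a signal carrying a reply SendPort and awaits a single response on
+ // it. A hedged sketch of the isolate side, mirroring the remote delegates
+ // later in this file (the local names are illustrative):
+ //
+ //   final responsePort = ReceivePort();
+ //   controlPort.send(
+ //     EmitWorkflowEventSignal(request, responsePort.sendPort),
+ //   );
+ //   final response = await responsePort.first;
+ //   responsePort.close();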
+ final EmitWorkflowEventRequest request; + + /// Port to deliver the response. + final SendPort replyPort; +} + /// Enqueue request payload for isolate communication. class TaskEnqueueRequest { /// Creates an enqueue request payload. @@ -98,6 +245,7 @@ class TaskEnqueueRequest { required this.headers, required this.options, required this.meta, + this.notBefore, this.enqueueOptions, }); @@ -107,6 +255,38 @@ class TaskEnqueueRequest { /// Task arguments. final Map args; + /// Decodes the full task args payload as a typed DTO with [codec]. + T argsAs({required PayloadCodec codec}) { + return codec.decode(args); + } + + /// Decodes the full task args payload as a typed DTO from JSON. + T argsJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(args); + } + + /// Decodes the full task args payload as a typed DTO from version-aware + /// JSON. + T argsVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(args); + } + /// Task headers. final Map headers; @@ -116,6 +296,41 @@ class TaskEnqueueRequest { /// Task metadata. final Map meta; + /// Decodes the full task metadata payload as a typed DTO with [codec]. + T metaAs({required PayloadCodec codec}) { + return codec.decode(meta); + } + + /// Decodes the full task metadata payload as a typed DTO from JSON. + T metaJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(meta); + } + + /// Decodes the full task metadata payload as a typed DTO from version-aware + /// JSON. 
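+ // A hedged decoding sketch for the typed args helpers above; OrderArgs and
+ // its fromJson factory are illustrative assumptions, not part of this API:
+ //
+ //   final order = request.argsJson(
+ //     decode: OrderArgs.fromJson,
+ //     typeName: 'OrderArgs',
+ //   );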
+ T metaVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(meta); + } + + /// Optional delay before execution. + final DateTime? notBefore; + /// Enqueue options. final Map? enqueueOptions; } @@ -132,8 +347,208 @@ class TaskEnqueueResponse { final String? error; } +/// Workflow start request payload for isolate communication. +class StartWorkflowRequest { + /// Creates a workflow start request payload. + const StartWorkflowRequest({ + required this.workflowName, + required this.params, + this.parentRunId, + this.ttlMs, + this.cancellationPolicy, + }); + + /// Workflow name to start. + final String workflowName; + + /// Encoded workflow params. + final Map params; + + /// Decodes the full workflow params payload as a typed DTO with [codec]. + T paramsAs({required PayloadCodec codec}) { + return codec.decode(params); + } + + /// Decodes the full workflow params payload as a typed DTO from JSON. + T paramsJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(params); + } + + /// Decodes the full workflow params payload as a typed DTO from version-aware + /// JSON. + T paramsVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(params); + } + + /// Optional parent workflow run id. + final String? parentRunId; + + /// Optional run TTL in milliseconds. + final int? ttlMs; + + /// Optional serialized cancellation policy. + final Map? 
cancellationPolicy; +} + +/// Response payload for isolate workflow start requests. +class StartWorkflowResponse { + /// Creates a workflow start response payload. + const StartWorkflowResponse({this.runId, this.error}); + + /// Started workflow run id on success. + final String? runId; + + /// Error message when workflow start fails. + final String? error; +} + +/// Workflow wait request payload for isolate communication. +class WaitForWorkflowRequest { + /// Creates a workflow wait request payload. + const WaitForWorkflowRequest({ + required this.runId, + required this.workflowName, + this.pollIntervalMs, + this.timeoutMs, + }); + + /// Workflow run id to wait on. + final String runId; + + /// Workflow name used for result decoding. + final String workflowName; + + /// Poll interval in milliseconds. + final int? pollIntervalMs; + + /// Timeout in milliseconds. + final int? timeoutMs; +} + +/// Response payload for isolate workflow wait requests. +class WaitForWorkflowResponse { + /// Creates a workflow wait response payload. + const WaitForWorkflowResponse({this.result, this.error}); + + /// Serialized workflow result payload. + final Map? result; + + /// Decodes the workflow result payload as a typed DTO with [codec]. + T? resultAs({required PayloadCodec codec}) { + final payload = result; + if (payload == null) return null; + return codec.decode(payload); + } + + /// Decodes the workflow result payload as a typed DTO from JSON. + T? resultJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + final payload = result; + if (payload == null) return null; + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(payload); + } + + /// Decodes the workflow result payload as a typed DTO from version-aware + /// JSON. + T? resultVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? 
typeName, + }) { + final payload = result; + if (payload == null) return null; + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(payload); + } + + /// Error message when workflow wait fails. + final String? error; +} + +/// Workflow event emit request payload for isolate communication. +class EmitWorkflowEventRequest { + /// Creates a workflow event emit request payload. + const EmitWorkflowEventRequest({ + required this.topic, + required this.payload, + }); + + /// Workflow event topic to emit. + final String topic; + + /// Encoded workflow event payload. + final Map payload; + + /// Decodes the full workflow event payload as a typed DTO with [codec]. + T payloadAs({required PayloadCodec codec}) { + return codec.decode(payload); + } + + /// Decodes the full workflow event payload as a typed DTO from JSON. + T payloadJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(payload); + } + + /// Decodes the full workflow event payload as a typed DTO from version-aware + /// JSON. + T payloadVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(payload); + } +} + +/// Response payload for isolate workflow event emit requests. +class EmitWorkflowEventResponse { + /// Creates a workflow event emit response payload. + const EmitWorkflowEventResponse({this.error}); + + /// Error message when workflow event emission fails. + final String? error; +} + /// Context exposed to task entrypoints regardless of execution environment. 
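+ // A hedged sketch for the workflow event payload decoders above;
+ // PaymentEvent and its fromJson factory are illustrative assumptions:
+ //
+ //   final event = request.payloadJson(
+ //     decode: PaymentEvent.fromJson,
+ //     typeName: 'PaymentEvent',
+ //   );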
-class TaskInvocationContext implements TaskEnqueuer { +class TaskInvocationContext implements TaskExecutionContext { /// Context implementation used when executing locally in the same isolate. factory TaskInvocationContext.local({ required String id, @@ -147,9 +562,13 @@ class TaskInvocationContext implements TaskEnqueuer { Map? data, }) progress, + Map args = const {}, TaskEnqueuer? enqueuer, + WorkflowCaller? workflows, + WorkflowEventEmitter? workflowEvents, }) => TaskInvocationContext._( id: id, + args: args, headers: headers, meta: meta, attempt: attempt, @@ -157,6 +576,8 @@ class TaskInvocationContext implements TaskEnqueuer { extendLease: extendLease, progress: progress, enqueuer: enqueuer, + workflows: workflows, + workflowEvents: workflowEvents, ); /// Context implementation used when executing inside a worker isolate. @@ -166,8 +587,10 @@ class TaskInvocationContext implements TaskEnqueuer { required Map headers, required Map meta, required int attempt, + Map args = const {}, }) => TaskInvocationContext._( id: id, + args: args, headers: headers, meta: meta, attempt: attempt, @@ -176,11 +599,14 @@ class TaskInvocationContext implements TaskEnqueuer { progress: (percent, {data}) async => controlPort.send(ProgressSignal(percent, data: data)), enqueuer: _RemoteTaskEnqueuer(controlPort), + workflows: _RemoteWorkflowCaller(controlPort), + workflowEvents: _RemoteWorkflowEventEmitter(controlPort), ); /// Internal constructor shared by local and isolate contexts. TaskInvocationContext._({ required this.id, + required this.args, required this.headers, required this.meta, required this.attempt, @@ -192,21 +618,32 @@ class TaskInvocationContext implements TaskEnqueuer { }) progress, TaskEnqueuer? enqueuer, + WorkflowCaller? workflows, + WorkflowEventEmitter? 
workflowEvents, }) : _heartbeat = heartbeat, _extendLease = extendLease, _progress = progress, - _enqueuer = enqueuer; + _enqueuer = enqueuer, + _workflows = workflows, + _workflowEvents = workflowEvents; /// The unique identifier of the task. + @override final String id; + @override + final Map args; + /// Headers passed to the task invocation. + @override final Map headers; /// Invocation metadata (e.g. trace, tenant). + @override final Map meta; /// Current attempt count. + @override final int attempt; final void Function() _heartbeat; @@ -220,13 +657,22 @@ class TaskInvocationContext implements TaskEnqueuer { /// Optional delegate used to enqueue tasks from within the invocation. final TaskEnqueuer? _enqueuer; + /// Optional delegate used to start child workflows from the invocation. + final WorkflowCaller? _workflows; + + /// Optional delegate used to emit workflow events from the invocation. + final WorkflowEventEmitter? _workflowEvents; + /// Notify the worker that the task is still running. + @override void heartbeat() => _heartbeat(); /// Request an extension of the underlying broker lease/visibility timeout. + @override Future extendLease(Duration by) => _extendLease(by); /// Report progress back to the worker. + @override Future progress(double percentComplete, {Map? data}) => _progress(percentComplete, data: data); @@ -241,6 +687,7 @@ class TaskInvocationContext implements TaskEnqueuer { Map args = const {}, Map headers = const {}, TaskOptions options = const TaskOptions(), + DateTime? notBefore, Map meta = const {}, TaskEnqueueOptions? enqueueOptions, }) async { @@ -269,11 +716,34 @@ class TaskInvocationContext implements TaskEnqueuer { args: args, headers: mergedHeaders, options: options, + notBefore: notBefore, meta: mergedMeta, enqueueOptions: enqueueOptions, ); } + @override + Future enqueueValue( + String name, + T value, { + PayloadCodec? codec, + Map headers = const {}, + TaskOptions options = const TaskOptions(), + DateTime? 
notBefore, + Map meta = const {}, + TaskEnqueueOptions? enqueueOptions, + }) { + return enqueue( + name, + args: _encodeInvocationEnqueuedValue(name, value, codec: codec), + headers: headers, + options: options, + notBefore: notBefore, + meta: meta, + enqueueOptions: enqueueOptions, + ); + } + /// Enqueue a typed task call from within a task invocation. /// /// This merges headers/meta from the task call and applies lineage metadata @@ -303,9 +773,13 @@ class TaskInvocationContext implements TaskEnqueuer { mergedMeta.putIfAbsent('stem.rootTaskId', () => id); } - final mergedCall = call.copyWith( + final mergedCall = call.definition.buildCall( + call.args, headers: Map.unmodifiable(mergedHeaders), + options: call.options, + notBefore: call.notBefore, meta: Map.unmodifiable(mergedMeta), + enqueueOptions: call.enqueueOptions, ); return delegate.enqueueCall( mergedCall, @@ -313,22 +787,98 @@ class TaskInvocationContext implements TaskEnqueuer { ); } - /// Build a fluent enqueue request for this invocation. - /// - /// Use [TaskEnqueueBuilder.build] + [enqueueCall] to dispatch. - TaskEnqueueBuilder enqueueBuilder({ - required TaskDefinition definition, - required TArgs args, + @override + Future startWorkflowRef( + WorkflowRef definition, + TParams params, { + String? parentRunId, + Duration? ttl, + WorkflowCancellationPolicy? 
cancellationPolicy, + }) { + final delegate = _workflows; + if (delegate == null) { + throw StateError( + 'TaskInvocationContext has no workflow caller configured', + ); + } + return delegate.startWorkflowRef( + definition, + params, + parentRunId: parentRunId, + ttl: ttl, + cancellationPolicy: cancellationPolicy, + ); + } + + @override + Future startWorkflowCall( + WorkflowStartCall call, + ) { + final delegate = _workflows; + if (delegate == null) { + throw StateError( + 'TaskInvocationContext has no workflow caller configured', + ); + } + return delegate.startWorkflowCall(call); + } + + @override + Future?> + waitForWorkflowRef( + String runId, + WorkflowRef definition, { + Duration pollInterval = const Duration(milliseconds: 100), + Duration? timeout, }) { - return TaskEnqueueBuilder(definition: definition, args: args); + final delegate = _workflows; + if (delegate == null) { + throw StateError( + 'TaskInvocationContext has no workflow caller configured', + ); + } + return delegate.waitForWorkflowRef( + runId, + definition, + pollInterval: pollInterval, + timeout: timeout, + ); + } + + @override + Future emitValue( + String topic, + T value, { + PayloadCodec? codec, + }) { + final delegate = _workflowEvents; + if (delegate == null) { + throw StateError( + 'TaskInvocationContext has no workflow event emitter configured', + ); + } + return delegate.emitValue(topic, value, codec: codec); + } + + @override + Future emitEvent(WorkflowEventRef event, T value) { + final delegate = _workflowEvents; + if (delegate == null) { + throw StateError( + 'TaskInvocationContext has no workflow event emitter configured', + ); + } + return delegate.emitEvent(event, value); } /// Alias for enqueue. + @override Future spawn( String name, { Map args = const {}, Map headers = const {}, TaskOptions options = const TaskOptions(), + DateTime? notBefore, Map meta = const {}, TaskEnqueueOptions? 
enqueueOptions, }) { @@ -337,6 +887,7 @@ class TaskInvocationContext implements TaskEnqueuer { args: args, headers: headers, options: options, + notBefore: notBefore, meta: meta, enqueueOptions: enqueueOptions, ); @@ -347,6 +898,7 @@ class TaskInvocationContext implements TaskEnqueuer { /// Throws a [TaskRetryRequest] which is intercepted by the worker to /// schedule the retry. Override retry policies/time limits per invocation /// by passing the optional parameters. + @override Future retry({ Duration? countdown, DateTime? eta, @@ -379,6 +931,7 @@ class _RemoteTaskEnqueuer implements TaskEnqueuer { Map args = const {}, Map headers = const {}, TaskOptions options = const TaskOptions(), + DateTime? notBefore, Map meta = const {}, TaskEnqueueOptions? enqueueOptions, }) async { @@ -391,6 +944,7 @@ class _RemoteTaskEnqueuer implements TaskEnqueuer { args: args, headers: headers, options: options.toJson(), + notBefore: notBefore, meta: meta, enqueueOptions: enqueueOptions?.toJson(), ), @@ -423,4 +977,203 @@ class _RemoteTaskEnqueuer implements TaskEnqueuer { enqueueOptions: enqueueOptions ?? call.enqueueOptions, ); } + + @override + Future enqueueValue( + String name, + T value, { + PayloadCodec? codec, + Map headers = const {}, + TaskOptions options = const TaskOptions(), + DateTime? notBefore, + Map meta = const {}, + TaskEnqueueOptions? enqueueOptions, + }) { + return enqueue( + name, + args: _encodeInvocationEnqueuedValue(name, value, codec: codec), + headers: headers, + options: options, + notBefore: notBefore, + meta: meta, + enqueueOptions: enqueueOptions, + ); + } +} + +Map _encodeInvocationEnqueuedValue( + String name, + T value, { + PayloadCodec? codec, +}) { + final payload = codec == null ? value : codec.encode(value); + if (payload is Map) { + return Map.from(payload); + } + if (payload is Map) { + final normalized = {}; + for (final entry in payload.entries) { + final key = entry.key; + if (key is! 
String) { + throw StateError( + 'Task payload for $name must use string keys, got ' + '${key.runtimeType}.', + ); + } + normalized[key] = entry.value; + } + return normalized; + } + throw StateError( + 'Task payload for $name must encode to Map, got ' + '${payload.runtimeType}.', + ); +} + +class _RemoteWorkflowCaller implements WorkflowCaller { + _RemoteWorkflowCaller(this._controlPort); + + final SendPort _controlPort; + + @override + Future startWorkflowRef( + WorkflowRef definition, + TParams params, { + String? parentRunId, + Duration? ttl, + WorkflowCancellationPolicy? cancellationPolicy, + }) async { + final responsePort = ReceivePort(); + _controlPort.send( + StartWorkflowSignal( + StartWorkflowRequest( + workflowName: definition.name, + params: definition.encodeParams(params), + parentRunId: parentRunId, + ttlMs: ttl?.inMilliseconds, + cancellationPolicy: cancellationPolicy?.toJson(), + ), + responsePort.sendPort, + ), + ); + final response = await responsePort.first; + responsePort.close(); + if (response is StartWorkflowResponse) { + if (response.error != null) { + throw StateError(response.error!); + } + return response.runId ?? ''; + } + throw StateError('Unexpected workflow start response: $response'); + } + + @override + Future startWorkflowCall( + WorkflowStartCall call, + ) { + return startWorkflowRef( + call.definition, + call.params, + parentRunId: call.parentRunId, + ttl: call.ttl, + cancellationPolicy: call.cancellationPolicy, + ); + } + + @override + Future?> + waitForWorkflowRef( + String runId, + WorkflowRef definition, { + Duration pollInterval = const Duration(milliseconds: 100), + Duration? 
timeout, + }) async { + final responsePort = ReceivePort(); + _controlPort.send( + WaitForWorkflowSignal( + WaitForWorkflowRequest( + runId: runId, + workflowName: definition.name, + pollIntervalMs: pollInterval.inMilliseconds, + timeoutMs: timeout?.inMilliseconds, + ), + responsePort.sendPort, + ), + ); + final response = await responsePort.first; + responsePort.close(); + if (response is WaitForWorkflowResponse) { + if (response.error != null) { + throw StateError(response.error!); + } + final resultJson = response.result; + if (resultJson == null) { + return null; + } + final raw = WorkflowResult.fromJson(resultJson); + return WorkflowResult( + runId: raw.runId, + status: raw.status, + state: raw.state, + value: raw.rawResult == null ? null : definition.decode(raw.rawResult), + rawResult: raw.rawResult, + timedOut: raw.timedOut, + ); + } + throw StateError('Unexpected workflow wait response: $response'); + } +} + +class _RemoteWorkflowEventEmitter implements WorkflowEventEmitter { + _RemoteWorkflowEventEmitter(this._controlPort); + + final SendPort _controlPort; + + @override + Future emitValue( + String topic, + T value, { + PayloadCodec? codec, + }) async { + final encoded = codec != null ? codec.encodeDynamic(value) : value; + if (encoded is! Map) { + throw StateError( + 'TaskInvocationContext workflow events must encode to ' + 'Map, got ${encoded.runtimeType}.', + ); + } + final payload = {}; + for (final entry in encoded.entries) { + final key = entry.key; + if (key is! 
String) { + throw StateError( + 'TaskInvocationContext workflow event payload keys must be strings, ' + 'got ${key.runtimeType}.', + ); + } + payload[key] = entry.value; + } + + final responsePort = ReceivePort(); + _controlPort.send( + EmitWorkflowEventSignal( + EmitWorkflowEventRequest(topic: topic, payload: payload), + responsePort.sendPort, + ), + ); + final response = await responsePort.first; + responsePort.close(); + if (response is EmitWorkflowEventResponse) { + if (response.error != null) { + throw StateError(response.error!); + } + return; + } + throw StateError('Unexpected workflow event response: $response'); + } + + @override + Future emitEvent(WorkflowEventRef event, T value) { + return emitValue(event.topic, value, codec: event.codec); + } } diff --git a/packages/stem/lib/src/core/task_result.dart b/packages/stem/lib/src/core/task_result.dart index c0efbb62..a92a03e2 100644 --- a/packages/stem/lib/src/core/task_result.dart +++ b/packages/stem/lib/src/core/task_result.dart @@ -1,4 +1,5 @@ import 'package:stem/src/core/contracts.dart'; +import 'package:stem/src/core/payload_codec.dart'; import 'package:stem/src/core/stem.dart' show Stem; import 'package:stem/stem.dart' show Stem; @@ -23,6 +24,58 @@ class TaskResult { /// Decoded payload when the task succeeded. final T? value; + /// Returns [value] or [fallback] when the task has no decoded result. + T valueOr(T fallback) => value ?? fallback; + + /// Returns the decoded value, throwing when it is absent. + T requiredValue() { + final resolved = value; + if (resolved == null) { + throw StateError( + "Task '$taskId' does not have a decoded result value.", + ); + } + return resolved; + } + + /// Decodes the raw persisted task payload with [codec]. + TResult? payloadAs({required PayloadCodec codec}) { + final stored = rawPayload; + if (stored == null) return null; + return codec.decode(stored); + } + + /// Decodes the raw persisted task payload with a JSON decoder. + TResult? 
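+ // A hedged usage sketch for the TaskResult helpers above; OrderSummary and
+ // its fromJson factory are illustrative assumptions, not part of this API:
+ //
+ //   final summary = result.payloadJson(
+ //     decode: OrderSummary.fromJson,
+ //   );
+ //   final value = result.requiredValue();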
payloadJson({ + required TResult Function(Map payload) decode, + String? typeName, + }) { + final stored = rawPayload; + if (stored == null) return null; + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(stored); + } + + /// Decodes the raw persisted task payload with a version-aware JSON + /// decoder. + TResult? payloadVersionedJson({ + required int version, + required TResult Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + final stored = rawPayload; + if (stored == null) return null; + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(stored); + } + /// Raw payload stored by the backend (useful for debugging or manual casts). final Object? rawPayload; diff --git a/packages/stem/lib/src/observability/heartbeat.dart b/packages/stem/lib/src/observability/heartbeat.dart index e9e74145..9acfff1b 100644 --- a/packages/stem/lib/src/observability/heartbeat.dart +++ b/packages/stem/lib/src/observability/heartbeat.dart @@ -1,5 +1,8 @@ import 'dart:convert'; +import 'package:stem/src/core/payload_codec.dart'; +import 'package:stem/src/core/payload_map.dart'; + /// Structured payload describing worker state for external monitoring systems. class WorkerHeartbeat { /// Captures the current worker state at [timestamp] using optional [extras]. @@ -68,6 +71,53 @@ class WorkerHeartbeat { /// Additional metadata for downstream consumers. final Map extras; + /// Decodes the full extras payload as a typed DTO with [codec]. + T extrasAs({required PayloadCodec codec}) { + return codec.decode(extras); + } + + /// Decodes the full extras payload as a typed DTO with a JSON decoder. + T extrasJson({ + required T Function(Map payload) decode, + String? 
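+ // A hedged usage sketch for this class's extras helpers; the 'queueDepth'
+ // key is an illustrative assumption, not part of this API:
+ //
+ //   final depth = heartbeat.extraValueOr('queueDepth', 0);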
typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(extras); + } + + /// Decodes the full extras payload as a typed DTO with a version-aware JSON + /// decoder. + T extrasVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(extras); + } + + /// Returns the decoded extras value for [key], or `null` when absent. + T? extraValue(String key, {PayloadCodec? codec}) { + return extras.value(key, codec: codec); + } + + /// Returns the decoded extras value for [key], or [fallback] when absent. + T extraValueOr(String key, T fallback, {PayloadCodec? codec}) { + return extras.valueOr(key, fallback, codec: codec); + } + + /// Returns the decoded extras value for [key], throwing when absent. + T requiredExtraValue(String key, {PayloadCodec? codec}) { + return extras.requiredValue(key, codec: codec); + } + /// Serializes this heartbeat into a JSON-ready map for transport or storage. Map toJson() => { 'workerId': workerId, diff --git a/packages/stem/lib/src/observability/logging.dart b/packages/stem/lib/src/observability/logging.dart index 7d820031..c732c292 100644 --- a/packages/stem/lib/src/observability/logging.dart +++ b/packages/stem/lib/src/observability/logging.dart @@ -1,18 +1,43 @@ +import 'dart:convert'; + +import 'package:ansicolor/ansicolor.dart'; import 'package:contextual/contextual.dart'; -Logger _buildStemLogger() { - return Logger()..addChannel( - 'console', - ConsoleLogDriver(), - formatter: PlainTextLogFormatter( - settings: FormatterSettings( - includePrefix: false, - ), - ), - ); +/// Available output formats for the shared Stem logger. +enum StemLogFormat { + /// Plain logfmt-style output without ANSI color codes. 
+ plain, + + /// Colored terminal output intended for interactive local development. + pretty, +} + +/// Creates a formatter matching the shared Stem logging presets. +LogMessageFormatter createStemLogFormatter(StemLogFormat format) { + final settings = FormatterSettings(includePrefix: false); + return switch (format) { + StemLogFormat.pretty => _StemPrettyLogFormatter(settings: settings), + StemLogFormat.plain => PlainTextLogFormatter(settings: settings), + }; +} + +/// Creates a logger configured the same way Stem configures its shared logger. +Logger createStemLogger({ + Level level = Level.info, + StemLogFormat format = StemLogFormat.pretty, + bool enableConsole = false, +}) { + final logger = Logger( + formatter: createStemLogFormatter(format), + defaultChannelEnabled: false, + )..setLevel(level); + if (enableConsole) { + logger.addChannel('console', ConsoleLogDriver()); + } + return logger; } -Logger _stemLogger = _buildStemLogger(); +Logger _stemLogger = createStemLogger(); /// Shared logger configured with console output suitable for worker /// diagnostics. @@ -52,6 +77,224 @@ Context stemLogContext({ } /// Sets the minimum log [level] for the shared [stemLogger]. -void configureStemLogging({Level level = Level.info}) { +void configureStemLogging({ + Level level = Level.info, + StemLogFormat? 
format, + bool enableConsole = true, +}) { stemLogger.setLevel(level); + if (format != null) { + stemLogger.formatter(createStemLogFormatter(format)); + } + if (enableConsole) { + stemLogger.addChannel('console', ConsoleLogDriver()); + } else { + stemLogger.removeChannel('console'); + } +} + +class _StemPrettyLogFormatter extends LogMessageFormatter { + _StemPrettyLogFormatter({super.settings}); + + static final AnsiPen _keyPen = AnsiPen()..blue(bold: true); + static final AnsiPen _timestampPen = AnsiPen()..blue(); + static final AnsiPen _contextKeyPen = AnsiPen()..magenta(bold: true); + static final AnsiPen _prefixPen = AnsiPen()..cyan(); + static final AnsiPen _stackTracePen = AnsiPen()..red(); + + @override + String format(LogRecord record) { + final levelPen = _levelPen(record.level); + final parts = []; + + if (settings.includeTimestamp) { + final timestamp = settings.formatTimestamp(record.time); + parts.add( + '${_keyPen('time')}=${_timestampPen(_formatLogfmtValue(timestamp))}', + ); + } + + if (settings.includeLevel) { + parts.add( + '${_keyPen('level')}=' + '${levelPen(_formatLogfmtValue(record.level.name))}', + ); + } + + if (settings.includePrefix && record.context.has('prefix')) { + final prefix = record.context.get('prefix'); + parts.add( + '${_keyPen('prefix')}=${_prefixPen(_formatLogfmtValue(prefix))}', + ); + } + + final formattedMessage = _interpolateStemMessage( + record.message, + record.context, + ); + parts.add('${_keyPen('msg')}=${_formatLogfmtValue(formattedMessage)}'); + + final contextData = settings.includeHidden + ? 
record.context.all() + : record.context.visible(); + if (settings.includeContext && contextData.isNotEmpty) { + final contextEntries = Map.from(contextData); + if (settings.includePrefix) { + contextEntries.remove('prefix'); + } + final flattened = _flattenLogfmtContext(contextEntries); + for (final entry in flattened.entries) { + parts.add( + '${_contextKeyPen(_formatLogfmtKey(entry.key))}' + '=${_formatLogfmtValue(entry.value)}', + ); + } + } + + if (record.stackTraceProvided && record.stackTrace != null) { + parts.add( + '${_keyPen('stackTrace')}=' + '${_stackTracePen(_formatLogfmtValue(record.stackTrace.toString()))}', + ); + } + + return parts.join(' '); + } + + AnsiPen _levelPen(Level level) { + return switch (level) { + Level.debug => AnsiPen()..blue(), + Level.info => AnsiPen()..green(), + Level.notice => AnsiPen()..cyan(), + Level.warning => AnsiPen()..yellow(), + Level.error => AnsiPen()..red(), + Level.alert || Level.emergency => AnsiPen()..red(bold: true), + _ => AnsiPen()..white(), + }; + } +} + +String _interpolateStemMessage(String message, Context context) { + var resolved = message; + final placeholderPattern = RegExp(r'\{([^}]+)\}'); + final matches = placeholderPattern.allMatches(resolved).toList(); + + for (final match in matches) { + final rawKey = match.group(1)!; + final value = _dotLookup(context.all(), rawKey)?.toString(); + if (value == null) continue; + if (!resolved.contains('{$rawKey}')) continue; + resolved = resolved.replaceAll('{$rawKey}', value); + } + + return resolved; +} + +Object? _dotLookup(Map source, String key) { + final segments = key.split('.'); + Object? current = source; + for (final segment in segments) { + if (current is! 
Map) return null; + current = current[segment]; + } + return current; +} + +final _logfmtKeyChar = RegExp('[A-Za-z0-9_.:-]'); + +String _formatLogfmtKey(String key) { + if (key.isEmpty) { + return 'context'; + } + final buffer = StringBuffer(); + for (final rune in key.runes) { + final char = String.fromCharCode(rune); + buffer.write(_logfmtKeyChar.hasMatch(char) ? char : '_'); + } + return buffer.toString(); +} + +String _formatLogfmtValue(Object? value) { + final raw = _stringifyLogfmtValue(value); + if (_needsLogfmtQuoting(raw)) { + return '"${_escapeLogfmt(raw)}"'; + } + return raw; +} + +Map _flattenLogfmtContext( + Map context, { + String prefix = '', +}) { + final flattened = {}; + + void addEntry(String key, dynamic value) { + final fullKey = prefix.isEmpty ? key : '$prefix$key'; + if (value is Map) { + value.forEach((nestedKey, nestedValue) { + final nestedKeyString = nestedKey?.toString() ?? ''; + final combinedKey = fullKey.isEmpty + ? nestedKeyString + : '$fullKey.$nestedKeyString'; + flattened.addAll( + _flattenLogfmtContext({combinedKey: nestedValue}), + ); + }); + return; + } + flattened[fullKey] = value; + } + + context.forEach(addEntry); + return flattened; +} + +String _stringifyLogfmtValue(Object? 
value) { + if (value == null) return 'null'; + if (value is String) return value; + if (value is num || value is bool) return value.toString(); + if (value is DateTime) return value.toIso8601String(); + if (value is Map || value is Iterable) { + try { + return jsonEncode(value); + } on Object { + return value.toString(); + } + } + return value.toString(); +} + +bool _needsLogfmtQuoting(String value) { + if (value.isEmpty) return true; + for (var i = 0; i < value.length; i++) { + final code = value.codeUnitAt(i); + if (code == 0x20 || code == 0x09 || code == 0x0A || code == 0x0D) { + return true; + } + if (code == 0x22 || code == 0x5C || code == 0x3D) { + return true; + } + } + return false; +} + +String _escapeLogfmt(String value) { + final buffer = StringBuffer(); + for (final rune in value.runes) { + switch (rune) { + case 0x22: + buffer.write(r'\"'); + case 0x5C: + buffer.write(r'\\'); + case 0x0A: + buffer.write(r'\n'); + case 0x0D: + buffer.write(r'\r'); + case 0x09: + buffer.write(r'\t'); + default: + buffer.writeCharCode(rune); + } + } + return buffer.toString(); } diff --git a/packages/stem/lib/src/signals/payloads.dart b/packages/stem/lib/src/signals/payloads.dart index c334d748..d4c1a4d4 100644 --- a/packages/stem/lib/src/signals/payloads.dart +++ b/packages/stem/lib/src/signals/payloads.dart @@ -2,6 +2,8 @@ import 'package:stem/src/control/control_messages.dart'; import 'package:stem/src/core/clock.dart'; import 'package:stem/src/core/contracts.dart'; import 'package:stem/src/core/envelope.dart'; +import 'package:stem/src/core/payload_codec.dart'; +import 'package:stem/src/core/payload_map.dart'; import 'package:stem/src/core/stem_event.dart'; /// Status of a workflow run emitted via signals. @@ -211,6 +213,43 @@ class TaskPostrunPayload implements StemEvent { /// The result returned by the task. final Object? result; + /// Decodes the task result with [codec]. + TResult? 
resultAs<TResult>({required PayloadCodec<TResult> codec}) {
+    final stored = result;
+    if (stored == null) return null;
+    return codec.decode(stored);
+  }
+
+  /// Decodes the task result with a JSON decoder.
+  TResult? resultJson<TResult>({
+    required TResult Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    final stored = result;
+    if (stored == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
+  /// Decodes the task result with a version-aware JSON decoder.
+  TResult? resultVersionedJson<TResult>({
+    required int version,
+    required TResult Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final stored = result;
+    if (stored == null) return null;
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
   /// The final state of the task.
   final TaskState state;
 
@@ -315,6 +354,43 @@ class TaskSuccessPayload implements StemEvent {
   /// The result returned by the successful task.
   final Object? result;
 
+  /// Decodes the task result with [codec].
+  TResult? resultAs<TResult>({required PayloadCodec<TResult> codec}) {
+    final stored = result;
+    if (stored == null) return null;
+    return codec.decode(stored);
+  }
+
+  /// Decodes the task result with a JSON decoder.
+  TResult? resultJson<TResult>({
+    required TResult Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    final stored = result;
+    if (stored == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
+  /// Decodes the task result with a version-aware JSON decoder.
+  TResult? resultVersionedJson<TResult>({
+    required int version,
+    required TResult Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String?
typeName, + }) { + final stored = result; + if (stored == null) return null; + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(stored); + } + final DateTime _occurredAt; /// The unique identifier for the task. @@ -555,6 +631,95 @@ class WorkflowRunPayload implements StemEvent { /// Additional metadata associated with the workflow run. final Map metadata; + /// Returns the decoded metadata value for [key], or `null` when absent. + /// + /// When [codec] is supplied, the stored durable payload is decoded through + /// that codec before being returned. + T? metadataValue(String key, {PayloadCodec? codec}) { + return metadata.value(key, codec: codec); + } + + /// Decodes the metadata value for [key] as a typed DTO with [codec]. + T? metadataAs(String key, {required PayloadCodec codec}) { + return metadata.value(key, codec: codec); + } + + /// Decodes the full metadata payload as a typed DTO with [codec]. + T metadataPayloadAs({required PayloadCodec codec}) { + return codec.decode(metadata); + } + + /// Decodes the metadata value for [key] as a typed DTO with a JSON decoder. + T? metadataJson( + String key, { + required T Function(Map payload) decode, + String? typeName, + }) { + return metadata.valueJson( + key, + decode: decode, + typeName: typeName, + ); + } + + /// Decodes the full metadata payload as a typed DTO with a JSON decoder. + T metadataPayloadJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(metadata); + } + + /// Decodes the metadata value for [key] as a typed DTO with a version-aware + /// JSON decoder. + T? metadataVersionedJson( + String key, { + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? 
typeName, + }) { + return metadata.valueJson( + key, + decode: (json) => PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(json), + typeName: typeName, + ); + } + + /// Decodes the full metadata payload as a typed DTO with a version-aware + /// JSON decoder. + T metadataPayloadVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(metadata); + } + + /// Returns the decoded metadata value for [key], or [fallback] when absent. + T metadataValueOr(String key, T fallback, {PayloadCodec? codec}) { + return metadata.valueOr(key, fallback, codec: codec); + } + + /// Returns the decoded metadata value for [key], throwing when absent. + T requiredMetadataValue(String key, {PayloadCodec? codec}) { + return metadata.requiredValue(key, codec: codec); + } + /// Optional canonical signal name when this payload is emitted. final String? signalName; @@ -743,9 +908,135 @@ class ControlCommandCompletedPayload implements StemEvent { /// The response data from the command execution, if any. final Map? response; + /// Returns the decoded response value for [key], or `null` when absent. + T? responseValue(String key, {PayloadCodec? codec}) { + final payload = response; + if (payload == null) return null; + return payload.value(key, codec: codec); + } + + /// Returns the decoded response value for [key], or [fallback] when absent. + T responseValueOr(String key, T fallback, {PayloadCodec? codec}) { + final payload = response; + if (payload == null) return fallback; + return payload.valueOr(key, fallback, codec: codec); + } + + /// Returns the decoded response value for [key], throwing when absent. 
+ T requiredResponseValue(String key, {PayloadCodec? codec}) { + final payload = response; + if (payload == null) { + throw StateError( + 'ControlCommandCompletedPayload.response does not contain "$key".', + ); + } + return payload.requiredValue(key, codec: codec); + } + + /// Decodes the full response payload as a typed DTO with [codec]. + T? responseAs({required PayloadCodec codec}) { + final payload = response; + if (payload == null) return null; + return codec.decode(payload); + } + + /// Decodes the full response payload as a typed DTO with a JSON decoder. + T? responseJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + final payload = response; + if (payload == null) return null; + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(payload); + } + + /// Decodes the full response payload as a typed DTO with a version-aware + /// JSON decoder. + T? responseVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + final payload = response; + if (payload == null) return null; + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(payload); + } + /// Error information if the command failed, if any. final Map? error; + /// Returns the decoded error value for [key], or `null` when absent. + T? errorValue(String key, {PayloadCodec? codec}) { + final payload = error; + if (payload == null) return null; + return payload.value(key, codec: codec); + } + + /// Returns the decoded error value for [key], or [fallback] when absent. + T errorValueOr(String key, T fallback, {PayloadCodec? codec}) { + final payload = error; + if (payload == null) return fallback; + return payload.valueOr(key, fallback, codec: codec); + } + + /// Returns the decoded error value for [key], throwing when absent. 
+ T requiredErrorValue(String key, {PayloadCodec? codec}) { + final payload = error; + if (payload == null) { + throw StateError( + 'ControlCommandCompletedPayload.error does not contain "$key".', + ); + } + return payload.requiredValue(key, codec: codec); + } + + /// Decodes the full error payload as a typed DTO with [codec]. + T? errorAs({required PayloadCodec codec}) { + final payload = error; + if (payload == null) return null; + return codec.decode(payload); + } + + /// Decodes the full error payload as a typed DTO with a JSON decoder. + T? errorJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + final payload = error; + if (payload == null) return null; + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(payload); + } + + /// Decodes the full error payload as a typed DTO with a version-aware JSON + /// decoder. + T? errorVersionedJson({ + required int version, + required T Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? 
typeName, + }) { + final payload = error; + if (payload == null) return null; + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(payload); + } + final DateTime _occurredAt; @override diff --git a/packages/stem/lib/src/worker/isolate_messages.dart b/packages/stem/lib/src/worker/isolate_messages.dart index b10d4c5f..b0a027e7 100644 --- a/packages/stem/lib/src/worker/isolate_messages.dart +++ b/packages/stem/lib/src/worker/isolate_messages.dart @@ -179,6 +179,7 @@ void taskWorkerIsolate(SendPort handshakePort) { if (message is TaskRunRequest) { final invocationContext = TaskInvocationContext.remote( id: message.id, + args: message.args, controlPort: message.controlPort, headers: message.headers, meta: message.meta, diff --git a/packages/stem/lib/src/worker/worker.dart b/packages/stem/lib/src/worker/worker.dart index 8e58795d..9565fa91 100644 --- a/packages/stem/lib/src/worker/worker.dart +++ b/packages/stem/lib/src/worker/worker.dart @@ -106,6 +106,8 @@ import 'package:stem/src/core/clock.dart'; import 'package:stem/src/core/contracts.dart'; import 'package:stem/src/core/encoder_keys.dart'; import 'package:stem/src/core/envelope.dart'; +import 'package:stem/src/core/payload_codec.dart'; +import 'package:stem/src/core/payload_map.dart'; import 'package:stem/src/core/retry.dart'; import 'package:stem/src/core/stem.dart'; import 'package:stem/src/core/stem_event.dart'; @@ -123,6 +125,9 @@ import 'package:stem/src/signals/emitter.dart'; import 'package:stem/src/signals/payloads.dart'; import 'package:stem/src/worker/isolate_pool.dart'; import 'package:stem/src/worker/worker_config.dart'; +import 'package:stem/src/workflow/core/workflow_cancellation_policy.dart'; +import 'package:stem/src/workflow/core/workflow_event_ref.dart'; +import 'package:stem/src/workflow/core/workflow_ref.dart'; /// Shutdown modes for workers. 
///
@@ -270,6 +275,10 @@ class Worker {
   /// is created and populated from [tasks].
   /// - [enqueuer]: [Stem] instance for spawning child tasks from handlers.
   ///   Created automatically if not provided.
+  /// - [workflows]: Optional workflow caller used when task handlers need to
+  ///   start or wait for child workflows.
+  /// - [workflowEvents]: Optional workflow event emitter used when task
+  ///   handlers need to resume waiting workflows by topic or typed event ref.
   /// - [rateLimiter]: Enforces per-task rate limits. Rate limits are defined
   ///   on individual handlers via [TaskOptions.rateLimit].
   /// - [middleware]: List of middleware for intercepting task lifecycle events.
@@ -304,6 +313,8 @@
     Iterable<TaskHandler<Object?>> tasks = const [],
     TaskRegistry? registry,
     Stem? enqueuer,
+    WorkflowCaller? workflows,
+    WorkflowEventEmitter? workflowEvents,
     RateLimiter? rateLimiter,
     List middleware = const [],
     RevokeStore? revokeStore,
@@ -330,6 +341,8 @@
   }) : this._(
          broker: broker,
          enqueuer: enqueuer,
+         workflows: workflows,
+         workflowEvents: workflowEvents,
          registry: _resolveTaskRegistry(registry, tasks),
          backend: backend,
          rateLimiter: rateLimiter,
@@ -362,6 +375,8 @@
     required this.registry,
     required this.backend,
     required Stem? enqueuer,
+    this.workflows,
+    this.workflowEvents,
     this.rateLimiter,
     this.middleware = const [],
     this.revokeStore,
@@ -425,7 +440,6 @@
       signer: signer,
       encoderRegistry: payloadEncoders,
     );
-
     _maxConcurrency = this.concurrency;
 
     final autoscaleMax =
@@ -551,6 +565,13 @@
   /// Enqueuer used by task contexts for spawning new work.
   Stem? _enqueuer;
 
+  /// Workflow caller used by task contexts for child workflow operations.
+  /// Remains `null` until a workflow caller is configured for this worker.
+  WorkflowCaller? workflows;
+
+  /// Workflow event emitter used by task contexts for workflow resumes.
+  WorkflowEventEmitter?
workflowEvents; + static final math.Random _random = math.Random(); /// Resolved routing subscription for this worker. @@ -949,6 +970,7 @@ class Worker { final context = TaskContext( id: envelope.id, + args: envelope.args, attempt: envelope.attempt, headers: envelope.headers, meta: envelope.meta, @@ -968,6 +990,8 @@ class Worker { _reportProgress(envelope, progress, data: data); }, enqueuer: _enqueuer, + workflows: workflows, + workflowEvents: workflowEvents, ); await _signals.taskPrerun(envelope, _workerInfoSnapshot, context); @@ -1938,7 +1962,7 @@ class Worker { envelope, extra: { 'error': error.message, - if (error.keyId != null) 'keyId': error.keyId!, + if (error.keyId != null) 'keyId': error.keyId, }, ), ), @@ -4493,6 +4517,7 @@ class Worker { signal.request.enqueueOptions!.cast(), ) : null; + final notBefore = signal.request.notBefore; final enqueuer = _enqueuer; if (enqueuer == null) { signal.replyPort.send( @@ -4505,6 +4530,7 @@ class Worker { args: signal.request.args, headers: signal.request.headers, options: options, + notBefore: notBefore, meta: signal.request.meta, enqueueOptions: enqueueOptions, ); @@ -4514,10 +4540,97 @@ class Worker { TaskEnqueueResponse(error: error.toString()), ); } + } else if (signal is StartWorkflowSignal) { + try { + final workflows = this.workflows; + if (workflows == null) { + signal.replyPort.send( + const StartWorkflowResponse( + error: 'No workflow caller configured', + ), + ); + return; + } + final runId = await workflows.startWorkflowRef( + _workerWorkflowRef(signal.request.workflowName), + signal.request.params, + parentRunId: signal.request.parentRunId, + ttl: signal.request.ttlMs == null + ? 
null
+              : Duration(milliseconds: signal.request.ttlMs!),
+          cancellationPolicy: WorkflowCancellationPolicy.fromJson(
+            signal.request.cancellationPolicy,
+          ),
+        );
+        signal.replyPort.send(StartWorkflowResponse(runId: runId));
+      } on Exception catch (error) {
+        signal.replyPort.send(
+          StartWorkflowResponse(error: error.toString()),
+        );
+      }
+    } else if (signal is WaitForWorkflowSignal) {
+      try {
+        final workflows = this.workflows;
+        if (workflows == null) {
+          signal.replyPort.send(
+            const WaitForWorkflowResponse(
+              error: 'No workflow caller configured',
+            ),
+          );
+          return;
+        }
+        final result = await workflows.waitForWorkflowRef(
+          signal.request.runId,
+          _workerWorkflowRef(signal.request.workflowName),
+          pollInterval: signal.request.pollIntervalMs == null
+              ? const Duration(milliseconds: 100)
+              : Duration(milliseconds: signal.request.pollIntervalMs!),
+          timeout: signal.request.timeoutMs == null
+              ? null
+              : Duration(milliseconds: signal.request.timeoutMs!),
+        );
+        signal.replyPort.send(
+          WaitForWorkflowResponse(
+            result: result?.toJson(),
+          ),
+        );
+      } on Exception catch (error) {
+        signal.replyPort.send(
+          WaitForWorkflowResponse(error: error.toString()),
+        );
+      }
+    } else if (signal is EmitWorkflowEventSignal) {
+      try {
+        final workflowEvents = this.workflowEvents;
+        if (workflowEvents == null) {
+          signal.replyPort.send(
+            const EmitWorkflowEventResponse(
+              error: 'No workflow event emitter configured',
+            ),
+          );
+          return;
+        }
+        await workflowEvents.emitValue<Map<String, Object?>>(
+          signal.request.topic,
+          signal.request.payload,
+        );
+        signal.replyPort.send(const EmitWorkflowEventResponse());
+      } on Exception catch (error) {
+        signal.replyPort.send(
+          EmitWorkflowEventResponse(error: error.toString()),
+        );
+      }
     }
   };
 }
+
+  WorkflowRef<Map<String, Object?>, Object?> _workerWorkflowRef(String name) {
+    return WorkflowRef<Map<String, Object?>, Object?>(
+      name: name,
+      encodeParams: (params) => params,
+    );
+  }
+
   /// Lazily creates or returns the worker isolate pool.
Future _ensureIsolatePool() {
   final existing = _isolatePool;
@@ -4661,6 +4774,67 @@ class WorkerEvent implements StemEvent {
   /// Additional data for the event.
   final Map<String, Object?>? data;
 
+  /// Returns the decoded data value for [key], or `null` when absent.
+  T? dataValue<T>(String key, {PayloadCodec<T>? codec}) {
+    final payload = data;
+    if (payload == null) return null;
+    return payload.value(key, codec: codec);
+  }
+
+  /// Returns the decoded data value for [key], or [fallback] when absent.
+  T dataValueOr<T>(String key, T fallback, {PayloadCodec<T>? codec}) {
+    final payload = data;
+    if (payload == null) return fallback;
+    return payload.valueOr(key, fallback, codec: codec);
+  }
+
+  /// Returns the decoded data value for [key], throwing when absent.
+  T requiredDataValue<T>(String key, {PayloadCodec<T>? codec}) {
+    final payload = data;
+    if (payload == null) {
+      throw StateError('WorkerEvent.data does not contain "$key".');
+    }
+    return payload.requiredValue(key, codec: codec);
+  }
+
+  /// Decodes the full data payload as a typed DTO with [codec].
+  T? dataAs<T>({required PayloadCodec<T> codec}) {
+    final payload = data;
+    if (payload == null) return null;
+    return codec.decode(payload);
+  }
+
+  /// Decodes the full data payload as a typed DTO with a JSON decoder.
+  T? dataJson<T>({
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    final payload = data;
+    if (payload == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(payload);
+  }
+
+  /// Decodes the full data payload as a typed DTO with a version-aware JSON
+  /// decoder.
+  T? dataVersionedJson<T>({
+    required int version,
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String?
typeName, + }) { + final payload = data; + if (payload == null) return null; + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(payload); + } + @override String get eventName => 'worker.${type.name}'; diff --git a/packages/stem/lib/src/workflow/core/flow.dart b/packages/stem/lib/src/workflow/core/flow.dart index 5a5cdd97..000d213d 100644 --- a/packages/stem/lib/src/workflow/core/flow.dart +++ b/packages/stem/lib/src/workflow/core/flow.dart @@ -1,5 +1,8 @@ import 'package:stem/src/core/payload_codec.dart'; +import 'package:stem/src/workflow/core/workflow_cancellation_policy.dart'; import 'package:stem/src/workflow/core/workflow_definition.dart'; +import 'package:stem/src/workflow/core/workflow_ref.dart'; +import 'package:stem/src/workflow/core/workflow_result.dart'; /// Convenience wrapper that builds a [WorkflowDefinition] using the declarative /// [FlowBuilder] DSL. @@ -17,6 +20,8 @@ class Flow { String? description, Map? metadata, PayloadCodec? resultCodec, + T Function(Map payload)? decodeResultJson, + String? resultTypeName, }) : definition = WorkflowDefinition.flow( name: name, build: build, @@ -24,8 +29,336 @@ class Flow { description: description, metadata: metadata, resultCodec: resultCodec, + decodeResultJson: decodeResultJson, + resultTypeName: resultTypeName, ); + /// Creates a flow definition whose final result uses a custom payload codec. + factory Flow.codec({ + required String name, + required void Function(FlowBuilder builder) build, + required PayloadCodec resultCodec, + String? version, + String? description, + Map? metadata, + }) { + return Flow( + name: name, + build: build, + version: version, + description: description, + metadata: metadata, + resultCodec: resultCodec, + ); + } + + /// Creates a flow definition whose final result is a DTO-backed JSON value. 
+ factory Flow.json({ + required String name, + required void Function(FlowBuilder builder) build, + required T Function(Map payload) decodeResult, + String? version, + String? description, + Map? metadata, + String? resultTypeName, + }) { + return Flow( + name: name, + build: build, + version: version, + description: description, + metadata: metadata, + decodeResultJson: decodeResult, + resultTypeName: resultTypeName, + ); + } + + /// Creates a flow definition whose final result is a versioned DTO-backed + /// JSON value. + factory Flow.versionedJson({ + required String name, + required void Function(FlowBuilder builder) build, + required int version, + required T Function(Map payload, int version) decodeResult, + String? workflowVersion, + String? description, + Map? metadata, + int? defaultDecodeVersion, + String? resultTypeName, + }) { + return Flow( + name: name, + build: build, + version: workflowVersion, + description: description, + metadata: metadata, + resultCodec: PayloadCodec.versionedJson( + version: version, + decode: decodeResult, + defaultDecodeVersion: defaultDecodeVersion, + typeName: resultTypeName ?? '$T', + ), + ); + } + + /// Creates a flow definition whose final result uses a reusable version + /// registry. + factory Flow.versionedJsonRegistry({ + required String name, + required void Function(FlowBuilder builder) build, + required int version, + required PayloadVersionRegistry resultRegistry, + String? workflowVersion, + String? description, + Map? metadata, + int? defaultDecodeVersion, + String? resultTypeName, + }) { + return Flow( + name: name, + build: build, + version: workflowVersion, + description: description, + metadata: metadata, + resultCodec: PayloadCodec.versionedJsonRegistry( + version: version, + registry: resultRegistry, + defaultDecodeVersion: defaultDecodeVersion, + typeName: resultTypeName ?? '$T', + ), + ); + } + + /// Creates a flow definition whose final result is a versioned custom map + /// payload. 
+ factory Flow.versionedMap({ + required String name, + required void Function(FlowBuilder builder) build, + required Object? Function(T value) encodeResult, + required int version, + required T Function(Map payload, int version) decodeResult, + String? workflowVersion, + String? description, + Map? metadata, + int? defaultDecodeVersion, + String? resultTypeName, + }) { + return Flow( + name: name, + build: build, + version: workflowVersion, + description: description, + metadata: metadata, + resultCodec: PayloadCodec.versionedMap( + encode: encodeResult, + version: version, + decode: decodeResult, + defaultDecodeVersion: defaultDecodeVersion, + typeName: resultTypeName ?? '$T', + ), + ); + } + + /// Creates a flow definition whose final result is a versioned custom map + /// payload decoded through a reusable registry. + factory Flow.versionedMapRegistry({ + required String name, + required void Function(FlowBuilder builder) build, + required Object? Function(T value) encodeResult, + required int version, + required PayloadVersionRegistry resultRegistry, + String? workflowVersion, + String? description, + Map? metadata, + int? defaultDecodeVersion, + String? resultTypeName, + }) { + return Flow( + name: name, + build: build, + version: workflowVersion, + description: description, + metadata: metadata, + resultCodec: PayloadCodec.versionedMapRegistry( + encode: encodeResult, + version: version, + registry: resultRegistry, + defaultDecodeVersion: defaultDecodeVersion, + typeName: resultTypeName ?? '$T', + ), + ); + } + /// The constructed workflow definition. final WorkflowDefinition definition; + + /// Builds a typed [WorkflowRef] using this flow's registered workflow name + /// and result decoder. + WorkflowRef ref({ + required Map Function(TParams params) encodeParams, + }) { + return definition.ref(encodeParams: encodeParams); + } + + /// Builds a typed [WorkflowRef] backed by a DTO [paramsCodec]. 
+  WorkflowRef<T, TParams> refCodec({
+    required PayloadCodec<TParams> paramsCodec,
+  }) {
+    return definition.refCodec(paramsCodec: paramsCodec);
+  }
+
+  /// Builds a typed [WorkflowRef] for DTO params that already expose
+  /// `toJson()`.
+  WorkflowRef<T, TParams> refJson({
+    T Function(Map<String, Object?> payload)? decodeResultJson,
+    T Function(Map<String, Object?> payload, int version)?
+    decodeResultVersionedJson,
+    int? defaultDecodeVersion,
+    String? paramsTypeName,
+    String? resultTypeName,
+  }) {
+    return definition.refJson(
+      decodeResultJson: decodeResultJson,
+      decodeResultVersionedJson: decodeResultVersionedJson,
+      defaultDecodeVersion: defaultDecodeVersion,
+      paramsTypeName: paramsTypeName,
+      resultTypeName: resultTypeName,
+    );
+  }
+
+  /// Builds a typed [WorkflowRef] for DTO params that already expose
+  /// `toJson()` and persist a schema [version] beside the payload.
+  WorkflowRef<T, TParams> refVersionedJson({
+    required int version,
+    T Function(Map<String, Object?> payload)? decodeResultJson,
+    T Function(Map<String, Object?> payload, int version)?
+    decodeResultVersionedJson,
+    int? defaultDecodeVersion,
+    String? paramsTypeName,
+    String? resultTypeName,
+  }) {
+    return definition.refVersionedJson(
+      version: version,
+      decodeResultJson: decodeResultJson,
+      decodeResultVersionedJson: decodeResultVersionedJson,
+      defaultDecodeVersion: defaultDecodeVersion,
+      paramsTypeName: paramsTypeName,
+      resultTypeName: resultTypeName,
+    );
+  }
+
+  /// Builds a typed [WorkflowRef] for DTO params that already expose
+  /// `toJson()` and decode versioned results through a reusable registry.
+  WorkflowRef<T, TParams> refVersionedJsonRegistry({
+    required int version,
+    required PayloadVersionRegistry<T> resultRegistry,
+    int? defaultDecodeVersion,
+    String? paramsTypeName,
+    String? resultTypeName,
+  }) {
+    return definition.refVersionedJsonRegistry(
+      version: version,
+      resultRegistry: resultRegistry,
+      defaultDecodeVersion: defaultDecodeVersion,
+      paramsTypeName: paramsTypeName,
+      resultTypeName: resultTypeName,
+    );
+  }
+
+  /// Builds a typed [WorkflowRef] for custom map params that persist a schema
+  /// [version] beside the payload.
+  WorkflowRef<T, TParams> refVersionedMap({
+    required Object? Function(TParams params) encodeParams,
+    required int version,
+    T Function(Map<String, Object?> payload)? decodeResultJson,
+    T Function(Map<String, Object?> payload, int version)?
+    decodeResultVersionedJson,
+    int? defaultDecodeVersion,
+    String? paramsTypeName,
+    String? resultTypeName,
+  }) {
+    return definition.refVersionedMap(
+      encodeParams: encodeParams,
+      version: version,
+      decodeResultJson: decodeResultJson,
+      decodeResultVersionedJson: decodeResultVersionedJson,
+      defaultDecodeVersion: defaultDecodeVersion,
+      paramsTypeName: paramsTypeName,
+      resultTypeName: resultTypeName,
+    );
+  }
+
+  /// Builds a typed [WorkflowRef] for custom map params that persist a schema
+  /// [version] and decode versioned results through a reusable registry.
+  WorkflowRef<T, TParams> refVersionedMapRegistry({
+    required Object? Function(TParams params) encodeParams,
+    required int version,
+    required PayloadVersionRegistry<T> resultRegistry,
+    int? defaultDecodeVersion,
+    String? paramsTypeName,
+    String? resultTypeName,
+  }) {
+    return definition.refVersionedMapRegistry(
+      encodeParams: encodeParams,
+      version: version,
+      resultRegistry: resultRegistry,
+      defaultDecodeVersion: defaultDecodeVersion,
+      paramsTypeName: paramsTypeName,
+      resultTypeName: resultTypeName,
+    );
+  }
+
+  /// Builds a typed [NoArgsWorkflowRef] for flows without start params.
+  NoArgsWorkflowRef<T> ref0() {
+    return definition.ref0();
+  }
+
+  /// Starts this flow directly when it does not accept start params.
+  Future<String> start(
+    WorkflowCaller caller, {
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+  }) {
+    return ref0().start(
+      caller,
+      parentRunId: parentRunId,
+      ttl: ttl,
+      cancellationPolicy: cancellationPolicy,
+    );
+  }
+
+  /// Starts this flow directly and waits for completion.
+  Future<WorkflowResult<T>?> startAndWait(
+    WorkflowCaller caller, {
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+    Duration pollInterval = const Duration(milliseconds: 100),
+    Duration? timeout,
+  }) {
+    return ref0().startAndWait(
+      caller,
+      parentRunId: parentRunId,
+      ttl: ttl,
+      cancellationPolicy: cancellationPolicy,
+      pollInterval: pollInterval,
+      timeout: timeout,
+    );
+  }
+
+  /// Waits for [runId] using this flow's result decoding rules.
+  Future<WorkflowResult<T>?> waitFor(
+    WorkflowCaller caller,
+    String runId, {
+    Duration pollInterval = const Duration(milliseconds: 100),
+    Duration? timeout,
+  }) {
+    return ref0().waitFor(
+      caller,
+      runId,
+      pollInterval: pollInterval,
+      timeout: timeout,
+    );
+  }
 }
diff --git a/packages/stem/lib/src/workflow/core/flow_context.dart b/packages/stem/lib/src/workflow/core/flow_context.dart
index 33c8d866..fb1b3426 100644
--- a/packages/stem/lib/src/workflow/core/flow_context.dart
+++ b/packages/stem/lib/src/workflow/core/flow_context.dart
@@ -1,6 +1,11 @@
 import 'package:stem/src/core/contracts.dart';
+import 'package:stem/src/core/payload_codec.dart';
 import 'package:stem/src/workflow/core/flow_step.dart';
+import 'package:stem/src/workflow/core/workflow_cancellation_policy.dart';
 import 'package:stem/src/workflow/core/workflow_clock.dart';
+import 'package:stem/src/workflow/core/workflow_execution_context.dart';
+import 'package:stem/src/workflow/core/workflow_ref.dart';
+import 'package:stem/src/workflow/core/workflow_result.dart';
 
 /// Context provided to each workflow step invocation.
 ///
@@ -13,7 +18,7 @@
 /// [iteration] indicates how many times the step has already completed when
 /// `autoVersion` is enabled, allowing handlers to branch per loop iteration or
 /// derive unique identifiers.
-class FlowContext {
+class FlowContext implements WorkflowExecutionContext {
   /// Creates a workflow step context.
   FlowContext({
     required this.workflow,
@@ -26,37 +31,72 @@ class FlowContext {
     WorkflowClock clock = const SystemWorkflowClock(),
     Object? resumeData,
     this.enqueuer,
+    this.workflows,
   }) : _clock = clock,
       _resumeData = resumeData;
 
   /// Name of the workflow.
+  @override
   final String workflow;
 
   /// Identifier of the workflow run.
+  @override
   final String runId;
 
   /// Name of the current step.
+  @override
   final String stepName;
 
   /// Parameters passed when the workflow was started.
+  @override
   final Map<String, Object?> params;
 
   /// Result of the previous step, if any.
+  @override
   final Object? previousResult;
 
   /// Zero-based index of the current step.
+  @override
   final int stepIndex;
 
   /// Current iteration when auto-versioning is enabled.
+  @override
   final int iteration;
 
   /// Optional enqueuer for scheduling tasks with workflow metadata.
+  @override
   final TaskEnqueuer? enqueuer;
+
+  /// Optional typed workflow caller for spawning child workflows.
+  @override
+  final WorkflowCaller? workflows;
 
   final WorkflowClock _clock;
   FlowStepControl? _control;
   Object? _resumeData;
 
+  @override
+  Future<String> enqueueValue<T>(
+    String name,
+    T value, {
+    PayloadCodec<T>? codec,
+    Map<String, Object?> headers = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    Map<String, Object?> meta = const {},
+    TaskEnqueueOptions? enqueueOptions,
+  }) {
+    return enqueue(
+      name,
+      args: _encodeFlowContextValue(name, value, codec: codec),
+      headers: headers,
+      options: options,
+      notBefore: notBefore,
+      meta: meta,
+      enqueueOptions: enqueueOptions,
+    );
+  }
+
   /// Suspends the workflow until the delay elapses.
 ///
 /// After the delay, the worker replays the **same step** from the top. To
@@ -89,6 +129,35 @@ class FlowContext {
     return _control!;
   }
 
+  /// Suspends the workflow for [duration] with a JSON-serializable DTO payload.
+  FlowStepControl sleepJson<T>(Duration duration, T value, {String? typeName}) {
+    return sleep(
+      duration,
+      data: Map<String, Object?>.from(
+        PayloadCodec.encodeJsonMap(value, typeName: typeName),
+      ),
+    );
+  }
+
+  /// Suspends the workflow for [duration] with a versioned DTO payload.
+  FlowStepControl sleepVersionedJson<T>(
+    Duration duration,
+    T value, {
+    required int version,
+    String? typeName,
+  }) {
+    return sleep(
+      duration,
+      data: Map<String, Object?>.from(
+        PayloadCodec.encodeVersionedJsonMap(
+          value,
+          version: version,
+          typeName: typeName,
+        ),
+      ),
+    );
+  }
+
   /// Suspends the workflow until an event with [topic] is emitted.
   ///
   /// When the event bus resumes the run, the payload is made available via
@@ -107,6 +176,57 @@ class FlowContext {
     return _control!;
   }
 
+  /// Suspends the workflow until [topic] arrives with a DTO payload.
+  FlowStepControl awaitEventJson<T>(
+    String topic,
+    T value, {
+    DateTime? deadline,
+    String? typeName,
+  }) {
+    return awaitEvent(
+      topic,
+      deadline: deadline,
+      data: Map<String, Object?>.from(
+        PayloadCodec.encodeJsonMap(value, typeName: typeName),
+      ),
+    );
+  }
+
+  /// Suspends the workflow until [topic] arrives with a versioned DTO payload.
+  FlowStepControl awaitEventVersionedJson<T>(
+    String topic,
+    T value, {
+    required int version,
+    DateTime? deadline,
+    String? typeName,
+  }) {
+    return awaitEvent(
+      topic,
+      deadline: deadline,
+      data: Map<String, Object?>.from(
+        PayloadCodec.encodeVersionedJsonMap(
+          value,
+          version: version,
+          typeName: typeName,
+        ),
+      ),
+    );
+  }
+
+  @override
+  void suspendFor(Duration duration, {Map<String, Object?>? data}) {
+    sleep(duration, data: data);
+  }
+
+  @override
+  void waitForTopic(
+    String topic, {
+    DateTime? deadline,
+    Map<String, Object?>? data,
+  }) {
+    awaitEvent(topic, deadline: deadline, data: data);
+  }
+
   /// Injects a payload that will be returned the next time [takeResumeData] is
   /// called. Primarily used by the runtime; tests may also leverage it to mock
   /// resumption data.
@@ -122,6 +242,7 @@ class FlowContext {
   /// The method consumes the payload so subsequent calls during the same step
   /// return `null`. This makes it safe to guard control-flow with a simple
   /// `if (takeResumeData() == null) { ... }` pattern.
+  @override
   Object? takeResumeData() {
     final value = _resumeData;
     _resumeData = null;
@@ -139,6 +260,7 @@ class FlowContext {
   /// Returns a stable idempotency key derived from the workflow, run, and
   /// [scope]. Defaults to the current [stepName] (including iteration suffix
   /// when [iteration] > 0) when no scope is provided.
+  @override
   String idempotencyKey([String? scope]) {
     final defaultScope = iteration > 0 ? '$stepName#$iteration' : stepName;
     final effectiveScope = (scope == null || scope.isEmpty)
@@ -146,4 +268,130 @@ class FlowContext {
         : scope;
     return '$workflow/$runId/$effectiveScope';
   }
+
+  /// Enqueues a task using the workflow-scoped enqueuer.
+  ///
+  /// Workflow metadata propagation is handled by the runtime-provided
+  /// enqueuer implementation.
+  @override
+  Future<String> enqueue(
+    String name, {
+    Map<String, Object?> args = const {},
+    Map<String, Object?> headers = const {},
+    Map<String, Object?> meta = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    TaskEnqueueOptions? enqueueOptions,
+  }) async {
+    final delegate = enqueuer;
+    if (delegate == null) {
+      throw StateError('FlowContext has no enqueuer configured');
+    }
+    return delegate.enqueue(
+      name,
+      args: args,
+      headers: headers,
+      meta: meta,
+      options: options,
+      notBefore: notBefore,
+      enqueueOptions: enqueueOptions,
+    );
+  }
+
+  /// Enqueues a typed task call using the workflow-scoped enqueuer.
+  @override
+  Future<String> enqueueCall(
+    TaskCall call, {
+    TaskEnqueueOptions? enqueueOptions,
+  }) async {
+    final delegate = enqueuer;
+    if (delegate == null) {
+      throw StateError('FlowContext has no enqueuer configured');
+    }
+    return delegate.enqueueCall(call, enqueueOptions: enqueueOptions);
+  }
+
+  /// Starts a typed child workflow using the workflow-scoped caller.
+  @override
+  Future<String> startWorkflowRef<T, TParams>(
+    WorkflowRef<T, TParams> definition,
+    TParams params, {
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+  }) async {
+    final caller = workflows;
+    if (caller == null) {
+      throw StateError('FlowContext has no workflow caller configured');
+    }
+    return caller.startWorkflowRef(
+      definition,
+      params,
+      parentRunId: parentRunId,
+      ttl: ttl,
+      cancellationPolicy: cancellationPolicy,
+    );
+  }
+
+  /// Starts a prebuilt child workflow call using the workflow-scoped caller.
+  @override
+  Future<String> startWorkflowCall(
+    WorkflowStartCall call,
+  ) async {
+    final caller = workflows;
+    if (caller == null) {
+      throw StateError('FlowContext has no workflow caller configured');
+    }
+    return caller.startWorkflowCall(call);
+  }
+
+  /// Waits for a typed child workflow run using the workflow-scoped caller.
+  @override
+  Future<WorkflowResult<T>?>
+  waitForWorkflowRef<T, TParams>(
+    String runId,
+    WorkflowRef<T, TParams> definition, {
+    Duration pollInterval = const Duration(milliseconds: 100),
+    Duration? timeout,
+  }) async {
+    final caller = workflows;
+    if (caller == null) {
+      throw StateError('FlowContext has no workflow caller configured');
+    }
+    return caller.waitForWorkflowRef(
+      runId,
+      definition,
+      pollInterval: pollInterval,
+      timeout: timeout,
+    );
+  }
+}
+
+Map<String, Object?> _encodeFlowContextValue<T>(
+  String name,
+  T value, {
+  PayloadCodec<T>? codec,
+}) {
+  final payload = codec == null ? value : codec.encode(value);
+  if (payload is Map<String, Object?>) {
+    return Map<String, Object?>.from(payload);
+  }
+  if (payload is Map) {
+    final normalized = <String, Object?>{};
+    for (final entry in payload.entries) {
+      final key = entry.key;
+      if (key is! String) {
+        throw StateError(
+          'Task payload for $name must use string keys, got '
+          '${key.runtimeType}.',
+        );
+      }
+      normalized[key] = entry.value;
+    }
+    return normalized;
+  }
+  throw StateError(
+    'Task payload for $name must encode to Map<String, Object?>, got '
+    '${payload.runtimeType}.',
+  );
+}
diff --git a/packages/stem/lib/src/workflow/core/flow_step.dart b/packages/stem/lib/src/workflow/core/flow_step.dart
index 9413d0bd..d7194624 100644
--- a/packages/stem/lib/src/workflow/core/flow_step.dart
+++ b/packages/stem/lib/src/workflow/core/flow_step.dart
@@ -50,6 +50,19 @@
       taskNames = List.unmodifiable(taskNames),
       metadata = metadata == null ? null : Map.unmodifiable(metadata);
 
+  /// Rehydrates a flow step from serialized JSON.
+  factory FlowStep.fromJson(Map<String, Object?> json) {
+    return FlowStep(
+      name: json['name']?.toString() ?? '',
+      title: json['title']?.toString(),
+      kind: _kindFromJson(json['kind']),
+      taskNames: (json['taskNames'] as List?)?.cast() ?? const [],
+      autoVersion: json['autoVersion'] == true,
+      metadata: (json['metadata'] as Map?)?.cast(),
+      handler: (_) async {},
+    );
+  }
+
   /// Creates a step definition backed by a typed [valueCodec].
   static FlowStep typed({
     required String name,
@@ -74,19 +87,6 @@
     );
   }
 
-  /// Rehydrates a flow step from serialized JSON.
-  factory FlowStep.fromJson(Map<String, Object?> json) {
-    return FlowStep(
-      name: json['name']?.toString() ?? '',
-      title: json['title']?.toString(),
-      kind: _kindFromJson(json['kind']),
-      taskNames: (json['taskNames'] as List?)?.cast() ?? const [],
-      autoVersion: json['autoVersion'] == true,
-      metadata: (json['metadata'] as Map?)?.cast(),
-      handler: (_) async {},
-    );
-  }
-
   /// Step name used for checkpoints and scheduling.
   final String name;
 
@@ -182,6 +182,69 @@
   factory FlowStepControl.continueRun() =>
       FlowStepControl._(FlowControlType.continueRun);
 
+  /// Suspend the run until [duration] elapses with a DTO payload.
+  static FlowStepControl sleepJson<T>(
+    Duration duration,
+    T value, {
+    String? typeName,
+  }) => FlowStepControl.sleep(
+    duration,
+    data: Map<String, Object?>.from(
+      PayloadCodec.encodeJsonMap(value, typeName: typeName),
+    ),
+  );
+
+  /// Suspend the run until [duration] elapses with a versioned DTO payload.
+  static FlowStepControl sleepVersionedJson<T>(
+    Duration duration,
+    T value, {
+    required int version,
+    String? typeName,
+  }) => FlowStepControl.sleep(
+    duration,
+    data: Map<String, Object?>.from(
+      PayloadCodec.encodeVersionedJsonMap(
+        value,
+        version: version,
+        typeName: typeName,
+      ),
+    ),
+  );
+
+  /// Suspend the run until an event with [topic] arrives with a DTO payload.
+  static FlowStepControl awaitTopicJson<T>(
+    String topic,
+    T value, {
+    DateTime? deadline,
+    String? typeName,
+  }) => FlowStepControl.awaitTopic(
+    topic,
+    deadline: deadline,
+    data: Map<String, Object?>.from(
+      PayloadCodec.encodeJsonMap(value, typeName: typeName),
+    ),
+  );
+
+  /// Suspend the run until an event with [topic] arrives with a versioned DTO
+  /// payload.
+  static FlowStepControl awaitTopicVersionedJson<T>(
+    String topic,
+    T value, {
+    required int version,
+    DateTime? deadline,
+    String? typeName,
+  }) => FlowStepControl.awaitTopic(
+    topic,
+    deadline: deadline,
+    data: Map<String, Object?>.from(
+      PayloadCodec.encodeVersionedJsonMap(
+        value,
+        version: version,
+        typeName: typeName,
+      ),
+    ),
+  );
+
   /// Control type emitted by the step.
   final FlowControlType type;
 
@@ -196,6 +259,44 @@
   /// Additional data to persist with the suspension.
   final Map<String, Object?>? data;
+
+  /// Decodes the suspension metadata with [codec], when present.
+  TData? dataAs<TData>({required PayloadCodec<TData> codec}) {
+    final stored = data;
+    if (stored == null) return null;
+    return codec.decode(stored);
+  }
+
+  /// Decodes the suspension metadata with a JSON decoder, when present.
+  TData? dataJson<TData>({
+    required TData Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    final stored = data;
+    if (stored == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
+  /// Decodes the suspension metadata with a version-aware JSON decoder, when
+  /// present.
+  TData? dataVersionedJson<TData>({
+    required int version,
+    required TData Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final stored = data;
+    if (stored == null) return null;
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(stored);
+  }
 }
 
 /// Enumerates the suspension control types.
diff --git a/packages/stem/lib/src/workflow/core/run_state.dart b/packages/stem/lib/src/workflow/core/run_state.dart
index 6b31b49c..777157d1 100644
--- a/packages/stem/lib/src/workflow/core/run_state.dart
+++ b/packages/stem/lib/src/workflow/core/run_state.dart
@@ -1,4 +1,5 @@
 import 'package:stem/src/core/clock.dart';
+import 'package:stem/src/core/payload_codec.dart';
 import 'package:stem/src/workflow/core/workflow_cancellation_policy.dart';
 import 'package:stem/src/workflow/core/workflow_runtime_metadata.dart';
 import 'package:stem/src/workflow/core/workflow_status.dart';
@@ -75,16 +76,88 @@ class RunState {
   Map<String, Object?> get workflowParams =>
       WorkflowRunRuntimeMetadata.stripFromParams(params);
 
+  /// Decodes the workflow params payload with [codec].
+  TParams paramsAs<TParams>({required PayloadCodec<TParams> codec}) {
+    return codec.decode(workflowParams);
+  }
+
+  /// Decodes the workflow params payload with a JSON decoder.
+  TParams paramsJson<TParams>({
+    required TParams Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(workflowParams);
+  }
+
+  /// Decodes the workflow params payload with a version-aware JSON decoder.
+  TParams paramsVersionedJson<TParams>({
+    required int version,
+    required TParams Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(workflowParams);
+  }
+
   /// Run-scoped runtime metadata.
   WorkflowRunRuntimeMetadata get runtimeMetadata =>
       WorkflowRunRuntimeMetadata.fromParams(params);
 
+  /// Parent workflow run identifier, if this run was started as a child.
+  String? get parentRunId =>
+      params[workflowParentRunIdParamKey]?.toString();
+
   /// Timestamp when the workflow run was created.
   final DateTime createdAt;
 
   /// Final result payload when the run completes.
   final Object? result;
 
+  /// Decodes the final result payload with [codec].
+  TResult? resultAs<TResult>({required PayloadCodec<TResult> codec}) {
+    final stored = result;
+    if (stored == null) return null;
+    return codec.decode(stored);
+  }
+
+  /// Decodes the final result payload with a JSON decoder.
+  TResult? resultJson<TResult>({
+    required TResult Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    final stored = result;
+    if (stored == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
+  /// Decodes the final result payload with a version-aware JSON decoder.
+  TResult? resultVersionedJson<TResult>({
+    required int version,
+    required TResult Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final stored = result;
+    if (stored == null) return null;
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
   /// Topic that the run is currently waiting on, if any.
   final String? waitTopic;
 
@@ -94,6 +167,44 @@ class RunState {
   /// Last error payload recorded for the run.
   final Map<String, Object?>? lastError;
 
+  /// Decodes the last error payload with [codec], when present.
+  TError? lastErrorAs<TError>({required PayloadCodec<TError> codec}) {
+    final payload = lastError;
+    if (payload == null) return null;
+    return codec.decode(payload);
+  }
+
+  /// Decodes the last error payload with a JSON decoder, when present.
+  TError? lastErrorJson<TError>({
+    required TError Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    final payload = lastError;
+    if (payload == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(payload);
+  }
+
+  /// Decodes the last error payload with a version-aware JSON decoder, when
+  /// present.
+  TError? lastErrorVersionedJson<TError>({
+    required int version,
+    required TError Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final payload = lastError;
+    if (payload == null) return null;
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(payload);
+  }
+
   /// Suspension metadata stored for the waiting step.
   final Map<String, Object?>? suspensionData;
 
@@ -112,6 +223,85 @@ class RunState {
   /// Metadata recorded when the run is cancelled (automatic or manual).
   final Map<String, Object?>? cancellationData;
 
+  /// Decodes the runtime metadata payload with [codec].
+  TRuntime runtimeAs<TRuntime>({required PayloadCodec<TRuntime> codec}) {
+    return codec.decode(runtimeMetadata.toJson());
+  }
+
+  /// Decodes the runtime metadata payload with a JSON decoder.
+  TRuntime runtimeJson<TRuntime>({
+    required TRuntime Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(runtimeMetadata.toJson());
+  }
+
+  /// Decodes the runtime metadata payload with a version-aware JSON decoder.
+  TRuntime runtimeVersionedJson<TRuntime>({
+    required int version,
+    required TRuntime Function(
+      Map<String, Object?> payload,
+      int version,
+    )
+    decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(runtimeMetadata.toJson());
+  }
+
+  /// Decodes the cancellation payload with [codec], when present.
+  TCancellation? cancellationDataAs<TCancellation>({
+    required PayloadCodec<TCancellation> codec,
+  }) {
+    final payload = cancellationData;
+    if (payload == null) return null;
+    return codec.decode(payload);
+  }
+
+  /// Decodes the cancellation payload with a JSON decoder, when present.
+  TCancellation? cancellationDataJson<TCancellation>({
+    required TCancellation Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    final payload = cancellationData;
+    if (payload == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(payload);
+  }
+
+  /// Decodes the cancellation payload with a version-aware JSON decoder, when
+  /// present.
+  TCancellation? cancellationDataVersionedJson<TCancellation>({
+    required int version,
+    required TCancellation Function(
+      Map<String, Object?> payload,
+      int version,
+    )
+    decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final payload = cancellationData;
+    if (payload == null) return null;
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(payload);
+  }
+
   static const _unset = Object();
 
   /// Whether the run is in a terminal state.
@@ -154,6 +344,50 @@ class RunState {
   /// Resume payload delivered to the suspended run, when present.
   Object? get suspensionPayload => suspensionData?['payload'];
 
+  /// Decodes the suspension payload with [codec], when present.
+  TPayload? suspensionPayloadAs<TPayload>({
+    required PayloadCodec<TPayload> codec,
+  }) {
+    final stored = suspensionPayload;
+    if (stored == null) return null;
+    return codec.decode(stored);
+  }
+
+  /// Decodes the suspension payload with a JSON decoder, when present.
+  TPayload? suspensionPayloadJson<TPayload>({
+    required TPayload Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    final stored = suspensionPayload;
+    if (stored == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
+  /// Decodes the suspension payload with a version-aware JSON decoder, when
+  /// present.
+  TPayload? suspensionPayloadVersionedJson<TPayload>({
+    required int version,
+    required TPayload Function(
+      Map<String, Object?> payload,
+      int version,
+    )
+    decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final stored = suspensionPayload;
+    if (stored == null) return null;
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
   /// Timestamp when a matching event was delivered for this suspension.
   DateTime? get suspensionDeliveredAt =>
       _dateFromJson(suspensionData?['deliveredAt']);
diff --git a/packages/stem/lib/src/workflow/core/workflow_checkpoint.dart b/packages/stem/lib/src/workflow/core/workflow_checkpoint.dart
new file mode 100644
index 00000000..dfdeadcf
--- /dev/null
+++ b/packages/stem/lib/src/workflow/core/workflow_checkpoint.dart
@@ -0,0 +1,118 @@
+import 'package:stem/src/core/payload_codec.dart';
+import 'package:stem/src/workflow/core/flow_step.dart';
+
+/// Declared script checkpoint metadata used for tooling and replay boundaries.
+///
+/// Unlike [FlowStep], a [WorkflowCheckpoint] does not define execution logic.
+/// Script workflows execute their `run(...)` body directly and use these
+/// declarations only for manifests, introspection, encoding, and replay
+/// metadata.
+class WorkflowCheckpoint {
+  /// Creates declared checkpoint metadata for a script workflow.
+  WorkflowCheckpoint({
+    required this.name,
+    this.autoVersion = false,
+    String? title,
+    Object? Function(Object? value)? valueEncoder,
+    Object? Function(Object? payload)? valueDecoder,
+    this.kind = WorkflowStepKind.task,
+    Iterable<String> taskNames = const [],
+    Map<String, Object?>? metadata,
+  }) : title = title ?? name,
+       _valueEncoder = valueEncoder,
+       _valueDecoder = valueDecoder,
+       taskNames = List.unmodifiable(taskNames),
+       metadata = metadata == null ? null : Map.unmodifiable(metadata);
+
+  /// Rehydrates declared checkpoint metadata from serialized JSON.
+  factory WorkflowCheckpoint.fromJson(Map<String, Object?> json) {
+    return WorkflowCheckpoint(
+      name: json['name']?.toString() ?? '',
+      title: json['title']?.toString(),
+      kind: _checkpointKindFromJson(json['kind']),
+      taskNames: (json['taskNames'] as List?)?.cast() ?? const [],
+      autoVersion: json['autoVersion'] == true,
+      metadata: (json['metadata'] as Map?)?.cast(),
+    );
+  }
+
+  /// Creates checkpoint metadata backed by a typed [valueCodec].
+  static WorkflowCheckpoint typed<T>({
+    required String name,
+    required PayloadCodec<T> valueCodec,
+    bool autoVersion = false,
+    String? title,
+    WorkflowStepKind kind = WorkflowStepKind.task,
+    Iterable<String> taskNames = const [],
+    Map<String, Object?>? metadata,
+  }) {
+    return WorkflowCheckpoint(
+      name: name,
+      autoVersion: autoVersion,
+      title: title,
+      valueEncoder: valueCodec.encodeDynamic,
+      valueDecoder: valueCodec.decodeDynamic,
+      kind: kind,
+      taskNames: taskNames,
+      metadata: metadata,
+    );
+  }
+
+  /// Checkpoint name used for persistence and replay.
+  final String name;
+
+  /// Human-friendly checkpoint title exposed for introspection.
+  final String title;
+
+  /// Checkpoint kind classification.
+  final WorkflowStepKind kind;
+
+  final Object? Function(Object? value)? _valueEncoder;
+  final Object? Function(Object? payload)? _valueDecoder;
+
+  /// Task names associated with this checkpoint.
+  final List<String> taskNames;
+
+  /// Optional metadata associated with the checkpoint.
+  final Map<String, Object?>? metadata;
+
+  /// Whether to auto-version repeated checkpoint executions.
+  final bool autoVersion;
+
+  /// Serializes checkpoint metadata for workflow introspection.
+  Map<String, Object?> toJson() {
+    return {
+      'name': name,
+      'title': title,
+      'kind': kind.name,
+      'taskNames': taskNames,
+      'autoVersion': autoVersion,
+      if (metadata != null) 'metadata': metadata,
+    };
+  }
+
+  /// Encodes a checkpoint value before it is persisted.
+  Object? encodeValue(Object? value) {
+    if (value == null) return null;
+    final encoder = _valueEncoder;
+    if (encoder == null) return value;
+    return encoder(value);
+  }
+
+  /// Decodes a persisted checkpoint value back into the author-facing type.
+  Object? decodeValue(Object? payload) {
+    if (payload == null) return null;
+    final decoder = _valueDecoder;
+    if (decoder == null) return payload;
+    return decoder(payload);
+  }
+}
+
+WorkflowStepKind _checkpointKindFromJson(Object? value) {
+  final raw = value?.toString();
+  if (raw == null || raw.isEmpty) return WorkflowStepKind.task;
+  return WorkflowStepKind.values.firstWhere(
+    (kind) => kind.name == raw,
+    orElse: () => WorkflowStepKind.task,
+  );
+}
diff --git a/packages/stem/lib/src/workflow/core/workflow_definition.dart b/packages/stem/lib/src/workflow/core/workflow_definition.dart
index 814d422c..fbd5f5ec 100644
--- a/packages/stem/lib/src/workflow/core/workflow_definition.dart
+++ b/packages/stem/lib/src/workflow/core/workflow_definition.dart
@@ -60,6 +60,8 @@ import 'package:stem/src/core/payload_codec.dart';
 import 'package:stem/src/workflow/core/flow.dart' show Flow;
 import 'package:stem/src/workflow/core/flow_context.dart';
 import 'package:stem/src/workflow/core/flow_step.dart';
+import 'package:stem/src/workflow/core/workflow_checkpoint.dart';
+import 'package:stem/src/workflow/core/workflow_ref.dart';
 import 'package:stem/src/workflow/core/workflow_script_context.dart';
 import 'package:stem/src/workflow/workflow.dart' show Flow;
 import 'package:stem/stem.dart' show Flow;
@@ -87,6 +89,7 @@ class WorkflowDefinition {
     required this.name,
     required WorkflowDefinitionKind kind,
     required List<FlowStep> steps,
+    List<WorkflowCheckpoint> checkpoints = const [],
     List edges = const [],
     this.version,
     this.description,
@@ -96,6 +99,7 @@ class WorkflowDefinition {
     Object? Function(Object? payload)? resultDecoder,
   }) : _kind = kind,
       _steps = steps,
+      _checkpoints = checkpoints,
       _edges = edges,
       _resultEncoder = resultEncoder,
@@ -105,10 +109,19 @@ class WorkflowDefinition {
   factory WorkflowDefinition.fromJson(Map<String, Object?> json) {
     final kind = _kindFromJson(json['kind']);
     final stepsJson = (json['steps'] as List?) ?? const [];
-    final steps = stepsJson
-        .whereType<Map<String, Object?>>()
-        .map(FlowStep.fromJson)
-        .toList();
+    final steps = kind == WorkflowDefinitionKind.flow
+        ? stepsJson
+              .whereType<Map<String, Object?>>()
+              .map(FlowStep.fromJson)
+              .toList()
+        : <FlowStep>[];
+    final checkpointsJson = (json['checkpoints'] as List?) ?? stepsJson;
+    final checkpoints = kind == WorkflowDefinitionKind.script
+        ? checkpointsJson
+              .whereType<Map<String, Object?>>()
+              .map(WorkflowCheckpoint.fromJson)
+              .toList()
+        : <WorkflowCheckpoint>[];
     final edgesJson = (json['edges'] as List?) ?? const [];
     final edges = edgesJson
         .whereType<Map<String, Object?>>()
         .map(
@@ -119,6 +132,7 @@ class WorkflowDefinition {
       name: json['name']?.toString() ?? '',
       kind: kind,
       steps: steps,
+      checkpoints: checkpoints,
       edges: edges,
       version: json['version']?.toString(),
       description: json['description']?.toString(),
@@ -129,6 +143,7 @@ class WorkflowDefinition {
       name: json['name']?.toString() ?? '',
       kind: kind,
       steps: steps,
+      checkpoints: checkpoints,
      edges: edges,
       version: json['version']?.toString(),
       description: json['description']?.toString(),
@@ -144,7 +159,13 @@ class WorkflowDefinition {
     String? description,
     Map<String, Object?>? metadata,
     PayloadCodec<T>? resultCodec,
+    T Function(Map<String, Object?> payload)? decodeResultJson,
+    String? resultTypeName,
   }) {
+    assert(
+      resultCodec == null || decodeResultJson == null,
+      'Specify either resultCodec or decodeResultJson, not both.',
+    );
     final steps = <FlowStep>[];
     build(FlowBuilder(steps));
     final edges = [];
@@ -153,13 +174,17 @@ class WorkflowDefinition {
     }
     Object? Function(Object?)? resultEncoder;
     Object? Function(Object?)? resultDecoder;
-    if (resultCodec != null) {
-      resultEncoder = (Object? value) {
-        return resultCodec.encodeDynamic(value);
-      };
-      resultDecoder = (Object? payload) {
-        return resultCodec.decodeDynamic(payload);
-      };
+    final resolvedResultCodec =
+        resultCodec ??
+        (decodeResultJson == null
+            ? null
+            : PayloadCodec.json(
+                decode: decodeResultJson,
+                typeName: resultTypeName ?? '$T',
+              ));
+    if (resolvedResultCodec != null) {
+      resultEncoder = resolvedResultCodec.encodeDynamic;
+      resultDecoder = resolvedResultCodec.decodeDynamic;
     }
     return WorkflowDefinition._(
       name: name,
@@ -174,32 +199,141 @@ class WorkflowDefinition {
     );
   }
 
+  /// Creates a flow-based workflow definition whose final result uses a custom
+  /// payload codec.
+  factory WorkflowDefinition.flowCodec({
+    required String name,
+    required void Function(FlowBuilder builder) build,
+    required PayloadCodec<T> resultCodec,
+    String? version,
+    String? description,
+    Map<String, Object?>? metadata,
+  }) {
+    return WorkflowDefinition.flow(
+      name: name,
+      build: build,
+      version: version,
+      description: description,
+      metadata: metadata,
+      resultCodec: resultCodec,
+    );
+  }
+
+  /// Creates a flow-based workflow definition whose final result is a DTO
+  /// backed by a JSON payload.
+  factory WorkflowDefinition.flowJson({
+    required String name,
+    required void Function(FlowBuilder builder) build,
+    required T Function(Map<String, Object?> payload) decodeResult,
+    String? version,
+    String? description,
+    Map<String, Object?>? metadata,
+    String? resultTypeName,
+  }) {
+    return WorkflowDefinition.flow(
+      name: name,
+      build: build,
+      version: version,
+      description: description,
+      metadata: metadata,
+      decodeResultJson: decodeResult,
+      resultTypeName: resultTypeName,
+    );
+  }
+
+  /// Creates a flow-based workflow definition whose final result is a
+  /// versioned DTO-backed JSON payload.
+ factory WorkflowDefinition.flowVersionedJson({ + required String name, + required void Function(FlowBuilder builder) build, + required int version, + required T Function(Map payload, int version) decodeResult, + String? workflowVersion, + String? description, + Map? metadata, + int? defaultDecodeVersion, + String? resultTypeName, + }) { + return WorkflowDefinition.flow( + name: name, + build: build, + version: workflowVersion, + description: description, + metadata: metadata, + resultCodec: PayloadCodec.versionedJson( + version: version, + decode: decodeResult, + defaultDecodeVersion: defaultDecodeVersion, + typeName: resultTypeName ?? '$T', + ), + ); + } + + /// Creates a flow-based workflow definition whose final result is a + /// versioned custom map payload. + factory WorkflowDefinition.flowVersionedMap({ + required String name, + required void Function(FlowBuilder builder) build, + required Object? Function(T value) encodeResult, + required int version, + required T Function(Map payload, int version) decodeResult, + String? workflowVersion, + String? description, + Map? metadata, + int? defaultDecodeVersion, + String? resultTypeName, + }) { + return WorkflowDefinition.flow( + name: name, + build: build, + version: workflowVersion, + description: description, + metadata: metadata, + resultCodec: PayloadCodec.versionedMap( + encode: encodeResult, + version: version, + decode: decodeResult, + defaultDecodeVersion: defaultDecodeVersion, + typeName: resultTypeName ?? '$T', + ), + ); + } + /// Creates a script-based workflow definition. factory WorkflowDefinition.script({ required String name, required WorkflowScriptBody run, - Iterable steps = const [], - Iterable checkpoints = const [], + Iterable checkpoints = const [], String? version, String? description, Map? metadata, PayloadCodec? resultCodec, + T Function(Map payload)? decodeResultJson, + String? resultTypeName, }) { - final declaredCheckpoints = checkpoints.isNotEmpty ? 
checkpoints : steps; + assert( + resultCodec == null || decodeResultJson == null, + 'Specify either resultCodec or decodeResultJson, not both.', + ); Object? Function(Object?)? resultEncoder; Object? Function(Object?)? resultDecoder; - if (resultCodec != null) { - resultEncoder = (Object? value) { - return resultCodec.encodeDynamic(value); - }; - resultDecoder = (Object? payload) { - return resultCodec.decodeDynamic(payload); - }; + final resolvedResultCodec = + resultCodec ?? + (decodeResultJson == null + ? null + : PayloadCodec.json( + decode: decodeResultJson, + typeName: resultTypeName ?? '$T', + )); + if (resolvedResultCodec != null) { + resultEncoder = resolvedResultCodec.encodeDynamic; + resultDecoder = resolvedResultCodec.decodeDynamic; } return WorkflowDefinition._( name: name, kind: WorkflowDefinitionKind.script, - steps: List.unmodifiable(declaredCheckpoints), + steps: const [], + checkpoints: List.unmodifiable(checkpoints), version: version, description: description, metadata: metadata, @@ -209,10 +343,119 @@ class WorkflowDefinition { ); } + /// Creates a script-based workflow definition whose final result uses a + /// custom payload codec. + factory WorkflowDefinition.scriptCodec({ + required String name, + required WorkflowScriptBody run, + required PayloadCodec resultCodec, + Iterable checkpoints = const [], + String? version, + String? description, + Map? metadata, + }) { + return WorkflowDefinition.script( + name: name, + run: run, + checkpoints: checkpoints, + version: version, + description: description, + metadata: metadata, + resultCodec: resultCodec, + ); + } + + /// Creates a script-based workflow definition whose final result is a DTO + /// backed by a JSON payload. + factory WorkflowDefinition.scriptJson({ + required String name, + required WorkflowScriptBody run, + required T Function(Map payload) decodeResult, + Iterable checkpoints = const [], + String? version, + String? description, + Map? metadata, + String? 
resultTypeName, + }) { + return WorkflowDefinition.script( + name: name, + run: run, + checkpoints: checkpoints, + version: version, + description: description, + metadata: metadata, + decodeResultJson: decodeResult, + resultTypeName: resultTypeName, + ); + } + + /// Creates a script-based workflow definition whose final result is a + /// versioned custom map payload. + factory WorkflowDefinition.scriptVersionedMap({ + required String name, + required WorkflowScriptBody run, + required Object? Function(T value) encodeResult, + required int version, + required T Function(Map payload, int version) decodeResult, + Iterable checkpoints = const [], + String? workflowVersion, + String? description, + Map? metadata, + int? defaultDecodeVersion, + String? resultTypeName, + }) { + return WorkflowDefinition.script( + name: name, + run: run, + checkpoints: checkpoints, + version: workflowVersion, + description: description, + metadata: metadata, + resultCodec: PayloadCodec.versionedMap( + encode: encodeResult, + version: version, + decode: decodeResult, + defaultDecodeVersion: defaultDecodeVersion, + typeName: resultTypeName ?? '$T', + ), + ); + } + + /// Creates a script-based workflow definition whose final result is a + /// versioned DTO-backed JSON payload. + factory WorkflowDefinition.scriptVersionedJson({ + required String name, + required WorkflowScriptBody run, + required int version, + required T Function(Map payload, int version) decodeResult, + Iterable checkpoints = const [], + String? workflowVersion, + String? description, + Map? metadata, + int? defaultDecodeVersion, + String? resultTypeName, + }) { + return WorkflowDefinition.script( + name: name, + run: run, + checkpoints: checkpoints, + version: workflowVersion, + description: description, + metadata: metadata, + resultCodec: PayloadCodec.versionedJson( + version: version, + decode: decodeResult, + defaultDecodeVersion: defaultDecodeVersion, + typeName: resultTypeName ?? 
'$T', + ), + ); + } + /// Workflow name used for registration and scheduling. final String name; final WorkflowDefinitionKind _kind; final List _steps; + final List _checkpoints; final List _edges; /// Optional version identifier for the workflow definition. @@ -233,13 +476,16 @@ class WorkflowDefinition { /// Ordered list of steps for flow-based workflows. List get steps => List.unmodifiable(_steps); + /// Declared checkpoints for script-based workflows. + List get checkpoints => List.unmodifiable(_checkpoints); + /// Directed edges describing the workflow graph. List get edges => List.unmodifiable(_edges); /// Whether this definition represents a script-based workflow. bool get isScript => _kind == WorkflowDefinitionKind.script; - /// Looks up a declared step/checkpoint by its base name. + /// Looks up a declared flow step by its base name. FlowStep? stepByName(String name) { for (final step in _steps) { if (step.name == name) { @@ -249,6 +495,16 @@ class WorkflowDefinition { return null; } + /// Looks up declared script checkpoint metadata by its base name. + WorkflowCheckpoint? checkpointByName(String name) { + for (final checkpoint in _checkpoints) { + if (checkpoint.name == name) { + return checkpoint; + } + } + return null; + } + /// Encodes a final workflow result before it is persisted. Object? encodeResult(Object? value) { if (value == null) return null; @@ -265,6 +521,155 @@ class WorkflowDefinition { return decoder(payload); } + /// Builds a typed [WorkflowRef] from this definition without repeating the + /// registered workflow name. + WorkflowRef ref({ + required Map Function(TParams params) encodeParams, + }) { + return WorkflowRef( + name: name, + encodeParams: encodeParams, + decodeResult: (payload) => decodeResult(payload) as T, + ); + } + + /// Builds a typed [WorkflowRef] backed by a DTO [paramsCodec]. 
+ WorkflowRef refCodec({ + required PayloadCodec paramsCodec, + }) { + return WorkflowRef.codec( + name: name, + paramsCodec: paramsCodec, + decodeResult: (payload) => decodeResult(payload) as T, + ); + } + + /// Builds a typed [WorkflowRef] for DTO params that already expose + /// `toJson()`. + WorkflowRef refJson({ + T Function(Map payload)? decodeResultJson, + T Function(Map payload, int version)? + decodeResultVersionedJson, + int? defaultDecodeVersion, + String? paramsTypeName, + String? resultTypeName, + }) { + return WorkflowRef.json( + name: name, + decodeResultJson: decodeResultJson, + decodeResultVersionedJson: decodeResultVersionedJson, + defaultDecodeVersion: defaultDecodeVersion, + decodeResult: + decodeResultJson == null && decodeResultVersionedJson == null + ? (payload) => decodeResult(payload) as T + : null, + paramsTypeName: paramsTypeName, + resultTypeName: resultTypeName, + ); + } + + /// Builds a typed [WorkflowRef] for DTO params that already expose + /// `toJson()` and persist a schema [version] beside the payload. + WorkflowRef refVersionedJson({ + required int version, + T Function(Map payload)? decodeResultJson, + T Function(Map payload, int version)? + decodeResultVersionedJson, + int? defaultDecodeVersion, + String? paramsTypeName, + String? resultTypeName, + }) { + return WorkflowRef.versionedJson( + name: name, + version: version, + decodeResultJson: decodeResultJson, + decodeResultVersionedJson: decodeResultVersionedJson, + defaultDecodeVersion: defaultDecodeVersion, + decodeResult: + decodeResultJson == null && decodeResultVersionedJson == null + ? (payload) => decodeResult(payload) as T + : null, + paramsTypeName: paramsTypeName, + resultTypeName: resultTypeName, + ); + } + + /// Builds a typed [WorkflowRef] for DTO params that already expose + /// `toJson()` and decode versioned results through a reusable registry. 
+ WorkflowRef refVersionedJsonRegistry({ + required int version, + required PayloadVersionRegistry resultRegistry, + int? defaultDecodeVersion, + String? paramsTypeName, + String? resultTypeName, + }) { + return WorkflowRef.versionedJsonRegistry( + name: name, + version: version, + resultRegistry: resultRegistry, + defaultDecodeVersion: defaultDecodeVersion, + paramsTypeName: paramsTypeName, + resultTypeName: resultTypeName, + ); + } + + /// Builds a typed [WorkflowRef] for custom map params that persist a schema + /// [version] beside the payload. + WorkflowRef refVersionedMap({ + required Object? Function(TParams params) encodeParams, + required int version, + T Function(Map payload)? decodeResultJson, + T Function(Map payload, int version)? + decodeResultVersionedJson, + int? defaultDecodeVersion, + String? paramsTypeName, + String? resultTypeName, + }) { + return WorkflowRef.versionedMap( + name: name, + encodeParams: encodeParams, + version: version, + decodeResultJson: decodeResultJson, + decodeResultVersionedJson: decodeResultVersionedJson, + defaultDecodeVersion: defaultDecodeVersion, + decodeResult: + decodeResultJson == null && decodeResultVersionedJson == null + ? (payload) => decodeResult(payload) as T + : null, + paramsTypeName: paramsTypeName, + resultTypeName: resultTypeName, + ); + } + + /// Builds a typed [WorkflowRef] for custom map params that persist a schema + /// [version] and decode versioned results through a reusable registry. + WorkflowRef refVersionedMapRegistry({ + required Object? Function(TParams params) encodeParams, + required int version, + required PayloadVersionRegistry resultRegistry, + int? defaultDecodeVersion, + String? paramsTypeName, + String? 
resultTypeName, + }) { + return WorkflowRef.versionedMapRegistry( + name: name, + encodeParams: encodeParams, + version: version, + resultRegistry: resultRegistry, + defaultDecodeVersion: defaultDecodeVersion, + paramsTypeName: paramsTypeName, + resultTypeName: resultTypeName, + ); + } + + /// Builds a typed [NoArgsWorkflowRef] from this definition. + NoArgsWorkflowRef ref0() { + return NoArgsWorkflowRef( + name: name, + decodeResult: (payload) => decodeResult(payload) as T, + ); + } + /// Stable identifier derived from immutable workflow definition fields. String get stableId { final basis = StringBuffer() @@ -274,25 +679,43 @@ class WorkflowDefinition { ..write('|') ..write(version ?? '') ..write('|'); - for (final step in _steps) { - basis - ..write(step.name) - ..write(':') - ..write(step.kind.name) - ..write(':') - ..write(step.autoVersion ? '1' : '0') - ..write('|'); + if (isScript) { + for (final checkpoint in _checkpoints) { + basis + ..write(checkpoint.name) + ..write(':') + ..write(checkpoint.kind.name) + ..write(':') + ..write(checkpoint.autoVersion ? '1' : '0') + ..write('|'); + } + } else { + for (final step in _steps) { + basis + ..write(step.name) + ..write(':') + ..write(step.kind.name) + ..write(':') + ..write(step.autoVersion ? '1' : '0') + ..write('|'); + } } return _stableHexDigest(basis.toString()); } /// Serialize the workflow definition for introspection. 
   Map<String, Object?> toJson() {
-    final steps = <Map<String, Object?>>[];
+    final serializedSteps = <Map<String, Object?>>[];
     for (var i = 0; i < _steps.length; i += 1) {
       final step = _steps[i].toJson();
       step['position'] = i;
-      steps.add(step);
+      serializedSteps.add(step);
+    }
+    final serializedCheckpoints = <Map<String, Object?>>[];
+    for (var i = 0; i < _checkpoints.length; i += 1) {
+      final checkpoint = _checkpoints[i].toJson();
+      checkpoint['position'] = i;
+      serializedCheckpoints.add(checkpoint);
     }
     return {
       'name': name,
@@ -300,7 +723,8 @@ class WorkflowDefinition {
       if (version != null) 'version': version,
       if (description != null) 'description': description,
       if (metadata != null) 'metadata': metadata,
-      'steps': steps,
+      if (_steps.isNotEmpty) 'steps': serializedSteps,
+      if (_checkpoints.isNotEmpty) 'checkpoints': serializedCheckpoints,
       'edges': _edges.map((edge) => edge.toJson()).toList(),
     };
   }
@@ -308,8 +732,10 @@ class WorkflowDefinition {
 
 String _stableHexDigest(String input) {
   final bytes = utf8.encode(input);
+  // FNV-1a uses this exact 64-bit offset basis; keep the literal stable.
+  // ignore: avoid_js_rounded_ints
   var hash = 0xcbf29ce484222325;
-  const prime = 0x00000100000001B3;
+  const prime = 0x100000001b3;
   for (final value in bytes) {
     hash ^= value;
     hash = (hash * prime) & 0xFFFFFFFFFFFFFFFF;
diff --git a/packages/stem/lib/src/workflow/core/workflow_event_ref.dart b/packages/stem/lib/src/workflow/core/workflow_event_ref.dart
new file mode 100644
index 00000000..00a41f06
--- /dev/null
+++ b/packages/stem/lib/src/workflow/core/workflow_event_ref.dart
@@ -0,0 +1,155 @@
+import 'package:stem/src/core/payload_codec.dart';
+
+/// Shared typed workflow-event dispatch surface used by apps and runtimes.
+abstract interface class WorkflowEventEmitter {
+  /// Emits a typed external event that serializes onto the durable map-based
+  /// workflow event transport.
+  Future<void> emitValue<T>(
+    String topic,
+    T value, {
+    PayloadCodec<T>? codec,
+  });
+
+  /// Emits a typed external event using a [WorkflowEventRef].
+  Future<void> emitEvent<T>(WorkflowEventRef<T> event, T value);
+}
+
+/// Typed reference to a workflow resume event topic.
+///
+/// This bundles the durable topic name with an optional payload codec so
+/// callers do not need to repeat a raw topic string and separate codec across
+/// wait and emit sites.
+class WorkflowEventRef<T> {
+  /// Creates a typed workflow event reference.
+  const WorkflowEventRef({
+    required this.topic,
+    this.codec,
+  });
+
+  /// Creates a typed workflow event reference for DTO payloads that already
+  /// expose `toJson()` and `Type.fromJson(...)`.
+  factory WorkflowEventRef.codec({
+    required String topic,
+    required PayloadCodec<T> codec,
+  }) {
+    return WorkflowEventRef(
+      topic: topic,
+      codec: codec,
+    );
+  }
+
+  /// Creates a typed workflow event reference for DTO payloads that already
+  /// expose `toJson()` and `Type.fromJson(...)`.
+  factory WorkflowEventRef.json({
+    required String topic,
+    required T Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    return WorkflowEventRef.codec(
+      topic: topic,
+      codec: PayloadCodec.json(
+        decode: decode,
+        typeName: typeName,
+      ),
+    );
+  }
+
+  /// Creates a typed workflow event reference for DTO payloads that already
+  /// expose `toJson()` and persist a schema [version] beside the payload.
+  factory WorkflowEventRef.versionedJson({
+    required String topic,
+    required int version,
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return WorkflowEventRef.codec(
+      topic: topic,
+      codec: PayloadCodec.versionedJson(
+        version: version,
+        decode: decode,
+        defaultDecodeVersion: defaultDecodeVersion,
+        typeName: typeName,
+      ),
+    );
+  }
+
+  /// Creates a typed workflow event reference backed by a reusable version
+  /// registry.
+  factory WorkflowEventRef.versionedJsonRegistry({
+    required String topic,
+    required int version,
+    required PayloadVersionRegistry<T> registry,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return WorkflowEventRef.codec(
+      topic: topic,
+      codec: PayloadCodec.versionedJsonRegistry(
+        version: version,
+        registry: registry,
+        defaultDecodeVersion: defaultDecodeVersion,
+        typeName: typeName,
+      ),
+    );
+  }
+
+  /// Creates a typed workflow event reference for custom map payloads that
+  /// persist a schema [version] beside the payload.
+  factory WorkflowEventRef.versionedMap({
+    required String topic,
+    required Object? Function(T value) encode,
+    required int version,
+    required T Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return WorkflowEventRef.codec(
+      topic: topic,
+      codec: PayloadCodec.versionedMap(
+        encode: encode,
+        version: version,
+        decode: decode,
+        defaultDecodeVersion: defaultDecodeVersion,
+        typeName: typeName,
+      ),
+    );
+  }
+
+  /// Creates a typed workflow event reference for custom map payloads backed
+  /// by a reusable version registry.
+  factory WorkflowEventRef.versionedMapRegistry({
+    required String topic,
+    required Object? Function(T value) encode,
+    required int version,
+    required PayloadVersionRegistry<T> registry,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return WorkflowEventRef.codec(
+      topic: topic,
+      codec: PayloadCodec.versionedMapRegistry(
+        encode: encode,
+        version: version,
+        registry: registry,
+        defaultDecodeVersion: defaultDecodeVersion,
+        typeName: typeName,
+      ),
+    );
+  }
+
+  /// Durable topic name used to suspend and resume workflow runs.
+  final String topic;
+
+  /// Optional codec for encoding and decoding event payloads.
+  final PayloadCodec<T>? codec;
+
+}
+
+/// Convenience helpers for dispatching typed workflow events.
+extension WorkflowEventRefExtension<T> on WorkflowEventRef<T> {
+  /// Emits this typed event with the provided [emitter].
+  Future<void> emit(WorkflowEventEmitter emitter, T value) {
+    return emitter.emitEvent(this, value);
+  }
+}
diff --git a/packages/stem/lib/src/workflow/core/workflow_execution_context.dart b/packages/stem/lib/src/workflow/core/workflow_execution_context.dart
new file mode 100644
index 00000000..b5e470a9
--- /dev/null
+++ b/packages/stem/lib/src/workflow/core/workflow_execution_context.dart
@@ -0,0 +1,412 @@
+import 'package:stem/src/core/contracts.dart';
+import 'package:stem/src/core/payload_codec.dart';
+import 'package:stem/src/core/payload_map.dart';
+import 'package:stem/src/workflow/core/workflow_ref.dart';
+import 'package:stem/src/workflow/core/workflow_resume_context.dart';
+
+/// Shared execution context surface for flow steps and script checkpoints.
+///
+/// This keeps the common workflow-authoring capabilities on one type:
+/// metadata about the current step/checkpoint, task enqueueing, child-workflow
+/// starts, and durable suspension helpers.
+abstract interface class WorkflowExecutionContext
+    implements TaskEnqueuer, WorkflowCaller, WorkflowResumeContext {
+  /// Name of the workflow currently executing.
+  String get workflow;
+
+  /// Identifier for the workflow run.
+  String get runId;
+
+  /// Name of the current step or checkpoint.
+  String get stepName;
+
+  /// Zero-based step or checkpoint index.
+  int get stepIndex;
+
+  /// Iteration count for looped steps or checkpoints.
+  int get iteration;
+
+  /// Parameters provided when the workflow started.
+  Map<String, Object?> get params;
+
+  /// Result of the previous step or checkpoint, if any.
+  Object? get previousResult;
+
+  /// Returns a stable idempotency key derived from workflow/run/step state.
+  String idempotencyKey([String? scope]);
+
+  /// Optional enqueuer for scheduling tasks with workflow metadata.
+  TaskEnqueuer? get enqueuer;
+
+  /// Optional typed workflow caller for spawning child workflows.
+  WorkflowCaller? get workflows;
+}
+
+/// Typed read helpers for workflow start parameters.
+extension WorkflowExecutionContextParams on WorkflowExecutionContext { + /// Decodes the full workflow start-parameter payload through [codec]. + T paramsAs({required PayloadCodec codec}) { + return codec.decode(params); + } + + /// Decodes the full workflow start-parameter payload as a DTO. + T paramsJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(params); + } + + /// Decodes the full workflow start-parameter payload as a version-aware + /// DTO. + T paramsVersionedJson({ + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + return PayloadCodec.versionedJson( + version: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion ?? defaultVersion, + typeName: typeName, + ).decode(params); + } + + /// Returns the decoded workflow parameter for [key], or `null`. + T? param(String key, {PayloadCodec? codec}) { + return params.value(key, codec: codec); + } + + /// Returns the decoded workflow parameter for [key], or [fallback]. + T paramOr(String key, T fallback, {PayloadCodec? codec}) { + return params.valueOr(key, fallback, codec: codec); + } + + /// Returns the decoded workflow parameter for [key], throwing when absent. + T requiredParam(String key, {PayloadCodec? codec}) { + return params.requiredValue(key, codec: codec); + } + + /// Returns the decoded workflow parameter DTO for [key], or `null`. + T? paramJson( + String key, { + required T Function(Map payload) decode, + String? typeName, + }) { + return params.valueJson( + key, + decode: decode, + typeName: typeName, + ); + } + + /// Returns the decoded workflow parameter DTO for [key], or [fallback]. + T paramJsonOr( + String key, + T fallback, { + required T Function(Map payload) decode, + String? 
typeName, + }) { + return params.valueJsonOr( + key, + fallback, + decode: decode, + typeName: typeName, + ); + } + + /// Returns the decoded workflow parameter DTO for [key], throwing when + /// absent. + T requiredParamJson( + String key, { + required T Function(Map payload) decode, + String? typeName, + }) { + return params.requiredValueJson( + key, + decode: decode, + typeName: typeName, + ); + } + + /// Returns the decoded version-aware workflow parameter DTO for [key], or + /// `null`. + T? paramVersionedJson( + String key, { + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + return params.valueVersionedJson( + key, + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + } + + /// Returns the decoded version-aware workflow parameter DTO for [key], or + /// [fallback]. + T paramVersionedJsonOr( + String key, + T fallback, { + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + return params.valueVersionedJsonOr( + key, + fallback, + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + } + + /// Returns the decoded version-aware workflow parameter DTO for [key], + /// throwing when absent. + T requiredParamVersionedJson( + String key, { + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + return params.requiredValueVersionedJson( + key, + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + } + + /// Returns the decoded workflow parameter DTO list for [key], or `null`. + List? paramListJson( + String key, { + required T Function(Map payload) decode, + String? 
typeName, + }) { + return params.valueListJson( + key, + decode: decode, + typeName: typeName, + ); + } + + /// Returns the decoded workflow parameter DTO list for [key], or [fallback]. + List paramListJsonOr( + String key, + List fallback, { + required T Function(Map payload) decode, + String? typeName, + }) { + return params.valueListJsonOr( + key, + fallback, + decode: decode, + typeName: typeName, + ); + } + + /// Returns the decoded workflow parameter DTO list for [key], throwing when + /// absent. + List requiredParamListJson( + String key, { + required T Function(Map payload) decode, + String? typeName, + }) { + return params.requiredValueListJson( + key, + decode: decode, + typeName: typeName, + ); + } + + /// Returns the decoded version-aware workflow parameter DTO list for [key], + /// or `null`. + List? paramListVersionedJson( + String key, { + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + return params.valueListVersionedJson( + key, + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + } + + /// Returns the decoded version-aware workflow parameter DTO list for [key], + /// or [fallback]. + List paramListVersionedJsonOr( + String key, + List fallback, { + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + return params.valueListVersionedJsonOr( + key, + fallback, + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + } + + /// Returns the decoded version-aware workflow parameter DTO list for [key], + /// throwing when absent. + List requiredParamListVersionedJson( + String key, { + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? 
typeName, + }) { + return params.requiredValueListVersionedJson( + key, + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + } +} + +/// Typed read helpers for prior workflow step and checkpoint values. +extension WorkflowExecutionContextValues on WorkflowExecutionContext { + /// Returns the decoded prior step/checkpoint value as [T], or `null`. + /// + /// When [codec] is supplied, a non-`T` durable payload is decoded through + /// that codec before being returned. + T? previousValue({PayloadCodec? codec}) { + final value = previousResult; + if (value == null) return null; + if (codec != null && value is! T) { + return codec.decodeDynamic(value) as T; + } + return value as T; + } + + /// Returns the decoded prior step/checkpoint value as [T], throwing when the + /// workflow does not yet have a previous result. + T requiredPreviousValue({PayloadCodec? codec}) { + final value = previousValue(codec: codec); + if (value == null) { + throw StateError('WorkflowExecutionContext.previousResult is null.'); + } + return value; + } + + /// Returns the decoded prior step/checkpoint value as a typed DTO, or + /// `null`. + T? previousJson({ + required T Function(Map payload) decode, + String? typeName, + }) { + final value = previousResult; + if (value == null) return null; + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(value); + } + + /// Returns the decoded prior step/checkpoint DTO, or [fallback]. + T previousJsonOr( + T fallback, { + required T Function(Map payload) decode, + String? typeName, + }) { + return previousJson( + decode: decode, + typeName: typeName, + ) ?? + fallback; + } + + /// Returns the decoded prior step/checkpoint DTO, throwing when absent. + T requiredPreviousJson({ + required T Function(Map payload) decode, + String? 
typeName, + }) { + final value = previousJson( + decode: decode, + typeName: typeName, + ); + if (value == null) { + throw StateError('WorkflowExecutionContext.previousResult is null.'); + } + return value; + } + + /// Returns the decoded prior step/checkpoint value as a versioned typed DTO, + /// or `null`. + T? previousVersionedJson({ + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + final value = previousResult; + if (value == null) return null; + return PayloadCodec.versionedJson( + version: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion ?? defaultVersion, + typeName: typeName, + ).decode(value); + } + + /// Returns the decoded prior step/checkpoint versioned DTO, or [fallback]. + T previousVersionedJsonOr( + T fallback, { + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + return previousVersionedJson( + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ) ?? + fallback; + } + + /// Returns the decoded prior step/checkpoint versioned DTO, throwing when + /// absent. + T requiredPreviousVersionedJson({ + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? 
typeName, + }) { + final value = previousVersionedJson( + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + if (value == null) { + throw StateError('WorkflowExecutionContext.previousResult is null.'); + } + return value; + } +} diff --git a/packages/stem/lib/src/workflow/core/workflow_ref.dart b/packages/stem/lib/src/workflow/core/workflow_ref.dart index c6fc2f70..8097d022 100644 --- a/packages/stem/lib/src/workflow/core/workflow_ref.dart +++ b/packages/stem/lib/src/workflow/core/workflow_ref.dart @@ -1,4 +1,6 @@ +import 'package:stem/src/core/payload_codec.dart'; import 'package:stem/src/workflow/core/workflow_cancellation_policy.dart'; +import 'package:stem/src/workflow/core/workflow_result.dart'; /// Typed producer-facing reference to a registered workflow. /// @@ -13,6 +15,215 @@ class WorkflowRef { this.decodeResult, }); + /// Creates a typed workflow reference backed by payload codecs. + factory WorkflowRef.codec({ + required String name, + required PayloadCodec paramsCodec, + PayloadCodec? resultCodec, + TResult Function(Object? payload)? decodeResult, + }) { + return WorkflowRef( + name: name, + encodeParams: (params) => _encodeCodecParams(name, paramsCodec, params), + decodeResult: decodeResult ?? resultCodec?.decode, + ); + } + + /// Creates a typed workflow reference for DTO params that already expose + /// `toJson()`. + factory WorkflowRef.json({ + required String name, + TResult Function(Map payload)? decodeResultJson, + TResult Function(Map payload, int version)? + decodeResultVersionedJson, + int? defaultDecodeVersion, + TResult Function(Object? payload)? decodeResult, + String? paramsTypeName, + String? resultTypeName, + }) { + assert( + decodeResultJson == null || decodeResultVersionedJson == null, + 'Specify either decodeResultJson or decodeResultVersionedJson, not both.', + ); + final resultCodec = + decodeResultVersionedJson != null + ? 
PayloadCodec.versionedJson( + version: defaultDecodeVersion ?? 1, + decode: decodeResultVersionedJson, + defaultDecodeVersion: defaultDecodeVersion, + typeName: resultTypeName ?? '$TResult', + ) + : (decodeResultJson == null + ? null + : PayloadCodec.json( + decode: decodeResultJson, + typeName: resultTypeName ?? '$TResult', + )); + return WorkflowRef( + name: name, + encodeParams: (params) => + _encodeJsonParams(params, paramsTypeName ?? '$TParams'), + decodeResult: decodeResult ?? resultCodec?.decode, + ); + } + + /// Creates a typed workflow reference for DTO params that already expose + /// `toJson()` and persist a schema [version] beside the payload. + factory WorkflowRef.versionedJson({ + required String name, + required int version, + TResult Function(Map payload)? decodeResultJson, + TResult Function(Map payload, int version)? + decodeResultVersionedJson, + int? defaultDecodeVersion, + TResult Function(Object? payload)? decodeResult, + String? paramsTypeName, + String? resultTypeName, + }) { + assert( + decodeResultJson == null || decodeResultVersionedJson == null, + 'Specify either decodeResultJson or decodeResultVersionedJson, not both.', + ); + final resultCodec = + decodeResultVersionedJson != null + ? PayloadCodec.versionedJson( + version: version, + decode: decodeResultVersionedJson, + defaultDecodeVersion: defaultDecodeVersion, + typeName: resultTypeName ?? '$TResult', + ) + : (decodeResultJson == null + ? null + : PayloadCodec.json( + decode: decodeResultJson, + typeName: resultTypeName ?? '$TResult', + )); + return WorkflowRef( + name: name, + encodeParams: (params) => _encodeVersionedJsonParams( + params, + version: version, + typeName: paramsTypeName ?? '$TParams', + ), + decodeResult: decodeResult ?? resultCodec?.decode, + ); + } + + /// Creates a typed workflow reference for DTO params that already expose + /// `toJson()` and decode versioned results through a reusable registry. 
+ factory WorkflowRef.versionedJsonRegistry({ + required String name, + required int version, + required PayloadVersionRegistry resultRegistry, + TResult Function(Object? payload)? decodeResult, + int? defaultDecodeVersion, + String? paramsTypeName, + String? resultTypeName, + }) { + final resultCodec = PayloadCodec.versionedJsonRegistry( + version: version, + registry: resultRegistry, + defaultDecodeVersion: defaultDecodeVersion, + typeName: resultTypeName ?? '$TResult', + ); + return WorkflowRef( + name: name, + encodeParams: (params) => _encodeVersionedJsonParams( + params, + version: version, + typeName: paramsTypeName ?? '$TParams', + ), + decodeResult: decodeResult ?? resultCodec.decode, + ); + } + + /// Creates a typed workflow reference for custom map params that persist a + /// schema [version] beside the payload. + factory WorkflowRef.versionedMap({ + required String name, + required Object? Function(TParams params) encodeParams, + required int version, + TResult Function(Map payload)? decodeResultJson, + TResult Function(Map payload, int version)? + decodeResultVersionedJson, + int? defaultDecodeVersion, + TResult Function(Object? payload)? decodeResult, + String? paramsTypeName, + String? resultTypeName, + }) { + assert( + decodeResultJson == null || decodeResultVersionedJson == null, + 'Specify either decodeResultJson or decodeResultVersionedJson, not both.', + ); + final paramsCodec = PayloadCodec.versionedMap( + encode: encodeParams, + version: version, + decode: (payload, _) => throw UnsupportedError( + 'WorkflowRef.versionedMap($name) only uses the params codec for ' + 'encoding. Decoding is not supported at the ref layer.', + ), + defaultDecodeVersion: defaultDecodeVersion, + typeName: paramsTypeName ?? '$TParams', + ); + final resultCodec = + decodeResultVersionedJson != null + ? PayloadCodec.versionedJson( + version: version, + decode: decodeResultVersionedJson, + defaultDecodeVersion: defaultDecodeVersion, + typeName: resultTypeName ?? 
'$TResult', + ) + : (decodeResultJson == null + ? null + : PayloadCodec.json( + decode: decodeResultJson, + typeName: resultTypeName ?? '$TResult', + )); + return WorkflowRef.codec( + name: name, + paramsCodec: paramsCodec, + resultCodec: resultCodec, + decodeResult: decodeResult, + ); + } + + /// Creates a typed workflow reference for custom map params that persist a + /// schema [version] and decode versioned results through a reusable + /// registry. + factory WorkflowRef.versionedMapRegistry({ + required String name, + required Object? Function(TParams params) encodeParams, + required int version, + required PayloadVersionRegistry resultRegistry, + TResult Function(Object? payload)? decodeResult, + int? defaultDecodeVersion, + String? paramsTypeName, + String? resultTypeName, + }) { + final paramsCodec = PayloadCodec.versionedMap( + encode: encodeParams, + version: version, + decode: (payload, _) => throw UnsupportedError( + 'WorkflowRef.versionedMapRegistry($name) only uses the params codec ' + 'for encoding. Decoding is not supported at the ref layer.', + ), + defaultDecodeVersion: defaultDecodeVersion, + typeName: paramsTypeName ?? '$TParams', + ); + final resultCodec = PayloadCodec.versionedJsonRegistry( + version: version, + registry: resultRegistry, + defaultDecodeVersion: defaultDecodeVersion, + typeName: resultTypeName ?? '$TResult', + ); + return WorkflowRef.codec( + name: name, + paramsCodec: paramsCodec, + resultCodec: resultCodec, + decodeResult: decodeResult, + ); + } + /// Registered workflow name. final String name; @@ -22,20 +233,54 @@ class WorkflowRef { /// Optional decoder for the final workflow result payload. final TResult Function(Object? payload)? decodeResult; - /// Builds a workflow start call from typed arguments. - WorkflowStartCall call( - TParams params, { - String? parentRunId, - Duration? ttl, - WorkflowCancellationPolicy? 
cancellationPolicy,
+  static Map<String, Object?> _encodeCodecParams<T>(
+    String workflowName,
+    PayloadCodec<T> codec,
+    T params,
+  ) {
+    final payload = codec.encode(params);
+    if (payload is Map<String, Object?>) {
+      return Map<String, Object?>.from(payload);
+    }
+    if (payload is Map) {
+      final normalized = <String, Object?>{};
+      for (final entry in payload.entries) {
+        final key = entry.key;
+        if (key is! String) {
+          throw StateError(
+            'WorkflowRef.codec($workflowName) requires payload '
+            'keys to be strings, got ${key.runtimeType}.',
+          );
+        }
+        normalized[key] = entry.value;
+      }
+      return normalized;
+    }
+    throw StateError(
+      'WorkflowRef.codec($workflowName) must encode params to '
+      'Map<String, Object?>, got ${payload.runtimeType}.',
+    );
+  }
+
+  static Map<String, Object?> _encodeJsonParams<T>(T params, String typeName) {
+    final payload = PayloadCodec.encodeJsonMap(
+      params,
+      typeName: typeName,
+    );
+    return Map<String, Object?>.from(payload);
+  }
+
+  static Map<String, Object?> _encodeVersionedJsonParams<T>(
+    T params, {
+    required int version,
+    required String typeName,
   }) {
-    return WorkflowStartCall._(
-      definition: this,
-      params: params,
-      parentRunId: parentRunId,
-      ttl: ttl,
-      cancellationPolicy: cancellationPolicy,
+    final payload = PayloadCodec.encodeVersionedJsonMap(
+      params,
+      version: version,
+      typeName: typeName,
     );
+    return Map<String, Object?>.from(payload);
   }
 
   /// Decodes a final workflow result payload.
@@ -49,6 +294,170 @@
     }
     return payload as TResult;
   }
+
+  /// Starts this workflow ref directly with [caller] using named args.
+  Future<String> start(
+    WorkflowCaller caller, {
+    required TParams params,
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+  }) {
+    return caller.startWorkflowCall(
+      buildStart(
+        params: params,
+        parentRunId: parentRunId,
+        ttl: ttl,
+        cancellationPolicy: cancellationPolicy,
+      ),
+    );
+  }
+
+  /// Starts this workflow ref with [caller] and waits for the result using
+  /// named args.
+  Future<WorkflowResult<TResult>?> startAndWait(
+    WorkflowCaller caller, {
+    required TParams params,
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+    Duration pollInterval = const Duration(milliseconds: 100),
+    Duration? timeout,
+  }) {
+    final call = buildStart(
+      params: params,
+      parentRunId: parentRunId,
+      ttl: ttl,
+      cancellationPolicy: cancellationPolicy,
+    );
+    return caller.startWorkflowCall(call).then((runId) {
+      return call.definition.waitFor(
+        caller,
+        runId,
+        pollInterval: pollInterval,
+        timeout: timeout,
+      );
+    });
+  }
+
+  /// Builds an explicit [WorkflowStartCall] for this workflow ref.
+  WorkflowStartCall<TParams, TResult> buildStart({
+    required TParams params,
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+  }) {
+    return WorkflowStartCall._(
+      definition: this,
+      params: params,
+      parentRunId: parentRunId,
+      ttl: ttl,
+      cancellationPolicy: cancellationPolicy,
+    );
+  }
+}
+
+/// Typed producer-facing reference for workflows that take no input params.
+class NoArgsWorkflowRef<TResult> {
+  /// Creates a typed workflow reference for workflows without input params.
+  const NoArgsWorkflowRef({required this.name, this.decodeResult});
+
+  /// Registered workflow name.
+  final String name;
+
+  /// Optional decoder for the final workflow result payload.
+  final TResult Function(Object? payload)? decodeResult;
+
+  WorkflowRef<(), TResult> get _inner => WorkflowRef<(), TResult>(
+        name: name,
+        encodeParams: _encodeParams,
+        decodeResult: decodeResult,
+      );
+
+  /// Returns the underlying typed workflow ref used for waiting and dispatch.
+  WorkflowRef<(), TResult> get asRef => _inner;
+
+  static Map<String, Object?> _encodeParams(() _) => const {};
+
+  /// Starts this workflow ref directly with [caller].
+  Future<String> start(
+    WorkflowCaller caller, {
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+  }) {
+    return asRef.start(
+      caller,
+      params: (),
+      parentRunId: parentRunId,
+      ttl: ttl,
+      cancellationPolicy: cancellationPolicy,
+    );
+  }
+
+  /// Starts this workflow ref with [caller] and waits for the result.
+ Future?> startAndWait( + WorkflowCaller caller, { + String? parentRunId, + Duration? ttl, + WorkflowCancellationPolicy? cancellationPolicy, + Duration pollInterval = const Duration(milliseconds: 100), + Duration? timeout, + }) { + return asRef.startAndWait( + parentRunId: parentRunId, + ttl: ttl, + cancellationPolicy: cancellationPolicy, + caller, + params: (), + pollInterval: pollInterval, + timeout: timeout, + ); + } + + /// Decodes a final workflow result payload. + TResult decode(Object? payload) => asRef.decode(payload); + + /// Waits for [runId] using this workflow reference's decode rules. + Future?> waitFor( + WorkflowCaller caller, + String runId, { + Duration pollInterval = const Duration(milliseconds: 100), + Duration? timeout, + }) { + return asRef.waitFor( + caller, + runId, + pollInterval: pollInterval, + timeout: timeout, + ); + } +} + +/// Shared typed workflow-start surface used by apps, runtimes, and contexts. +abstract interface class WorkflowCaller { + /// Starts a workflow from a typed [WorkflowRef]. + Future startWorkflowRef( + WorkflowRef definition, + TParams params, { + String? parentRunId, + Duration? ttl, + WorkflowCancellationPolicy? cancellationPolicy, + }); + + /// Starts a workflow from a prebuilt [WorkflowStartCall]. + Future startWorkflowCall( + WorkflowStartCall call, + ); + + /// Waits for [runId] using the decoding rules from a [WorkflowRef]. + Future?> + waitForWorkflowRef( + String runId, + WorkflowRef definition, { + Duration pollInterval = const Duration(milliseconds: 100), + Duration? timeout, + }); } /// Typed start request built from a [WorkflowRef]. @@ -81,4 +490,25 @@ class WorkflowStartCall { /// Encodes typed parameters into the workflow parameter map. Map encodeParams() => definition.encodeParams(params); + +} + +/// Convenience helpers for waiting on typed workflow refs using a generic +/// [WorkflowCaller]. 
+extension WorkflowRefExtension + on WorkflowRef { + /// Waits for [runId] using this workflow reference's decode rules. + Future?> waitFor( + WorkflowCaller caller, + String runId, { + Duration pollInterval = const Duration(milliseconds: 100), + Duration? timeout, + }) { + return caller.waitForWorkflowRef( + runId, + this, + pollInterval: pollInterval, + timeout: timeout, + ); + } } diff --git a/packages/stem/lib/src/workflow/core/workflow_result.dart b/packages/stem/lib/src/workflow/core/workflow_result.dart index 486d2928..00efee93 100644 --- a/packages/stem/lib/src/workflow/core/workflow_result.dart +++ b/packages/stem/lib/src/workflow/core/workflow_result.dart @@ -1,4 +1,5 @@ import 'package:stem/src/bootstrap/workflow_app.dart' show StemWorkflowApp; +import 'package:stem/src/core/payload_codec.dart'; import 'package:stem/src/workflow/core/run_state.dart'; import 'package:stem/src/workflow/core/workflow_status.dart'; import 'package:stem/stem.dart' show StemWorkflowApp; @@ -46,6 +47,58 @@ class WorkflowResult { /// run completed successfully. final T? value; + /// Returns [value] or [fallback] when the workflow has no decoded result. + T valueOr(T fallback) => value ?? fallback; + + /// Returns the decoded value, throwing when it is absent. + T requiredValue() { + final resolved = value; + if (resolved == null) { + throw StateError( + "Workflow run '$runId' does not have a decoded result value.", + ); + } + return resolved; + } + + /// Decodes the raw persisted workflow result with [codec]. + TResult? payloadAs({required PayloadCodec codec}) { + final stored = rawResult; + if (stored == null) return null; + return codec.decode(stored); + } + + /// Decodes the raw persisted workflow result with a JSON decoder. + TResult? payloadJson({ + required TResult Function(Map payload) decode, + String? 
typeName, + }) { + final stored = rawResult; + if (stored == null) return null; + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(stored); + } + + /// Decodes the raw persisted workflow result with a version-aware JSON + /// decoder. + TResult? payloadVersionedJson({ + required int version, + required TResult Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + final stored = rawResult; + if (stored == null) return null; + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(stored); + } + /// Untyped payload stored by the workflow, useful for legacy consumers or /// debugging scenarios. final Object? rawResult; diff --git a/packages/stem/lib/src/workflow/core/workflow_resume.dart b/packages/stem/lib/src/workflow/core/workflow_resume.dart index 8e2c7f18..94a2ee14 100644 --- a/packages/stem/lib/src/workflow/core/workflow_resume.dart +++ b/packages/stem/lib/src/workflow/core/workflow_resume.dart @@ -1,9 +1,22 @@ +import 'dart:async'; + import 'package:stem/src/core/payload_codec.dart'; import 'package:stem/src/workflow/core/flow_context.dart'; -import 'package:stem/src/workflow/core/workflow_script_context.dart'; +import 'package:stem/src/workflow/core/flow_step.dart'; +import 'package:stem/src/workflow/core/workflow_event_ref.dart'; +import 'package:stem/src/workflow/core/workflow_resume_context.dart'; + +/// Internal control-flow signal used by expression-style suspension helpers. +/// +/// The workflow runtime catches this after a helper has already scheduled the +/// corresponding sleep or event wait directive. User code should not catch it. +class WorkflowSuspensionSignal implements Exception { + /// Creates a durable suspension control-flow signal. + const WorkflowSuspensionSignal(); +} /// Typed resume helpers for durable workflow suspensions. 
-extension FlowContextResumeValues on FlowContext { +extension WorkflowResumeContextValues on WorkflowResumeContext { /// Returns the next resume payload as [T] and consumes it. /// /// When [codec] is provided, the stored durable payload is decoded through @@ -14,18 +27,270 @@ extension FlowContextResumeValues on FlowContext { if (codec != null) return codec.decodeDynamic(payload) as T; return payload as T; } -} -/// Typed resume helpers for durable script checkpoints. -extension WorkflowScriptStepResumeValues on WorkflowScriptStepContext { - /// Returns the next resume payload as [T] and consumes it. - /// - /// When [codec] is provided, the stored durable payload is decoded through - /// that codec before being returned. - T? takeResumeValue({PayloadCodec? codec}) { + /// Returns the next resume payload as a typed DTO and consumes it. + T? takeResumeJson({ + required T Function(Map payload) decode, + String? typeName, + }) { final payload = takeResumeData(); if (payload == null) return null; - if (codec != null) return codec.decodeDynamic(payload) as T; - return payload as T; + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(payload); + } + + /// Returns the next resume payload as a versioned typed DTO and consumes it. + T? takeResumeVersionedJson({ + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + int? defaultDecodeVersion, + String? typeName, + }) { + final payload = takeResumeData(); + if (payload == null) return null; + return PayloadCodec.versionedJson( + version: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion ?? defaultVersion, + typeName: typeName, + ).decode(payload); + } + + /// Suspends the current step on the first invocation and + /// returns `true` once the step resumes. 
+  ///
+  /// This helper is for the common:
+  ///
+  /// ```dart
+  /// if (ctx.takeResumeData() == null) {
+  ///   ctx.sleep(duration);
+  ///   return null;
+  /// }
+  /// ```
+  ///
+  /// pattern.
+  bool sleepUntilResumed(
+    Duration duration, {
+    Map<String, Object?>? data,
+  }) {
+    final resume = takeResumeData();
+    if (resume != null) {
+      return true;
+    }
+    final pending = suspendFor(duration, data: data);
+    if (pending is Future) {
+      unawaited(pending);
+    }
+    return false;
+  }
+
+  /// Suspends once for [duration] and resumes by replaying the same step.
+  ///
+  /// This enables expression-style flow logic:
+  ///
+  /// ```dart
+  /// await ctx.sleepFor(duration: const Duration(seconds: 1));
+  /// ```
+  Future<void> sleepFor({
+    required Duration duration,
+    Map<String, Object?>? data,
+  }) async {
+    final resume = takeResumeData();
+    if (resume != null) {
+      return;
+    }
+    await suspendFor(duration, data: data);
+    throw const WorkflowSuspensionSignal();
+  }
+
+  /// Returns the next event payload as [T] when the step has resumed, or
+  /// registers an event wait and returns `null` on the first invocation.
+  ///
+  /// This helper is for the common:
+  ///
+  /// ```dart
+  /// final payload = ctx.takeResumeValue(codec: codec);
+  /// if (payload == null) {
+  ///   ctx.awaitEvent(topic);
+  ///   return null;
+  /// }
+  /// ```
+  ///
+  /// pattern.
+  T? waitForEventValue<T>(
+    String topic, {
+    DateTime? deadline,
+    Map<String, Object?>? data,
+    PayloadCodec<T>? codec,
+  }) {
+    final payload = takeResumeValue(codec: codec);
+    if (payload != null) {
+      return payload;
+    }
+    final pending = waitForTopic(topic, deadline: deadline, data: data);
+    if (pending is Future) {
+      unawaited(pending);
+    }
+    return null;
+  }
+
+  /// Returns the next event payload as a typed DTO when the step has resumed,
+  /// or registers an event wait and returns `null` on the first invocation.
+  T? waitForEventValueJson<T>(
+    String topic, {
+    required T Function(Map<String, Object?> payload) decode,
+    DateTime? deadline,
+    Map<String, Object?>? data,
+    String?
typeName, + }) { + final payload = takeResumeJson( + decode: decode, + typeName: typeName, + ); + if (payload != null) { + return payload; + } + final pending = waitForTopic(topic, deadline: deadline, data: data); + if (pending is Future) { + unawaited(pending); + } + return null; + } + + /// Returns the next event payload as a versioned typed DTO when the step has + /// resumed, or registers an event wait and returns `null` on the first + /// invocation. + T? waitForEventValueVersionedJson( + String topic, { + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + DateTime? deadline, + Map? data, + int? defaultDecodeVersion, + String? typeName, + }) { + final payload = takeResumeVersionedJson( + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + if (payload != null) { + return payload; + } + final pending = waitForTopic(topic, deadline: deadline, data: data); + if (pending is Future) { + unawaited(pending); + } + return null; + } + + /// Suspends until [topic] is emitted, then returns the resumed payload. + Future waitForEvent({ + required String topic, + DateTime? deadline, + Map? data, + PayloadCodec? codec, + }) async { + final payload = takeResumeValue(codec: codec); + if (payload != null) { + return payload; + } + await waitForTopic(topic, deadline: deadline, data: data); + throw const WorkflowSuspensionSignal(); + } + + /// Suspends until [topic] is emitted, then returns the resumed DTO payload. + Future waitForEventJson({ + required String topic, + required T Function(Map payload) decode, + DateTime? deadline, + Map? data, + String? 
typeName, + }) async { + final payload = takeResumeJson( + decode: decode, + typeName: typeName, + ); + if (payload != null) { + return payload; + } + await waitForTopic(topic, deadline: deadline, data: data); + throw const WorkflowSuspensionSignal(); + } + + /// Suspends until [topic] is emitted, then returns the resumed versioned DTO + /// payload. + Future waitForEventVersionedJson({ + required String topic, + required T Function(Map payload, int version) decode, + int defaultVersion = 1, + DateTime? deadline, + Map? data, + int? defaultDecodeVersion, + String? typeName, + }) async { + final payload = takeResumeVersionedJson( + defaultVersion: defaultVersion, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ); + if (payload != null) { + return payload; + } + await waitForTopic(topic, deadline: deadline, data: data); + throw const WorkflowSuspensionSignal(); + } +} + +/// Direct typed wait helpers on [WorkflowEventRef]. +/// +/// These mirror `event.emit(...)` so typed workflow events can stay on the +/// event-ref surface for both emit and wait paths. +extension WorkflowEventRefWaitExtension on WorkflowEventRef { + /// Registers a low-level flow-control wait while keeping the typed event ref + /// on the call site. + FlowStepControl awaitOn( + FlowContext step, { + DateTime? deadline, + Map? data, + }) { + return step.awaitEvent( + topic, + deadline: deadline, + data: data, + ); + } + + /// Registers an event wait and returns the resumed payload on the legacy + /// null-then-resume path. + T? waitValue( + WorkflowResumeContext waiter, { + DateTime? deadline, + Map? data, + }) { + return waiter.waitForEventValue( + topic, + deadline: deadline, + data: data, + codec: codec, + ); + } + + /// Suspends until this event is emitted, then returns the decoded payload. + Future wait( + WorkflowResumeContext waiter, { + DateTime? deadline, + Map? 
data, + }) { + return waiter.waitForEvent( + topic: topic, + deadline: deadline, + data: data, + codec: codec, + ); } } diff --git a/packages/stem/lib/src/workflow/core/workflow_resume_context.dart b/packages/stem/lib/src/workflow/core/workflow_resume_context.dart new file mode 100644 index 00000000..cc1af913 --- /dev/null +++ b/packages/stem/lib/src/workflow/core/workflow_resume_context.dart @@ -0,0 +1,24 @@ +import 'dart:async'; + +/// Shared suspension/resume surface implemented by flow steps and script +/// checkpoints. +/// +/// This keeps typed event wait helpers on a single workflow-facing capability +/// instead of accepting an erased `Object` and branching at runtime. +abstract interface class WorkflowResumeContext { + /// Returns and clears the resume payload supplied by the runtime. + Object? takeResumeData(); + + /// Schedules a durable wake-up after [duration]. + FutureOr suspendFor( + Duration duration, { + Map? data, + }); + + /// Suspends until [topic] is emitted. + FutureOr waitForTopic( + String topic, { + DateTime? deadline, + Map? data, + }); +} diff --git a/packages/stem/lib/src/workflow/core/workflow_runtime_metadata.dart b/packages/stem/lib/src/workflow/core/workflow_runtime_metadata.dart index ff9e15a9..24b9db3f 100644 --- a/packages/stem/lib/src/workflow/core/workflow_runtime_metadata.dart +++ b/packages/stem/lib/src/workflow/core/workflow_runtime_metadata.dart @@ -3,6 +3,9 @@ import 'dart:collection'; /// Reserved params key storing internal runtime metadata for workflow runs. const String workflowRuntimeMetadataParamKey = '__stem.workflow.runtime'; +/// Reserved params key storing the parent workflow run identifier. +const String workflowParentRunIdParamKey = '__stem.workflow.parentRunId'; + /// Logical channel used by workflow-related task enqueues. enum WorkflowChannelKind { /// Orchestration channel used by workflow continuation tasks. 
@@ -51,36 +54,30 @@ class WorkflowRunRuntimeMetadata {
   factory WorkflowRunRuntimeMetadata.fromJson(Map<String, Object?> json) {
     return WorkflowRunRuntimeMetadata(
       workflowId: json['workflowId']?.toString() ?? '',
-      orchestrationQueue:
-          json['orchestrationQueue']?.toString().trim().isNotEmpty == true
-              ? json['orchestrationQueue']!.toString().trim()
-              : 'workflow',
-      continuationQueue:
-          json['continuationQueue']?.toString().trim().isNotEmpty == true
-              ? json['continuationQueue']!.toString().trim()
-              : 'workflow',
-      executionQueue:
-          json['executionQueue']?.toString().trim().isNotEmpty == true
-              ? json['executionQueue']!.toString().trim()
-              : 'default',
-      serializationFormat:
-          json['serializationFormat']?.toString().trim().isNotEmpty == true
-              ? json['serializationFormat']!.toString().trim()
-              : 'json',
-      serializationVersion:
-          json['serializationVersion']?.toString().trim().isNotEmpty == true
-              ? json['serializationVersion']!.toString().trim()
-              : '1',
-      frameFormat: json['frameFormat']?.toString().trim().isNotEmpty == true
-          ? json['frameFormat']!.toString().trim()
-          : 'json-frame',
-      frameVersion: json['frameVersion']?.toString().trim().isNotEmpty == true
-          ? json['frameVersion']!.toString().trim()
-          : '1',
-      encryptionScope:
-          json['encryptionScope']?.toString().trim().isNotEmpty == true
-              ? json['encryptionScope']!.toString().trim()
-              : 'none',
+      orchestrationQueue: _stringOrDefault(
+        json,
+        'orchestrationQueue',
+        'workflow',
+      ),
+      continuationQueue: _stringOrDefault(
+        json,
+        'continuationQueue',
+        'workflow',
+      ),
+      executionQueue: _stringOrDefault(json, 'executionQueue', 'default'),
+      serializationFormat: _stringOrDefault(
+        json,
+        'serializationFormat',
+        'json',
+      ),
+      serializationVersion: _stringOrDefault(
+        json,
+        'serializationVersion',
+        '1',
+      ),
+      frameFormat: _stringOrDefault(json, 'frameFormat', 'json-frame'),
+      frameVersion: _stringOrDefault(json, 'frameVersion', '1'),
+      encryptionScope: _stringOrDefault(json, 'encryptionScope', 'none'),
       encryptionEnabled: json['encryptionEnabled'] == true,
       streamId: json['streamId']?.toString(),
     );
@@ -151,20 +148,38 @@
   }
 
   /// Returns a new params map containing this metadata under the reserved key.
-  Map<String, Object?> attachToParams(Map<String, Object?> params) {
+  Map<String, Object?> attachToParams(
+    Map<String, Object?> params, {
+    String? parentRunId,
+  }) {
     return Map.unmodifiable({
       ...params,
       workflowRuntimeMetadataParamKey: toJson(),
+      if (parentRunId != null && parentRunId.isNotEmpty)
+        workflowParentRunIdParamKey: parentRunId,
     });
   }
 
   /// Returns params without internal runtime metadata.
   static Map<String, Object?> stripFromParams(Map<String, Object?> params) {
     if (!params.containsKey(workflowRuntimeMetadataParamKey)) {
-      return Map.unmodifiable(params);
+      if (!params.containsKey(workflowParentRunIdParamKey)) {
+        return Map.unmodifiable(params);
+      }
     }
     final copy = Map<String, Object?>.from(params)
-      ..remove(workflowRuntimeMetadataParamKey);
+      ..remove(workflowRuntimeMetadataParamKey)
+      ..remove(workflowParentRunIdParamKey);
     return UnmodifiableMapView(copy);
   }
 }
+
+String _stringOrDefault(
+  Map<String, Object?> json,
+  String key,
+  String fallback,
+) {
+  final raw = json[key]?.toString().trim();
+  if (raw == null || raw.isEmpty) return fallback;
+  return raw;
+}
diff --git a/packages/stem/lib/src/workflow/core/workflow_script.dart b/packages/stem/lib/src/workflow/core/workflow_script.dart
index 467885de..2361b593 100644
--- a/packages/stem/lib/src/workflow/core/workflow_script.dart
+++ b/packages/stem/lib/src/workflow/core/workflow_script.dart
@@ -1,6 +1,9 @@
 import 'package:stem/src/core/payload_codec.dart';
-import 'package:stem/src/workflow/core/flow_step.dart';
+import 'package:stem/src/workflow/core/workflow_cancellation_policy.dart';
+import 'package:stem/src/workflow/core/workflow_checkpoint.dart';
 import 'package:stem/src/workflow/core/workflow_definition.dart';
+import 'package:stem/src/workflow/core/workflow_ref.dart';
+import 'package:stem/src/workflow/core/workflow_result.dart';
 
 /// High-level workflow facade that allows scripts to be authored as a single
 /// async function using `step`, `sleep`, and `awaitEvent` helpers.
@@ -13,23 +16,365 @@ class WorkflowScript<T> {
   WorkflowScript({
     required String name,
     required WorkflowScriptBody<T> run,
-    Iterable steps = const [],
-    Iterable checkpoints = const [],
+    Iterable<WorkflowCheckpoint> checkpoints = const [],
     String? version,
     String? description,
     Map<String, Object?>? metadata,
     PayloadCodec<T>? resultCodec,
+    T Function(Map<String, Object?> payload)? decodeResultJson,
+    String?
    resultTypeName,
  }) : definition = WorkflowDefinition.script(
         name: name,
         run: run,
-        steps: steps,
         checkpoints: checkpoints,
         version: version,
         description: description,
         metadata: metadata,
         resultCodec: resultCodec,
+        decodeResultJson: decodeResultJson,
+        resultTypeName: resultTypeName,
       );

+  /// Creates a script definition whose final result uses a custom payload
+  /// codec.
+  factory WorkflowScript.codec({
+    required String name,
+    required WorkflowScriptBody run,
+    required PayloadCodec resultCodec,
+    Iterable checkpoints = const [],
+    String? version,
+    String? description,
+    Map? metadata,
+  }) {
+    return WorkflowScript(
+      name: name,
+      run: run,
+      checkpoints: checkpoints,
+      version: version,
+      description: description,
+      metadata: metadata,
+      resultCodec: resultCodec,
+    );
+  }
+
+  /// Creates a script definition whose final result is a DTO-backed JSON
+  /// value.
+  factory WorkflowScript.json({
+    required String name,
+    required WorkflowScriptBody run,
+    required T Function(Map payload) decodeResult,
+    Iterable checkpoints = const [],
+    String? version,
+    String? description,
+    Map? metadata,
+    String? resultTypeName,
+  }) {
+    return WorkflowScript(
+      name: name,
+      run: run,
+      checkpoints: checkpoints,
+      version: version,
+      description: description,
+      metadata: metadata,
+      decodeResultJson: decodeResult,
+      resultTypeName: resultTypeName,
+    );
+  }
+
+  /// Creates a script definition whose final result is a versioned DTO-backed
+  /// JSON value.
+  factory WorkflowScript.versionedJson({
+    required String name,
+    required WorkflowScriptBody run,
+    required int version,
+    required T Function(Map payload, int version) decodeResult,
+    Iterable checkpoints = const [],
+    String? workflowVersion,
+    String? description,
+    Map? metadata,
+    int? defaultDecodeVersion,
+    String? resultTypeName,
+  }) {
+    return WorkflowScript(
+      name: name,
+      run: run,
+      checkpoints: checkpoints,
+      version: workflowVersion,
+      description: description,
+      metadata: metadata,
+      resultCodec: PayloadCodec.versionedJson(
+        version: version,
+        decode: decodeResult,
+        defaultDecodeVersion: defaultDecodeVersion,
+        typeName: resultTypeName ?? '$T',
+      ),
+    );
+  }
+
+  /// Creates a script definition whose final result uses a reusable version
+  /// registry.
+  factory WorkflowScript.versionedJsonRegistry({
+    required String name,
+    required WorkflowScriptBody run,
+    required int version,
+    required PayloadVersionRegistry resultRegistry,
+    Iterable checkpoints = const [],
+    String? workflowVersion,
+    String? description,
+    Map? metadata,
+    int? defaultDecodeVersion,
+    String? resultTypeName,
+  }) {
+    return WorkflowScript(
+      name: name,
+      run: run,
+      checkpoints: checkpoints,
+      version: workflowVersion,
+      description: description,
+      metadata: metadata,
+      resultCodec: PayloadCodec.versionedJsonRegistry(
+        version: version,
+        registry: resultRegistry,
+        defaultDecodeVersion: defaultDecodeVersion,
+        typeName: resultTypeName ?? '$T',
+      ),
+    );
+  }
+
+  /// Creates a script definition whose final result is a versioned custom map
+  /// payload.
+  factory WorkflowScript.versionedMap({
+    required String name,
+    required WorkflowScriptBody run,
+    required Object? Function(T value) encodeResult,
+    required int version,
+    required T Function(Map payload, int version) decodeResult,
+    Iterable checkpoints = const [],
+    String? workflowVersion,
+    String? description,
+    Map? metadata,
+    int? defaultDecodeVersion,
+    String? resultTypeName,
+  }) {
+    return WorkflowScript(
+      name: name,
+      run: run,
+      checkpoints: checkpoints,
+      version: workflowVersion,
+      description: description,
+      metadata: metadata,
+      resultCodec: PayloadCodec.versionedMap(
+        encode: encodeResult,
+        version: version,
+        decode: decodeResult,
+        defaultDecodeVersion: defaultDecodeVersion,
+        typeName: resultTypeName ?? '$T',
+      ),
+    );
+  }
+
+  /// Creates a script definition whose final result is a versioned custom map
+  /// payload decoded through a reusable registry.
+  factory WorkflowScript.versionedMapRegistry({
+    required String name,
+    required WorkflowScriptBody run,
+    required Object? Function(T value) encodeResult,
+    required int version,
+    required PayloadVersionRegistry resultRegistry,
+    Iterable checkpoints = const [],
+    String? workflowVersion,
+    String? description,
+    Map? metadata,
+    int? defaultDecodeVersion,
+    String? resultTypeName,
+  }) {
+    return WorkflowScript(
+      name: name,
+      run: run,
+      checkpoints: checkpoints,
+      version: workflowVersion,
+      description: description,
+      metadata: metadata,
+      resultCodec: PayloadCodec.versionedMapRegistry(
+        encode: encodeResult,
+        version: version,
+        registry: resultRegistry,
+        defaultDecodeVersion: defaultDecodeVersion,
+        typeName: resultTypeName ?? '$T',
+      ),
+    );
+  }
+
   /// The constructed workflow definition.
   final WorkflowDefinition definition;
+
+  /// Builds a typed [WorkflowRef] using this script's registered workflow name
+  /// and result decoder.
+  WorkflowRef ref({
+    required Map Function(TParams params) encodeParams,
+  }) {
+    return definition.ref(encodeParams: encodeParams);
+  }
+
+  /// Builds a typed [WorkflowRef] backed by a DTO [paramsCodec].
+  WorkflowRef refCodec({
+    required PayloadCodec paramsCodec,
+  }) {
+    return definition.refCodec(paramsCodec: paramsCodec);
+  }
+
+  /// Builds a typed [WorkflowRef] for DTO params that already expose
+  /// `toJson()`.
+  WorkflowRef refJson({
+    T Function(Map payload)? decodeResultJson,
+    T Function(Map payload, int version)?
+    decodeResultVersionedJson,
+    int? defaultDecodeVersion,
+    String? paramsTypeName,
+    String? resultTypeName,
+  }) {
+    return definition.refJson(
+      decodeResultJson: decodeResultJson,
+      decodeResultVersionedJson: decodeResultVersionedJson,
+      defaultDecodeVersion: defaultDecodeVersion,
+      paramsTypeName: paramsTypeName,
+      resultTypeName: resultTypeName,
+    );
+  }
+
+  /// Builds a typed [WorkflowRef] for DTO params that already expose
+  /// `toJson()` and persist a schema [version] beside the payload.
+  WorkflowRef refVersionedJson({
+    required int version,
+    T Function(Map payload)? decodeResultJson,
+    T Function(Map payload, int version)?
+    decodeResultVersionedJson,
+    int? defaultDecodeVersion,
+    String? paramsTypeName,
+    String? resultTypeName,
+  }) {
+    return definition.refVersionedJson(
+      version: version,
+      decodeResultJson: decodeResultJson,
+      decodeResultVersionedJson: decodeResultVersionedJson,
+      defaultDecodeVersion: defaultDecodeVersion,
+      paramsTypeName: paramsTypeName,
+      resultTypeName: resultTypeName,
+    );
+  }
+
+  /// Builds a typed [WorkflowRef] for DTO params that already expose
+  /// `toJson()` and decode versioned results through a reusable registry.
+  WorkflowRef refVersionedJsonRegistry({
+    required int version,
+    required PayloadVersionRegistry resultRegistry,
+    int? defaultDecodeVersion,
+    String? paramsTypeName,
+    String? resultTypeName,
+  }) {
+    return definition.refVersionedJsonRegistry(
+      version: version,
+      resultRegistry: resultRegistry,
+      defaultDecodeVersion: defaultDecodeVersion,
+      paramsTypeName: paramsTypeName,
+      resultTypeName: resultTypeName,
+    );
+  }
+
+  /// Builds a typed [WorkflowRef] for custom map params that persist a schema
+  /// [version] beside the payload.
+  WorkflowRef refVersionedMap({
+    required Object? Function(TParams params) encodeParams,
+    required int version,
+    T Function(Map payload)? decodeResultJson,
+    T Function(Map payload, int version)?
+    decodeResultVersionedJson,
+    int? defaultDecodeVersion,
+    String? paramsTypeName,
+    String? resultTypeName,
+  }) {
+    return definition.refVersionedMap(
+      encodeParams: encodeParams,
+      version: version,
+      decodeResultJson: decodeResultJson,
+      decodeResultVersionedJson: decodeResultVersionedJson,
+      defaultDecodeVersion: defaultDecodeVersion,
+      paramsTypeName: paramsTypeName,
+      resultTypeName: resultTypeName,
+    );
+  }
+
+  /// Builds a typed [WorkflowRef] for custom map params that persist a schema
+  /// [version] and decode versioned results through a reusable registry.
+  WorkflowRef refVersionedMapRegistry({
+    required Object? Function(TParams params) encodeParams,
+    required int version,
+    required PayloadVersionRegistry resultRegistry,
+    int? defaultDecodeVersion,
+    String? paramsTypeName,
+    String? resultTypeName,
+  }) {
+    return definition.refVersionedMapRegistry(
+      encodeParams: encodeParams,
+      version: version,
+      resultRegistry: resultRegistry,
+      defaultDecodeVersion: defaultDecodeVersion,
+      paramsTypeName: paramsTypeName,
+      resultTypeName: resultTypeName,
+    );
+  }
+
+  /// Builds a typed [NoArgsWorkflowRef] for scripts without start params.
+  NoArgsWorkflowRef ref0() {
+    return definition.ref0();
+  }
+
+  /// Starts this script directly when it does not accept start params.
+  Future start(
+    WorkflowCaller caller, {
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+  }) {
+    return ref0().start(
+      caller,
+      parentRunId: parentRunId,
+      ttl: ttl,
+      cancellationPolicy: cancellationPolicy,
+    );
+  }
+
+  /// Starts this script directly and waits for completion.
+  Future?> startAndWait(
+    WorkflowCaller caller, {
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+    Duration pollInterval = const Duration(milliseconds: 100),
+    Duration? timeout,
+  }) {
+    return ref0().startAndWait(
+      caller,
+      parentRunId: parentRunId,
+      ttl: ttl,
+      cancellationPolicy: cancellationPolicy,
+      pollInterval: pollInterval,
+      timeout: timeout,
+    );
+  }
+
+  /// Waits for [runId] using this script's result decoding rules.
+  Future?> waitFor(
+    WorkflowCaller caller,
+    String runId, {
+    Duration pollInterval = const Duration(milliseconds: 100),
+    Duration? timeout,
+  }) {
+    return ref0().waitFor(
+      caller,
+      runId,
+      pollInterval: pollInterval,
+      timeout: timeout,
+    );
+  }
 }
diff --git a/packages/stem/lib/src/workflow/core/workflow_script_context.dart b/packages/stem/lib/src/workflow/core/workflow_script_context.dart
index 420495f1..214edfbb 100644
--- a/packages/stem/lib/src/workflow/core/workflow_script_context.dart
+++ b/packages/stem/lib/src/workflow/core/workflow_script_context.dart
@@ -1,7 +1,13 @@
 import 'dart:async';

 import 'package:stem/src/core/contracts.dart';
+import 'package:stem/src/core/payload_codec.dart';
+import 'package:stem/src/core/payload_map.dart';
 import 'package:stem/src/workflow/core/flow_context.dart' show FlowContext;
+import 'package:stem/src/workflow/core/workflow_cancellation_policy.dart';
+import 'package:stem/src/workflow/core/workflow_execution_context.dart';
+import 'package:stem/src/workflow/core/workflow_ref.dart';
+import 'package:stem/src/workflow/core/workflow_result.dart';

 /// Runtime context exposed to workflow scripts. Implementations are provided by
 /// the workflow runtime so scripts can execute with durable semantics.
@@ -16,8 +22,9 @@ abstract class WorkflowScriptContext {
   /// Parameters supplied when the workflow was started.
   Map get params;

-  /// Invokes or replays a workflow step. The provided [handler] persists its
-  /// return value and the resolved value is replayed on subsequent runs.
+  /// Invokes or replays a workflow checkpoint. The provided [handler]
+  /// persists its return value and the resolved value is replayed on
+  /// subsequent runs.
   Future step(
     String name,
     FutureOr Function(WorkflowScriptStepContext context) handler, {
@@ -25,32 +32,354 @@ abstract class WorkflowScriptContext {
   });
 }

-/// Context provided to each script step invocation. Mirrors [FlowContext] but
-/// tailored for the facade helpers.
-abstract class WorkflowScriptStepContext {
+/// Low-level suspension helpers for workflow script checkpoints.
+extension WorkflowScriptStepSuspensionJson on WorkflowScriptStepContext {
+  /// Suspends the workflow for [duration] with a JSON-serializable DTO payload.
+  Future sleepJson(Duration duration, T value, {String? typeName}) {
+    return sleep(
+      duration,
+      data: Map.from(
+        PayloadCodec.encodeJsonMap(value, typeName: typeName),
+      ),
+    );
+  }
+
+  /// Suspends the workflow for [duration] with a versioned DTO payload.
+  Future sleepVersionedJson(
+    Duration duration,
+    T value, {
+    required int version,
+    String? typeName,
+  }) {
+    return sleep(
+      duration,
+      data: Map.from(
+        PayloadCodec.encodeVersionedJsonMap(
+          value,
+          version: version,
+          typeName: typeName,
+        ),
+      ),
+    );
+  }
+
+  /// Suspends the workflow until [topic] arrives with a DTO payload.
+  Future awaitEventJson(
+    String topic,
+    T value, {
+    DateTime? deadline,
+    String? typeName,
+  }) {
+    return awaitEvent(
+      topic,
+      deadline: deadline,
+      data: Map.from(
+        PayloadCodec.encodeJsonMap(value, typeName: typeName),
+      ),
+    );
+  }
+
+  /// Suspends the workflow until [topic] arrives with a versioned DTO payload.
+  Future awaitEventVersionedJson(
+    String topic,
+    T value, {
+    required int version,
+    DateTime? deadline,
+    String? typeName,
+  }) {
+    return awaitEvent(
+      topic,
+      deadline: deadline,
+      data: Map.from(
+        PayloadCodec.encodeVersionedJsonMap(
+          value,
+          version: version,
+          typeName: typeName,
+        ),
+      ),
+    );
+  }
+}
+
+/// Typed read helpers for workflow start parameters in script run methods.
+extension WorkflowScriptContextParams on WorkflowScriptContext {
+  /// Decodes the full workflow start-parameter payload through [codec].
+  T paramsAs({required PayloadCodec codec}) {
+    return codec.decode(params);
+  }
+
+  /// Decodes the full workflow start-parameter payload as a DTO.
+  T paramsJson({
+    required T Function(Map payload) decode,
+    String? typeName,
+  }) {
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(params);
+  }
+
+  /// Decodes the full workflow start-parameter payload as a version-aware
+  /// DTO.
+  T paramsVersionedJson({
+    required T Function(Map payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return PayloadCodec.versionedJson(
+      version: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion ?? defaultVersion,
+      typeName: typeName,
+    ).decode(params);
+  }
+
+  /// Returns the decoded workflow parameter for [key], or `null`.
+  T? param(String key, {PayloadCodec? codec}) {
+    return params.value(key, codec: codec);
+  }
+
+  /// Returns the decoded workflow parameter for [key], or [fallback].
+  T paramOr(String key, T fallback, {PayloadCodec? codec}) {
+    return params.valueOr(key, fallback, codec: codec);
+  }
+
+  /// Returns the decoded workflow parameter for [key], throwing when absent.
+  T requiredParam(String key, {PayloadCodec? codec}) {
+    return params.requiredValue(key, codec: codec);
+  }
+
+  /// Returns the decoded workflow parameter DTO for [key], or `null`.
+  T? paramJson(
+    String key, {
+    required T Function(Map payload) decode,
+    String? typeName,
+  }) {
+    return params.valueJson(
+      key,
+      decode: decode,
+      typeName: typeName,
+    );
+  }
+
+  /// Returns the decoded workflow parameter DTO for [key], or [fallback].
+  T paramJsonOr(
+    String key,
+    T fallback, {
+    required T Function(Map payload) decode,
+    String? typeName,
+  }) {
+    return params.valueJsonOr(
+      key,
+      fallback,
+      decode: decode,
+      typeName: typeName,
+    );
+  }
+
+  /// Returns the decoded workflow parameter DTO for [key], throwing when
+  /// absent.
+  T requiredParamJson(
+    String key, {
+    required T Function(Map payload) decode,
+    String? typeName,
+  }) {
+    return params.requiredValueJson(
+      key,
+      decode: decode,
+      typeName: typeName,
+    );
+  }
+
+  /// Returns the decoded version-aware workflow parameter DTO for [key], or
+  /// `null`.
+  T? paramVersionedJson(
+    String key, {
+    required T Function(Map payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return params.valueVersionedJson(
+      key,
+      defaultVersion: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    );
+  }
+
+  /// Returns the decoded version-aware workflow parameter DTO for [key], or
+  /// [fallback].
+  T paramVersionedJsonOr(
+    String key,
+    T fallback, {
+    required T Function(Map payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return params.valueVersionedJsonOr(
+      key,
+      fallback,
+      defaultVersion: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    );
+  }
+
+  /// Returns the decoded version-aware workflow parameter DTO for [key],
+  /// throwing when absent.
+  T requiredParamVersionedJson(
+    String key, {
+    required T Function(Map payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return params.requiredValueVersionedJson(
+      key,
+      defaultVersion: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    );
+  }
+
+  /// Returns the decoded workflow parameter DTO list for [key], or `null`.
+  List? paramListJson(
+    String key, {
+    required T Function(Map payload) decode,
+    String? typeName,
+  }) {
+    return params.valueListJson(
+      key,
+      decode: decode,
+      typeName: typeName,
+    );
+  }
+
+  /// Returns the decoded workflow parameter DTO list for [key], or [fallback].
+  List paramListJsonOr(
+    String key,
+    List fallback, {
+    required T Function(Map payload) decode,
+    String? typeName,
+  }) {
+    return params.valueListJsonOr(
+      key,
+      fallback,
+      decode: decode,
+      typeName: typeName,
+    );
+  }
+
+  /// Returns the decoded workflow parameter DTO list for [key], throwing when
+  /// absent.
+  List requiredParamListJson(
+    String key, {
+    required T Function(Map payload) decode,
+    String? typeName,
+  }) {
+    return params.requiredValueListJson(
+      key,
+      decode: decode,
+      typeName: typeName,
+    );
+  }
+
+  /// Returns the decoded version-aware workflow parameter DTO list for [key],
+  /// or `null`.
+  List? paramListVersionedJson(
+    String key, {
+    required T Function(Map payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return params.valueListVersionedJson(
+      key,
+      defaultVersion: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    );
+  }
+
+  /// Returns the decoded version-aware workflow parameter DTO list for [key],
+  /// or [fallback].
+  List paramListVersionedJsonOr(
+    String key,
+    List fallback, {
+    required T Function(Map payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return params.valueListVersionedJsonOr(
+      key,
+      fallback,
+      defaultVersion: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    );
+  }
+
+  /// Returns the decoded version-aware workflow parameter DTO list for [key],
+  /// throwing when absent.
+  List requiredParamListVersionedJson(
+    String key, {
+    required T Function(Map payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return params.requiredValueListVersionedJson(
+      key,
+      defaultVersion: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    );
+  }
+}
+
+/// Context provided to each script checkpoint invocation. Mirrors
+/// [FlowContext] but tailored for the facade helpers.
+abstract class WorkflowScriptStepContext implements WorkflowExecutionContext {
   /// Name of the workflow currently executing.
+  @override
   String get workflow;

   /// Identifier for the workflow run.
+  @override
   String get runId;

-  /// Name of the current step.
+  /// Name of the current checkpoint.
+  @override
   String get stepName;

-  /// Zero-based step index in the workflow definition.
+  /// Zero-based checkpoint index in the workflow definition.
+  @override
   int get stepIndex;

-  /// Iteration count for looped steps.
+  /// Iteration count for looped checkpoints.
+  @override
   int get iteration;

   /// Parameters provided when the workflow started.
+  @override
   Map get params;

-  /// Result of the previous step, if any.
+  /// Result of the previous checkpoint, if any.
+  @override
   Object? get previousResult;

-  /// Schedules a wake-up after [duration]. The workflow suspends once the step
-  /// handler returns.
+  /// Schedules a wake-up after [duration]. The workflow suspends once the
+  /// checkpoint handler returns.
   Future sleep(Duration duration, {Map? data});

   /// Suspends the workflow until the given [topic] is emitted.
@@ -61,12 +390,79 @@ abstract class WorkflowScriptStepContext {
   });

   /// Returns and clears the resume payload provided by the runtime when the
-  /// step resumes after a suspension.
+  /// checkpoint resumes after a suspension.
+  @override
   Object? takeResumeData();

-  /// Returns a stable idempotency key derived from workflow/run/step.
+  @override
+  Future suspendFor(
+    Duration duration, {
+    Map? data,
+  }) {
+    return sleep(duration, data: data);
+  }
+
+  @override
+  Future waitForTopic(
+    String topic, {
+    DateTime? deadline,
+    Map? data,
+  }) {
+    return awaitEvent(topic, deadline: deadline, data: data);
+  }
+
+  /// Returns a stable idempotency key derived from workflow/run/checkpoint.
+  @override
   String idempotencyKey([String? scope]);

   /// Optional enqueuer for scheduling tasks with workflow metadata.
+  @override
   TaskEnqueuer? get enqueuer;
+
+  /// Optional typed workflow caller for spawning child workflows.
+  @override
+  WorkflowCaller? get workflows;
+
+  @override
+  Future enqueue(
+    String name, {
+    Map args = const {},
+    Map headers = const {},
+    Map meta = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    TaskEnqueueOptions? enqueueOptions,
+  });
+
+  @override
+  Future enqueueCall(
+    TaskCall call, {
+    TaskEnqueueOptions? enqueueOptions,
+  });
+
+  /// Starts a typed child workflow using this checkpoint context.
+  @override
+  Future startWorkflowRef(
+    WorkflowRef definition,
+    TParams params, {
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+  });
+
+  /// Starts a prebuilt child workflow call using this checkpoint context.
+  @override
+  Future startWorkflowCall(
+    WorkflowStartCall call,
+  );
+
+  /// Waits for a typed child workflow using this checkpoint context.
+  @override
+  Future?>
+  waitForWorkflowRef(
+    String runId,
+    WorkflowRef definition, {
+    Duration pollInterval = const Duration(milliseconds: 100),
+    Duration? timeout,
+  });
 }
diff --git a/packages/stem/lib/src/workflow/core/workflow_step_entry.dart b/packages/stem/lib/src/workflow/core/workflow_step_entry.dart
index 250bb3cb..f1b4aafa 100644
--- a/packages/stem/lib/src/workflow/core/workflow_step_entry.dart
+++ b/packages/stem/lib/src/workflow/core/workflow_step_entry.dart
@@ -1,3 +1,5 @@
+import 'package:stem/src/core/payload_codec.dart';
+
 /// Persisted step checkpoint metadata for a workflow run.
 class WorkflowStepEntry {
   /// Creates a workflow step entry snapshot.
@@ -30,6 +32,44 @@ class WorkflowStepEntry {
   /// Optional timestamp when the checkpoint was recorded.
   final DateTime? completedAt;

+  /// Decodes the persisted checkpoint value with [codec], when present.
+  TValue? valueAs({required PayloadCodec codec}) {
+    final stored = value;
+    if (stored == null) return null;
+    return codec.decode(stored);
+  }
+
+  /// Decodes the persisted checkpoint value with a JSON decoder, when present.
+  TValue? valueJson({
+    required TValue Function(Map payload) decode,
+    String? typeName,
+  }) {
+    final stored = value;
+    if (stored == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
+  /// Decodes the persisted checkpoint value with a version-aware JSON decoder,
+  /// when present.
+  TValue? valueVersionedJson({
+    required int version,
+    required TValue Function(Map payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final stored = value;
+    if (stored == null) return null;
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
   /// Base step name without any auto-version suffix.
   String get baseName {
     final hashIndex = name.indexOf('#');
diff --git a/packages/stem/lib/src/workflow/core/workflow_store.dart b/packages/stem/lib/src/workflow/core/workflow_store.dart
index 029a7f2e..c064e54e 100644
--- a/packages/stem/lib/src/workflow/core/workflow_store.dart
+++ b/packages/stem/lib/src/workflow/core/workflow_store.dart
@@ -8,9 +8,9 @@ import 'package:stem/src/workflow/core/workflow_watcher.dart';
 abstract class WorkflowStore {
   /// Creates a new workflow run record and returns its run id.
   Future createRun({
-    String? runId,
     required String workflow,
     required Map params,
+    String? runId,
     String? parentRunId,
     Duration? ttl,
diff --git a/packages/stem/lib/src/workflow/core/workflow_watcher.dart b/packages/stem/lib/src/workflow/core/workflow_watcher.dart
index 9ec150b6..7b48a00d 100644
--- a/packages/stem/lib/src/workflow/core/workflow_watcher.dart
+++ b/packages/stem/lib/src/workflow/core/workflow_watcher.dart
@@ -1,4 +1,5 @@
 import 'package:stem/src/core/clock.dart';
+import 'package:stem/src/core/payload_codec.dart';

 /// Describes a workflow event watcher registered by the runtime.
 class WorkflowWatcher {
@@ -42,6 +43,37 @@
   /// Additional metadata supplied when the watcher was registered.
   final Map data;

+  /// Decodes the full watcher metadata with [codec].
+  TData dataAs({required PayloadCodec codec}) {
+    return codec.decode(data);
+  }
+
+  /// Decodes the full watcher metadata with a JSON decoder.
+  TData dataJson({
+    required TData Function(Map payload) decode,
+    String? typeName,
+  }) {
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(data);
+  }
+
+  /// Decodes the full watcher metadata with a version-aware JSON decoder.
+  TData dataVersionedJson({
+    required int version,
+    required TData Function(Map payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(data);
+  }
+
   /// Suspension type (`sleep`, `event`, etc.) when recorded by runtime.
   String? get suspensionType => data['type']?.toString();

@@ -54,6 +86,48 @@
   /// Effective payload snapshot captured at suspension time.
   Object? get payload => data['payload'];

+  /// Decodes the captured watcher payload with [codec], when present.
+  TPayload? payloadAs({required PayloadCodec codec}) {
+    final stored = payload;
+    if (stored == null) return null;
+    return codec.decode(stored);
+  }
+
+  /// Decodes the captured watcher payload with a JSON decoder, when present.
+  TPayload? payloadJson({
+    required TPayload Function(Map payload) decode,
+    String? typeName,
+  }) {
+    final stored = payload;
+    if (stored == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
+  /// Decodes the captured watcher payload with a version-aware JSON decoder,
+  /// when present.
+  TPayload? payloadVersionedJson({
+    required int version,
+    required TPayload Function(
+      Map payload,
+      int version,
+    )
+    decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final stored = payload;
+    if (stored == null) return null;
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
   /// Timestamp when suspension was recorded.
   DateTime? get suspendedAt => _dateFromJson(data['suspendedAt']);

@@ -109,6 +183,37 @@ class WorkflowWatcherResolution {
   /// Resume data merged from stored metadata and event payload.
   final Map resumeData;

+  /// Decodes the full resume data payload with [codec].
+  TData resumeDataAs({required PayloadCodec codec}) {
+    return codec.decode(resumeData);
+  }
+
+  /// Decodes the full resume data payload with a JSON decoder.
+  TData resumeDataJson({
+    required TData Function(Map payload) decode,
+    String? typeName,
+  }) {
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(resumeData);
+  }
+
+  /// Decodes the full resume data payload with a version-aware JSON decoder.
+  TData resumeDataVersionedJson({
+    required int version,
+    required TData Function(Map payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(resumeData);
+  }
+
   /// Suspension type (`sleep`, `event`, etc.) propagated to resume payload.
   String? get suspensionType => resumeData['type']?.toString();

@@ -121,6 +226,48 @@
   /// Resume payload delivered to workflow step.
   Object? get payload => resumeData['payload'];

+  /// Decodes the resume payload with [codec], when present.
+  TPayload? payloadAs({required PayloadCodec codec}) {
+    final stored = payload;
+    if (stored == null) return null;
+    return codec.decode(stored);
+  }
+
+  /// Decodes the resume payload with a JSON decoder, when present.
+  TPayload? payloadJson({
+    required TPayload Function(Map payload) decode,
+    String? typeName,
+  }) {
+    final stored = payload;
+    if (stored == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
+  /// Decodes the resume payload with a version-aware JSON decoder, when
+  /// present.
+  TPayload? payloadVersionedJson({
+    required int version,
+    required TPayload Function(
+      Map payload,
+      int version,
+    )
+    decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final stored = payload;
+    if (stored == null) return null;
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
   /// Timestamp when event delivery was recorded.
   DateTime? get deliveredAt => _dateFromJson(resumeData['deliveredAt']);
diff --git a/packages/stem/lib/src/workflow/runtime/workflow_introspection.dart b/packages/stem/lib/src/workflow/runtime/workflow_introspection.dart
index a5c61c66..1b3e6fb2 100644
--- a/packages/stem/lib/src/workflow/runtime/workflow_introspection.dart
+++ b/packages/stem/lib/src/workflow/runtime/workflow_introspection.dart
@@ -1,3 +1,5 @@
+import 'package:stem/src/core/payload_codec.dart';
+import 'package:stem/src/core/payload_map.dart';
 import 'package:stem/src/core/stem_event.dart';

 /// Enumerates workflow step event types emitted by the runtime.
@@ -57,12 +59,137 @@ class WorkflowStepEvent implements StemEvent {
   /// Optional result payload for completed steps.
   final Object? result;

+  /// Decodes the step result payload with [codec], when present.
+  TResult? resultAs({required PayloadCodec codec}) {
+    final stored = result;
+    if (stored == null) return null;
+    return codec.decode(stored);
+  }
+
+  /// Decodes the step result payload with a JSON decoder, when present.
+  TResult? resultJson({
+    required TResult Function(Map payload) decode,
+    String? typeName,
+  }) {
+    final stored = result;
+    if (stored == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
+  /// Decodes the step result payload with a version-aware JSON decoder, when
+  /// present.
+  TResult? resultVersionedJson({
+    required int version,
+    required TResult Function(Map payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final stored = result;
+    if (stored == null) return null;
+    return PayloadCodec.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
   /// Optional error message for failed steps.
   final String? error;

   /// Optional metadata associated with the event.
   final Map? metadata;

+  /// Returns the decoded metadata value for [key], or `null` when absent.
+  T? metadataValue(String key, {PayloadCodec? codec}) {
+    final payload = metadata;
+    if (payload == null) return null;
+    return payload.value(key, codec: codec);
+  }
+
+  /// Decodes the metadata value for [key] as a typed DTO with [codec].
+  T? metadataAs(String key, {required PayloadCodec codec}) {
+    final payload = metadata;
+    if (payload == null) return null;
+    return payload.value(key, codec: codec);
+  }
+
+  /// Decodes the metadata value for [key] as a typed DTO with a JSON decoder.
+  T? metadataJson(
+    String key, {
+    required T Function(Map payload) decode,
+    String? typeName,
+  }) {
+    final payload = metadata;
+    if (payload == null) return null;
+    return payload.valueJson(
+      key,
+      decode: decode,
+      typeName: typeName,
+    );
+  }
+
+  /// Decodes the metadata value for [key] as a typed DTO with a version-aware
+  /// JSON decoder.
+  T? metadataVersionedJson(
+    String key, {
+    required T Function(Map payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final payload = metadata;
+    if (payload == null) return null;
+    return payload.valueVersionedJson(
+      key,
+      defaultVersion: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    );
+  }
+
+  /// Decodes the full metadata payload as a typed DTO with [codec].
+  T? metadataPayloadAs({required PayloadCodec codec}) {
+    final payload = metadata;
+    if (payload == null) return null;
+    return codec.decode(payload);
+  }
+
+  /// Decodes the full metadata payload as a typed DTO with a JSON decoder.
+  T? metadataPayloadJson({
+    required T Function(Map payload) decode,
+    String? typeName,
+  }) {
+    final payload = metadata;
+    if (payload == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(payload);
+  }
+
+  /// Decodes the full metadata payload as a typed DTO with a version-aware
+  /// JSON decoder.
+  T? metadataPayloadVersionedJson({
+    required T Function(Map payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final payload = metadata;
+    if (payload == null) return null;
+    return PayloadCodec.versionedJson(
+      version: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion ?? defaultVersion,
+      typeName: typeName,
+    ).decode(payload);
+  }
+
   @override
   String get eventName => 'workflow.step.${type.name}';

@@ -107,6 +234,93 @@ class WorkflowRuntimeEvent implements StemEvent {
   /// Additional event metadata.
   final Map? metadata;

+  /// Returns the decoded metadata value for [key], or `null` when absent.
+  T? metadataValue(String key, {PayloadCodec? codec}) {
+    final payload = metadata;
+    if (payload == null) return null;
+    return payload.value(key, codec: codec);
+  }
+
+  /// Decodes the metadata value for [key] as a typed DTO with [codec].
+  T? metadataAs(String key, {required PayloadCodec codec}) {
+    final payload = metadata;
+    if (payload == null) return null;
+    return payload.value(key, codec: codec);
+  }
+
+  /// Decodes the metadata value for [key] as a typed DTO with a JSON decoder.
+  T? metadataJson(
+    String key, {
+    required T Function(Map payload) decode,
+    String? typeName,
+  }) {
+    final payload = metadata;
+    if (payload == null) return null;
+    return payload.valueJson(
+      key,
+      decode: decode,
+      typeName: typeName,
+    );
+  }
+
+  /// Decodes the metadata value for [key] as a typed DTO with a version-aware
+  /// JSON decoder.
+  T? metadataVersionedJson(
+    String key, {
+    required T Function(Map payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final payload = metadata;
+    if (payload == null) return null;
+    return payload.valueVersionedJson(
+      key,
+      defaultVersion: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    );
+  }
+
+  /// Decodes the full metadata payload as a typed DTO with [codec].
+  T? metadataPayloadAs({required PayloadCodec codec}) {
+    final payload = metadata;
+    if (payload == null) return null;
+    return codec.decode(payload);
+  }
+
+  /// Decodes the full metadata payload as a typed DTO with a JSON decoder.
+  T? metadataPayloadJson({
+    required T Function(Map payload) decode,
+    String? typeName,
+  }) {
+    final payload = metadata;
+    if (payload == null) return null;
+    return PayloadCodec.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(payload);
+  }
+
+  /// Decodes the full metadata payload as a typed DTO with a version-aware
+  /// JSON decoder.
+  T? metadataPayloadVersionedJson({
+    required T Function(Map payload, int version) decode,
+    int defaultVersion = 1,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final payload = metadata;
+    if (payload == null) return null;
+    return PayloadCodec.versionedJson(
+      version: defaultVersion,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion ?? defaultVersion,
+      typeName: typeName,
+    ).decode(payload);
+  }
+
   @override
   String get eventName => 'workflow.runtime.${type.name}';
diff --git a/packages/stem/lib/src/workflow/runtime/workflow_manifest.dart b/packages/stem/lib/src/workflow/runtime/workflow_manifest.dart
index 091b4b32..311456b5 100644
--- a/packages/stem/lib/src/workflow/runtime/workflow_manifest.dart
+++ b/packages/stem/lib/src/workflow/runtime/workflow_manifest.dart
@@ -3,15 +3,6 @@ import 'dart:convert';
 import 'package:stem/src/workflow/core/flow_step.dart';
 import 'package:stem/src/workflow/core/workflow_definition.dart';

-/// Distinguishes between declared flow steps and script checkpoints.
-enum WorkflowManifestStepRole {
-  /// Step that belongs to a declarative flow execution plan.
-  flowStep,
-
-  /// Checkpoint declared by a script workflow for tooling/introspection.
-  scriptCheckpoint,
-}
-
 /// Immutable manifest entry describing a workflow definition.
 class WorkflowManifestEntry {
   /// Creates a workflow manifest entry.
@@ -23,6 +14,7 @@ class WorkflowManifestEntry {
     this.description,
     this.metadata,
     this.steps = const [],
+    this.checkpoints = const [],
   });

   /// Stable workflow identifier.
@@ -43,21 +35,18 @@ class WorkflowManifestEntry {
   /// Optional workflow metadata.
   final Map? metadata;

-  /// Declared flow steps or script checkpoints.
- final List steps; + /// Declared flow steps. + final List steps; + + /// Declared script checkpoints. + final List checkpoints; /// Human-friendly label for the declared nodes on this workflow. - String get stepCollectionLabel => + String get declaredNodeLabel => kind == WorkflowDefinitionKind.script ? 'checkpoints' : 'steps'; - /// Alias for [steps] when treating script nodes as checkpoints. - List get checkpoints => steps; - /// Serializes this entry to a JSON-compatible map. Map toJson() { - final serializedSteps = steps - .map((step) => step.toJson()) - .toList(growable: false); return { 'id': id, 'name': name, @@ -65,21 +54,24 @@ class WorkflowManifestEntry { if (version != null) 'version': version, if (description != null) 'description': description, if (metadata != null) 'metadata': metadata, - 'stepCollectionLabel': stepCollectionLabel, - 'steps': serializedSteps, - if (kind == WorkflowDefinitionKind.script) 'checkpoints': serializedSteps, + 'declaredNodeLabel': declaredNodeLabel, + if (steps.isNotEmpty) + 'steps': steps.map((step) => step.toJson()).toList(growable: false), + if (checkpoints.isNotEmpty) + 'checkpoints': checkpoints + .map((checkpoint) => checkpoint.toJson()) + .toList(growable: false), }; } } -/// Immutable manifest entry describing a workflow step or script checkpoint. -class WorkflowManifestStep { - /// Creates a workflow step manifest entry. - const WorkflowManifestStep({ +/// Immutable manifest entry describing a declared flow step. +class WorkflowManifestFlowStep { + /// Creates a flow step manifest entry. + const WorkflowManifestFlowStep({ required this.id, required this.name, required this.position, - required this.role, required this.kind, required this.autoVersion, this.title, @@ -96,9 +88,6 @@ class WorkflowManifestStep { /// Zero-based position in the workflow. final int position; - /// Whether this node is part of a flow plan or a script checkpoint list. - final WorkflowManifestStepRole role; - /// Step kind. 
final WorkflowStepKind kind; @@ -120,7 +109,59 @@ class WorkflowManifestStep { 'id': id, 'name': name, 'position': position, - 'role': role.name, + 'kind': kind.name, + 'autoVersion': autoVersion, + if (title != null) 'title': title, + if (taskNames.isNotEmpty) 'taskNames': taskNames, + if (metadata != null) 'metadata': metadata, + }; + } +} + +/// Immutable manifest entry describing a declared script checkpoint. +class WorkflowManifestCheckpoint { + /// Creates a script checkpoint manifest entry. + const WorkflowManifestCheckpoint({ + required this.id, + required this.name, + required this.position, + required this.kind, + required this.autoVersion, + this.title, + this.taskNames = const [], + this.metadata, + }); + + /// Stable checkpoint identifier. + final String id; + + /// Checkpoint name. + final String name; + + /// Zero-based position in the script declaration list. + final int position; + + /// Checkpoint kind. + final WorkflowStepKind kind; + + /// Whether this checkpoint auto-versions replays. + final bool autoVersion; + + /// Optional title. + final String? title; + + /// Associated task names. + final List taskNames; + + /// Optional checkpoint metadata. + final Map? metadata; + + /// Serializes this entry to a JSON-compatible map. + Map toJson() { + return { + 'id': id, + 'name': name, + 'position': position, 'kind': kind.name, 'autoVersion': autoVersion, if (title != null) 'title': title, @@ -135,24 +176,40 @@ extension WorkflowManifestDefinition on WorkflowDefinition { /// Builds a manifest entry for this definition. WorkflowManifestEntry toManifestEntry() { final workflowId = stableId; - final stepEntries = []; - for (var index = 0; index < steps.length; index += 1) { - final step = steps[index]; - stepEntries.add( - WorkflowManifestStep( - id: _stableHexDigest('$workflowId:${step.name}:$index'), - name: step.name, - position: index, - role: isScript - ? 
WorkflowManifestStepRole.scriptCheckpoint - : WorkflowManifestStepRole.flowStep, - kind: step.kind, - autoVersion: step.autoVersion, - title: step.title, - taskNames: step.taskNames, - metadata: step.metadata, - ), - ); + final stepEntries = []; + final checkpointEntries = []; + if (isScript) { + for (var index = 0; index < checkpoints.length; index += 1) { + final checkpoint = checkpoints[index]; + checkpointEntries.add( + WorkflowManifestCheckpoint( + id: _stableHexDigest('$workflowId:${checkpoint.name}:$index'), + name: checkpoint.name, + position: index, + kind: checkpoint.kind, + autoVersion: checkpoint.autoVersion, + title: checkpoint.title, + taskNames: checkpoint.taskNames, + metadata: checkpoint.metadata, + ), + ); + } + } else { + for (var index = 0; index < steps.length; index += 1) { + final step = steps[index]; + stepEntries.add( + WorkflowManifestFlowStep( + id: _stableHexDigest('$workflowId:${step.name}:$index'), + name: step.name, + position: index, + kind: step.kind, + autoVersion: step.autoVersion, + title: step.title, + taskNames: step.taskNames, + metadata: step.metadata, + ), + ); + } } return WorkflowManifestEntry( id: workflowId, @@ -164,14 +221,17 @@ extension WorkflowManifestDefinition on WorkflowDefinition { description: description, metadata: metadata, steps: stepEntries, + checkpoints: checkpointEntries, ); } } String _stableHexDigest(String input) { final bytes = utf8.encode(input); + // FNV-1a uses this exact 64-bit offset basis; keep the literal stable. 
+ // ignore: avoid_js_rounded_ints var hash = 0xcbf29ce484222325; - const prime = 0x00000100000001B3; + const prime = 0x100000001b3; for (final value in bytes) { hash ^= value; hash = (hash * prime) & 0xFFFFFFFFFFFFFFFF; diff --git a/packages/stem/lib/src/workflow/runtime/workflow_runtime.dart b/packages/stem/lib/src/workflow/runtime/workflow_runtime.dart index 18aa338b..94f318cb 100644 --- a/packages/stem/lib/src/workflow/runtime/workflow_runtime.dart +++ b/packages/stem/lib/src/workflow/runtime/workflow_runtime.dart @@ -43,7 +43,10 @@ import 'package:stem/src/workflow/core/run_state.dart'; import 'package:stem/src/workflow/core/workflow_cancellation_policy.dart'; import 'package:stem/src/workflow/core/workflow_clock.dart'; import 'package:stem/src/workflow/core/workflow_definition.dart'; +import 'package:stem/src/workflow/core/workflow_event_ref.dart'; import 'package:stem/src/workflow/core/workflow_ref.dart'; +import 'package:stem/src/workflow/core/workflow_result.dart'; +import 'package:stem/src/workflow/core/workflow_resume.dart'; import 'package:stem/src/workflow/core/workflow_runtime_metadata.dart'; import 'package:stem/src/workflow/core/workflow_script_context.dart'; import 'package:stem/src/workflow/core/workflow_status.dart'; @@ -66,7 +69,7 @@ const int _leaseConflictMaxRetries = 1000000; /// The runtime is durable: each step is re-executed from the top after a /// suspension or worker crash. Handlers must therefore be idempotent and rely /// on persisted step outputs or resume payloads to detect prior progress. -class WorkflowRuntime { +class WorkflowRuntime implements WorkflowCaller, WorkflowEventEmitter { /// Creates a workflow runtime backed by a [Stem] instance and /// [WorkflowStore]. 
WorkflowRuntime({ @@ -175,14 +178,15 @@ class WorkflowRuntime { continuationQueue: continuationQueue, executionQueue: executionQueue, serializationFormat: _stem.payloadEncoders.defaultArgsEncoder.id, - serializationVersion: '1', frameFormat: 'stem-envelope', - frameVersion: '1', encryptionScope: _stem.signer != null ? 'signed-envelope' : 'none', encryptionEnabled: _stem.signer != null, streamId: '${name}_$requestedRunId', ); - final persistedParams = runtimeMetadata.attachToParams(params); + final persistedParams = runtimeMetadata.attachToParams( + params, + parentRunId: parentRunId, + ); final runId = await _store.createRun( runId: requestedRunId, workflow: name, @@ -208,7 +212,78 @@ class WorkflowRuntime { return runId; } + /// Persists a new workflow run from a DTO that already exposes `toJson()`. + Future startWorkflowJson( + String name, + T paramsJson, { + String? parentRunId, + Duration? ttl, + WorkflowCancellationPolicy? cancellationPolicy, + String? typeName, + }) { + return startWorkflow( + name, + params: Map.from( + PayloadCodec.encodeJsonMap( + paramsJson, + typeName: typeName ?? '$T', + ), + ), + parentRunId: parentRunId, + ttl: ttl, + cancellationPolicy: cancellationPolicy, + ); + } + + /// Persists a new workflow run from a typed value plus optional [codec]. + /// + /// When [codec] is omitted, [value] must already be a string-keyed durable + /// map payload. + Future startWorkflowValue( + String name, + T value, { + PayloadCodec? codec, + String? parentRunId, + Duration? ttl, + WorkflowCancellationPolicy? cancellationPolicy, + }) { + return startWorkflow( + name, + params: _encodeWorkflowStartValue(name, value, codec: codec), + parentRunId: parentRunId, + ttl: ttl, + cancellationPolicy: cancellationPolicy, + ); + } + + /// Persists a new workflow run from a DTO and stores a schema [version] + /// beside the JSON payload. + Future startWorkflowVersionedJson( + String name, + T paramsJson, { + required int version, + String? parentRunId, + Duration? 
ttl, + WorkflowCancellationPolicy? cancellationPolicy, + String? typeName, + }) { + return startWorkflow( + name, + params: Map.from( + PayloadCodec.encodeVersionedJsonMap( + paramsJson, + version: version, + typeName: typeName ?? '$T', + ), + ), + parentRunId: parentRunId, + ttl: ttl, + cancellationPolicy: cancellationPolicy, + ); + } + /// Starts a workflow from a typed [WorkflowRef]. + @override Future startWorkflowRef( WorkflowRef definition, TParams params, { @@ -226,6 +301,7 @@ class WorkflowRuntime { } /// Starts a workflow from a prebuilt [WorkflowStartCall]. + @override Future startWorkflowCall( WorkflowStartCall call, ) { @@ -238,6 +314,67 @@ class WorkflowRuntime { ); } + /// Waits for [runId] to reach a terminal state. + Future?> waitForCompletion( + String runId, { + Duration pollInterval = const Duration(milliseconds: 100), + Duration? timeout, + T Function(Object? payload)? decode, + T Function(Map payload)? decodeJson, + T Function(Map payload, int version)? decodeVersionedJson, + }) async { + assert( + [decode, decodeJson, decodeVersionedJson] + .whereType() + .length <= + 1, + 'Specify at most one of decode, decodeJson, or decodeVersionedJson.', + ); + final startedAt = _clock.now(); + while (true) { + final state = await _store.get(runId); + if (state == null) { + return null; + } + if (state.isTerminal) { + return _buildResult( + state, + decode, + decodeJson: decodeJson, + decodeVersionedJson: decodeVersionedJson, + timedOut: false, + ); + } + if (timeout != null && _clock.now().difference(startedAt) >= timeout) { + return _buildResult( + state, + decode, + decodeJson: decodeJson, + decodeVersionedJson: decodeVersionedJson, + timedOut: true, + ); + } + await Future.delayed(pollInterval); + } + } + + /// Waits for [runId] using the decoding rules from a [WorkflowRef]. + @override + Future?> + waitForWorkflowRef( + String runId, + WorkflowRef definition, { + Duration pollInterval = const Duration(milliseconds: 100), + Duration? 
timeout, + }) { + return waitForCompletion( + runId, + pollInterval: pollInterval, + timeout: timeout, + decode: definition.decode, + ); + } + /// Emits an external event and resumes all runs waiting on [topic]. /// /// Each resumed run receives the event as `resumeData` for the awaiting step @@ -276,12 +413,98 @@ class WorkflowRuntime { } } + /// Emits a DTO-backed external event without requiring a manual payload map. + Future emitJson( + String topic, + T payloadJson, { + String? typeName, + }) { + return emit( + topic, + Map.from( + PayloadCodec.encodeJsonMap( + payloadJson, + typeName: typeName ?? '$T', + ), + ), + ); + } + + /// Emits a DTO-backed external event and stores a schema [version] beside + /// the JSON payload. + Future emitVersionedJson( + String topic, + T payloadJson, { + required int version, + String? typeName, + }) { + return emit( + topic, + Map.from( + PayloadCodec.encodeVersionedJsonMap( + payloadJson, + version: version, + typeName: typeName ?? '$T', + ), + ), + ); + } + + WorkflowResult _buildResult( + RunState state, + T Function(Object? payload)? decode, { + required bool timedOut, + T Function(Map payload)? decodeJson, + T Function(Map payload, int version)? decodeVersionedJson, + }) { + final value = state.status == WorkflowStatus.completed + ? _decodeResult( + state.result, + decode, + decodeJson, + decodeVersionedJson, + ) + : null; + return WorkflowResult( + runId: state.id, + status: state.status, + state: state, + value: value, + rawResult: state.result, + timedOut: timedOut && !state.isTerminal, + ); + } + + T? _decodeResult( + Object? payload, + T Function(Object? payload)? decode, + T Function(Map payload)? decodeJson, + T Function(Map payload, int version)? 
decodeVersionedJson, + ) { + if (decode != null) { + return decode(payload); + } + if (decodeVersionedJson != null) { + return decodeVersionedJson( + PayloadCodec.decodeJsonMap(payload, typeName: 'workflow result'), + PayloadCodec.readPayloadVersion(payload), + ); + } + if (decodeJson != null) { + return decodeJson( + PayloadCodec.decodeJsonMap(payload, typeName: 'workflow result'), + ); + } + return payload as T?; + } + /// Emits a typed external event that serializes to the existing map-based /// workflow event transport. /// /// When [codec] is provided, [value] is encoded before being emitted. The /// encoded value must be a `Map` because workflow watcher /// resolution and event transport are currently map-shaped. + @override Future emitValue( String topic, T value, { @@ -291,6 +514,12 @@ class WorkflowRuntime { return emit(topic, _coerceEventPayload(topic, encoded)); } + /// Emits a typed external event using a [WorkflowEventRef]. + @override + Future emitEvent(WorkflowEventRef event, T value) { + return emitValue(event.topic, value, codec: event.codec); + } + /// Starts periodic polling that resumes runs whose wake-up time has elapsed. Future start() async { if (_started) return; @@ -351,14 +580,14 @@ class WorkflowRuntime { return WorkflowRunView.fromState(state); } - /// Returns persisted step views for [runId]. - Future> viewSteps(String runId) async { + /// Returns persisted checkpoint views for [runId]. + Future> viewCheckpoints(String runId) async { final state = await _store.get(runId); if (state == null) return const []; - final steps = await _store.listSteps(runId); - return steps + final checkpoints = await _store.listSteps(runId); + return checkpoints .map( - (entry) => WorkflowStepView.fromEntry( + (entry) => WorkflowCheckpointView.fromEntry( runId: runId, workflow: state.workflow, entry: entry, @@ -367,12 +596,12 @@ class WorkflowRuntime { .toList(growable: false); } - /// Returns combined run+step drilldown view for [runId]. 
+ /// Returns combined run+checkpoint drilldown view for [runId]. Future viewRunDetail(String runId) async { final run = await viewRun(runId); if (run == null) return null; - final steps = await viewSteps(runId); - return WorkflowRunDetailView(run: run, steps: steps); + final checkpoints = await viewCheckpoints(runId); + return WorkflowRunDetailView(run: run, checkpoints: checkpoints); } /// Returns uniform run views filtered by workflow/status. @@ -579,14 +808,18 @@ class WorkflowRuntime { baseMeta: stepMeta, targetExecutionQueue: runState.executionQueue, ), + workflows: _ChildWorkflowCaller(runtime: this, parentRunId: runId), ); resumeData = null; dynamic result; + var suspendedBySignal = false; try { result = await TaskEnqueueScope.run( stepMeta, () async => await step.handler(context), ); + } on WorkflowSuspensionSignal { + suspendedBySignal = true; } on _WorkflowLeaseLost { return; } catch (error, stack) { @@ -729,7 +962,7 @@ class WorkflowRuntime { step: step.name, extra: { 'workflowSuspensionType': 'event', - 'topic': control.topic!, + 'topic': control.topic, 'workflowIteration': iteration, if (deadline != null) 'deadline': deadline.toIso8601String(), 'runtimeId': _runtimeId, @@ -739,6 +972,12 @@ class WorkflowRuntime { } return; } + if (suspendedBySignal) { + throw StateError( + 'Flow step "${step.name}" threw WorkflowSuspensionSignal without ' + 'scheduling a suspension control.', + ); + } final storedResult = step.encodeValue(result); await _store.saveStep(runId, checkpointName, storedResult); @@ -814,14 +1053,15 @@ class WorkflowRuntime { ), ); } - final steps = await _store.listSteps(runId); + final checkpoints = await _store.listSteps(runId); final completedIterations = await _loadCompletedIterations(runId); Object? previousResult; - if (steps.isNotEmpty) { - previousResult = definition - .stepByName(steps.last.baseName) - ?.decodeValue(steps.last.value) ?? 
- steps.last.value; + if (checkpoints.isNotEmpty) { + previousResult = + definition + .checkpointByName(checkpoints.last.baseName) + ?.decodeValue(checkpoints.last.value) ?? + checkpoints.last.value; } final execution = _WorkflowScriptExecution( runtime: this, @@ -830,7 +1070,7 @@ class WorkflowRuntime { completedIterations: completedIterations, definition: definition, previousResult: previousResult, - initialStepIndex: steps.length, + initialStepIndex: checkpoints.length, suspensionData: runState.suspensionData, policy: runState.cancellationPolicy, ); @@ -946,7 +1186,7 @@ class WorkflowRuntime { final entries = await _store.listSteps(runId); final counts = {}; for (final entry in entries) { - final base = _baseStepName(entry.name); + final base = _basePersistedNodeName(entry.name); final suffix = _parseIterationSuffix(entry.name); final nextIndex = suffix != null ? suffix + 1 : 1; final current = counts[base] ?? 0; @@ -1010,8 +1250,8 @@ class WorkflowRuntime { return int.tryParse(suffix); } - /// Removes an iteration suffix from a versioned step name. - String _baseStepName(String name) { + /// Removes an iteration suffix from a persisted step/checkpoint name. + String _basePersistedNodeName(String name) { final hashIndex = name.indexOf('#'); if (hashIndex == -1) return name; return name.substring(0, hashIndex); @@ -1055,9 +1295,9 @@ class WorkflowRuntime { /// Enqueues a workflow run execution task. Future _enqueueRun( String runId, { - String? workflow, required bool continuation, required WorkflowContinuationReason reason, + String? workflow, WorkflowRunRuntimeMetadata? runtimeMetadata, }) async { final metadata = @@ -1287,6 +1527,35 @@ class WorkflowRuntime { } } +Map _encodeWorkflowStartValue( + String name, + T value, { + PayloadCodec? codec, +}) { + final payload = codec == null ? 
value : codec.encode(value); + if (payload is Map) { + return Map.from(payload); + } + if (payload is Map) { + final normalized = {}; + for (final entry in payload.entries) { + final key = entry.key; + if (key is! String) { + throw StateError( + 'Workflow start payload for $name must use string keys, got ' + '${key.runtimeType}.', + ); + } + normalized[key] = entry.value; + } + return normalized; + } + throw StateError( + 'Workflow start payload for $name must encode to Map, got ' + '${payload.runtimeType}.', + ); +} + /// Task handler that dispatches workflow run execution for a run id. class _WorkflowRunTaskHandler implements TaskHandler { _WorkflowRunTaskHandler({required this.runtime}); @@ -1354,10 +1623,10 @@ class _WorkflowScriptExecution implements WorkflowScriptContext { int? _suspensionIteration; Object? _resumePayload; - /// Whether a script step suspended the run. + /// Whether a script checkpoint suspended the run. bool get wasSuspended => _wasSuspended; - /// Last executed step name, if any. + /// Last executed checkpoint name, if any. String? get lastStepName => _lastStepName; @override @@ -1375,7 +1644,7 @@ class _WorkflowScriptExecution implements WorkflowScriptContext { FutureOr Function(WorkflowScriptStepContext context) handler, { bool autoVersion = false, }) async { - /// Executes a script step with checkpoint replay and suspension handling. + /// Executes a script checkpoint with replay and suspension handling. _lastStepName = name; final policy = this.policy; if (policy != null && policy.maxRunDuration != null) { @@ -1422,13 +1691,13 @@ class _WorkflowScriptExecution implements WorkflowScriptContext { ), ); - final declaredStep = definition.stepByName(name); + final declaredCheckpoint = definition.checkpointByName(name); final cached = await runtime._store.readStep( runId, checkpointName, ); if (cached != null) { - final decodedCached = declaredStep?.decodeValue(cached) ?? 
cached; + final decodedCached = declaredCheckpoint?.decodeValue(cached) ?? cached; _previousResult = decodedCached; await runtime._recordStepEvent( WorkflowStepEventType.completed, @@ -1466,13 +1735,17 @@ class _WorkflowScriptExecution implements WorkflowScriptContext { baseMeta: stepMeta, targetExecutionQueue: runState.executionQueue, ), + workflows: _ChildWorkflowCaller(runtime: runtime, parentRunId: runId), ); - T result; + late final T result; + var suspendedBySignal = false; try { result = await TaskEnqueueScope.run( stepMeta, () async => await handler(stepContext), ); + } on WorkflowSuspensionSignal { + suspendedBySignal = true; } catch (error, stack) { await runtime._recordStepEvent( WorkflowStepEventType.failed, @@ -1491,8 +1764,14 @@ class _WorkflowScriptExecution implements WorkflowScriptContext { throw const _WorkflowScriptSuspended(); } } + if (suspendedBySignal) { + throw StateError( + 'Script checkpoint "$name" threw WorkflowSuspensionSignal without ' + 'scheduling a suspension control.', + ); + } - final storedResult = declaredStep?.encodeValue(result) ?? result; + final storedResult = declaredCheckpoint?.encodeValue(result) ?? result; await runtime._store.saveStep(runId, checkpointName, storedResult); await runtime._extendLeases(taskContext, runId); await runtime._recordStepEvent( @@ -1512,7 +1791,7 @@ class _WorkflowScriptExecution implements WorkflowScriptContext { return result; } - /// Computes the next iteration for an auto-versioned step. + /// Computes the next iteration for an auto-versioned checkpoint. int _nextIteration(String name) { final completed = _completedIterations[name] ?? 0; if (_suspensionStep == name && _suspensionIteration != null) { @@ -1521,7 +1800,7 @@ class _WorkflowScriptExecution implements WorkflowScriptContext { return completed; } - /// Returns resume payload if it matches the current step/iteration. + /// Returns resume payload if it matches the current checkpoint/iteration. Object? 
_takeResumePayload(String stepName, int? iteration) { final matchesStep = _suspensionStep == stepName; if (!matchesStep) return null; @@ -1643,7 +1922,7 @@ class _WorkflowScriptExecution implements WorkflowScriptContext { step: stepName, extra: { 'workflowSuspensionType': 'event', - 'topic': control.topic!, + 'topic': control.topic, 'workflowIteration': iteration, if (deadline != null) 'deadline': deadline.toIso8601String(), 'runtimeId': runtime._runtimeId, @@ -1654,10 +1933,10 @@ class _WorkflowScriptExecution implements WorkflowScriptContext { _wasSuspended = true; } - /// Previously completed step result, if any. + /// Previously completed checkpoint result, if any. Object? get previousResult => _previousResult; - /// Builds a stable idempotency key for a step/iteration scope. + /// Builds a stable idempotency key for a checkpoint/iteration scope. String idempotencyKey(String stepName, int iteration, [String? scope]) { final defaultScope = iteration > 0 ? '$stepName#$iteration' : stepName; final effectiveScope = (scope == null || scope.isEmpty) @@ -1667,7 +1946,7 @@ class _WorkflowScriptExecution implements WorkflowScriptContext { } } -/// Workflow script step context used by script-defined workflows. +/// Workflow script checkpoint context used by script-defined workflows. class _WorkflowScriptStepContextImpl implements WorkflowScriptStepContext { _WorkflowScriptStepContextImpl({ required this.execution, @@ -1676,6 +1955,7 @@ class _WorkflowScriptStepContextImpl implements WorkflowScriptStepContext { required int iteration, Object? resumeData, this.enqueuer, + this.workflows, }) : _stepName = stepName, _stepIndex = stepIndex, _iteration = iteration, @@ -1688,6 +1968,28 @@ class _WorkflowScriptStepContextImpl implements WorkflowScriptStepContext { _ScriptControl? _control; Object? _resumeData; + @override + Future enqueueValue( + String name, + T value, { + PayloadCodec? codec, + Map headers = const {}, + TaskOptions options = const TaskOptions(), + DateTime? 
notBefore, + Map meta = const {}, + TaskEnqueueOptions? enqueueOptions, + }) { + return enqueue( + name, + args: _encodeWorkflowStepValue(name, value, codec: codec), + headers: headers, + options: options, + notBefore: notBefore, + meta: meta, + enqueueOptions: enqueueOptions, + ); + } + /// Consumes any control signal emitted by the step. _ScriptControl? takeControl() { final value = _control; @@ -1709,6 +2011,23 @@ class _WorkflowScriptStepContextImpl implements WorkflowScriptStepContext { ); } + @override + Future suspendFor( + Duration duration, { + Map? data, + }) { + return sleep(duration, data: data); + } + + @override + Future waitForTopic( + String topic, { + DateTime? deadline, + Map? data, + }) { + return awaitEvent(topic, deadline: deadline, data: data); + } + @override /// Suspends the run until the sleep duration elapses. Future sleep(Duration duration, {Map? data}) async { @@ -1763,8 +2082,106 @@ class _WorkflowScriptStepContextImpl implements WorkflowScriptStepContext { @override final TaskEnqueuer? enqueuer; + @override + final WorkflowCaller? workflows; + @override String get workflow => execution.workflow; + + @override + Future enqueue( + String name, { + Map args = const {}, + Map headers = const {}, + Map meta = const {}, + TaskOptions options = const TaskOptions(), + DateTime? notBefore, + TaskEnqueueOptions? enqueueOptions, + }) async { + final delegate = enqueuer; + if (delegate == null) { + throw StateError('WorkflowScriptStepContext has no enqueuer configured'); + } + return delegate.enqueue( + name, + args: args, + headers: headers, + meta: meta, + options: options, + notBefore: notBefore, + enqueueOptions: enqueueOptions, + ); + } + + @override + Future enqueueCall( + TaskCall call, { + TaskEnqueueOptions? 
enqueueOptions, + }) async { + final delegate = enqueuer; + if (delegate == null) { + throw StateError('WorkflowScriptStepContext has no enqueuer configured'); + } + return delegate.enqueueCall(call, enqueueOptions: enqueueOptions); + } + + @override + Future startWorkflowRef( + WorkflowRef definition, + TParams params, { + String? parentRunId, + Duration? ttl, + WorkflowCancellationPolicy? cancellationPolicy, + }) async { + final caller = workflows; + if (caller == null) { + throw StateError( + 'WorkflowScriptStepContext has no workflow caller configured', + ); + } + return caller.startWorkflowRef( + definition, + params, + parentRunId: parentRunId, + ttl: ttl, + cancellationPolicy: cancellationPolicy, + ); + } + + @override + Future startWorkflowCall( + WorkflowStartCall call, + ) async { + final caller = workflows; + if (caller == null) { + throw StateError( + 'WorkflowScriptStepContext has no workflow caller configured', + ); + } + return caller.startWorkflowCall(call); + } + + @override + Future?> + waitForWorkflowRef( + String runId, + WorkflowRef definition, { + Duration pollInterval = const Duration(milliseconds: 100), + Duration? timeout, + }) async { + final caller = workflows; + if (caller == null) { + throw StateError( + 'WorkflowScriptStepContext has no workflow caller configured', + ); + } + return caller.waitForWorkflowRef( + runId, + definition, + pollInterval: pollInterval, + timeout: timeout, + ); + } } /// Enqueuer that prefixes workflow step metadata onto spawned tasks. @@ -1785,6 +2202,7 @@ class _WorkflowStepEnqueuer implements TaskEnqueuer { Map args = const {}, Map headers = const {}, TaskOptions options = const TaskOptions(), + DateTime? notBefore, Map meta = const {}, TaskEnqueueOptions? 
enqueueOptions, }) { @@ -1799,6 +2217,7 @@ class _WorkflowStepEnqueuer implements TaskEnqueuer { args: args, headers: headers, options: resolvedOptions, + notBefore: notBefore, meta: mergedMeta, enqueueOptions: enqueueOptions, ); @@ -1810,7 +2229,7 @@ class _WorkflowStepEnqueuer implements TaskEnqueuer { TaskEnqueueOptions? enqueueOptions, }) { final mergedMeta = Map.from(baseMeta)..addAll(call.meta); - TaskOptions? resolvedOptions = call.options; + var resolvedOptions = call.options; if (resolvedOptions == null) { final inherited = call.definition.defaultOptions; if (inherited.queue == 'default' && executionQueue != 'default') { @@ -1820,15 +2239,162 @@ class _WorkflowStepEnqueuer implements TaskEnqueuer { executionQueue != 'default') { resolvedOptions = resolvedOptions.copyWith(queue: executionQueue); } - final mergedCall = call.copyWith( + final mergedCall = call.definition.buildCall( + call.args, + headers: call.headers, options: resolvedOptions, + notBefore: call.notBefore, meta: Map.unmodifiable(mergedMeta), + enqueueOptions: call.enqueueOptions, ); return delegate.enqueueCall( mergedCall, enqueueOptions: enqueueOptions, ); } + + @override + Future enqueueValue( + String name, + T value, { + PayloadCodec? codec, + Map headers = const {}, + TaskOptions options = const TaskOptions(), + DateTime? notBefore, + Map meta = const {}, + TaskEnqueueOptions? enqueueOptions, + }) { + return enqueue( + name, + args: _encodeWorkflowStepValue(name, value, codec: codec), + headers: headers, + options: options, + notBefore: notBefore, + meta: meta, + enqueueOptions: enqueueOptions, + ); + } +} + +Map _encodeWorkflowStepValue( + String name, + T value, { + PayloadCodec? codec, +}) { + final payload = codec == null ? value : codec.encode(value); + if (payload is Map) { + return Map.from(payload); + } + if (payload is Map) { + final normalized = {}; + for (final entry in payload.entries) { + final key = entry.key; + if (key is! 
String) { + throw StateError( + 'Task payload for $name must use string keys, got ' + '${key.runtimeType}.', + ); + } + normalized[key] = entry.value; + } + return normalized; + } + throw StateError( + 'Task payload for $name must encode to Map, got ' + '${payload.runtimeType}.', + ); +} + +class _ChildWorkflowCaller implements WorkflowCaller { + const _ChildWorkflowCaller({ + required this.runtime, + required this.parentRunId, + }); + + final WorkflowRuntime runtime; + final String parentRunId; + + @override + Future startWorkflowRef( + WorkflowRef definition, + TParams params, { + String? parentRunId, + Duration? ttl, + WorkflowCancellationPolicy? cancellationPolicy, + }) { + return runtime.startWorkflowRef( + definition, + params, + parentRunId: this.parentRunId, + ttl: ttl, + cancellationPolicy: cancellationPolicy, + ); + } + + @override + Future startWorkflowCall( + WorkflowStartCall call, + ) { + return runtime.startWorkflowCall( + call.definition.buildStart( + params: call.params, + parentRunId: parentRunId, + ttl: call.ttl, + cancellationPolicy: call.cancellationPolicy, + ), + ); + } + + @override + Future?> + waitForWorkflowRef( + String runId, + WorkflowRef definition, { + Duration pollInterval = const Duration(milliseconds: 100), + Duration? timeout, + }) { + return _waitForChildWorkflow( + runId, + definition, + pollInterval: pollInterval, + timeout: timeout, + ); + } + + Future?> + _waitForChildWorkflow( + String runId, + WorkflowRef definition, { + required Duration pollInterval, + required Duration? 
timeout, + }) async { + final startedAt = runtime.clock.now(); + while (true) { + final state = await runtime._store.get(runId); + if (state == null) { + return null; + } + if (state.isTerminal) { + return runtime._buildResult( + state, + definition.decode, + timedOut: false, + ); + } + if (timeout != null && + runtime.clock.now().difference(startedAt) >= timeout) { + return runtime._buildResult( + state, + definition.decode, + timedOut: true, + ); + } + await runtime.executeRun(runId); + if (pollInterval > Duration.zero) { + await Future.delayed(pollInterval); + } + } + } } Map _coerceEventPayload(String topic, Object? payload) { diff --git a/packages/stem/lib/src/workflow/runtime/workflow_views.dart b/packages/stem/lib/src/workflow/runtime/workflow_views.dart index 1a4f1075..5b46520a 100644 --- a/packages/stem/lib/src/workflow/runtime/workflow_views.dart +++ b/packages/stem/lib/src/workflow/runtime/workflow_views.dart @@ -1,3 +1,4 @@ +import 'package:stem/src/core/payload_codec.dart'; import 'package:stem/src/workflow/core/run_state.dart'; import 'package:stem/src/workflow/core/workflow_status.dart'; import 'package:stem/src/workflow/core/workflow_step_entry.dart'; @@ -11,11 +12,11 @@ class WorkflowRunView { required this.status, required this.cursor, required this.createdAt, + required this.params, + required this.runtime, this.updatedAt, this.result, this.lastError, - required this.params, - required this.runtime, this.suspensionData, }); @@ -57,18 +58,206 @@ class WorkflowRunView { /// Final result payload when completed. final Object? result; + /// Decodes the final result payload with [codec]. + TResult? resultAs({required PayloadCodec codec}) { + final stored = result; + if (stored == null) return null; + return codec.decode(stored); + } + + /// Decodes the final result payload with a JSON decoder. + TResult? resultJson({ + required TResult Function(Map payload) decode, + String? 
typeName, + }) { + final stored = result; + if (stored == null) return null; + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(stored); + } + + /// Decodes the final result payload with a version-aware JSON decoder. + TResult? resultVersionedJson({ + required int version, + required TResult Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + final stored = result; + if (stored == null) return null; + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(stored); + } + /// Last error payload, if present. final Map? lastError; + /// Decodes the last error payload with [codec], when present. + TError? lastErrorAs({required PayloadCodec codec}) { + final payload = lastError; + if (payload == null) return null; + return codec.decode(payload); + } + + /// Decodes the last error payload with a JSON decoder, when present. + TError? lastErrorJson({ + required TError Function(Map payload) decode, + String? typeName, + }) { + final payload = lastError; + if (payload == null) return null; + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(payload); + } + + /// Decodes the last error payload with a version-aware JSON decoder, when + /// present. + TError? lastErrorVersionedJson({ + required int version, + required TError Function(Map payload, int version) decode, + int? defaultDecodeVersion, + String? typeName, + }) { + final payload = lastError; + if (payload == null) return null; + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(payload); + } + /// Public user-supplied workflow params. final Map params; + /// Decodes the workflow params payload with [codec]. 
+  TParams paramsAs<TParams>({required PayloadCodec<TParams> codec}) {
+    return codec.decode(params);
+  }
+
+  /// Decodes the workflow params payload with a JSON decoder.
+  TParams paramsJson<TParams>({
+    required TParams Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    return PayloadCodec<TParams>.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(params);
+  }
+
+  /// Decodes the workflow params payload with a version-aware JSON decoder.
+  TParams paramsVersionedJson<TParams>({
+    required int version,
+    required TParams Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return PayloadCodec<TParams>.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(params);
+  }
+
   /// Run-scoped runtime metadata (queues/channel/serialization framing).
   final Map<String, Object?> runtime;
 
+  /// Decodes the runtime metadata payload with [codec].
+  TRuntime runtimeAs<TRuntime>({required PayloadCodec<TRuntime> codec}) {
+    return codec.decode(runtime);
+  }
+
+  /// Decodes the runtime metadata payload with a JSON decoder.
+  TRuntime runtimeJson<TRuntime>({
+    required TRuntime Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    return PayloadCodec<TRuntime>.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(runtime);
+  }
+
+  /// Decodes the runtime metadata payload with a version-aware JSON decoder.
+  TRuntime runtimeVersionedJson<TRuntime>({
+    required int version,
+    required TRuntime Function(
+      Map<String, Object?> payload,
+      int version,
+    )
+    decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    return PayloadCodec<TRuntime>.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(runtime);
+  }
+
   /// Suspension payload, if run is suspended.
   final Map<String, Object?>? suspensionData;
 
+  /// Resume payload delivered to the suspended run, when present.
+  Object? get suspensionPayload => suspensionData?['payload'];
+
+  /// Decodes the suspension payload with [codec], when present.
+  TPayload?
suspensionPayloadAs({ + required PayloadCodec codec, + }) { + final stored = suspensionPayload; + if (stored == null) return null; + return codec.decode(stored); + } + + /// Decodes the suspension payload with a JSON decoder, when present. + TPayload? suspensionPayloadJson({ + required TPayload Function(Map payload) decode, + String? typeName, + }) { + final stored = suspensionPayload; + if (stored == null) return null; + return PayloadCodec.json( + decode: decode, + typeName: typeName, + ).decode(stored); + } + + /// Decodes the suspension payload with a version-aware JSON decoder, when + /// present. + TPayload? suspensionPayloadVersionedJson({ + required int version, + required TPayload Function( + Map payload, + int version, + ) + decode, + int? defaultDecodeVersion, + String? typeName, + }) { + final stored = suspensionPayload; + if (stored == null) return null; + return PayloadCodec.versionedJson( + version: version, + decode: decode, + defaultDecodeVersion: defaultDecodeVersion, + typeName: typeName, + ).decode(stored); + } + /// Serializes this view into JSON. Map toJson() { return { @@ -87,31 +276,31 @@ class WorkflowRunView { } } -/// Uniform workflow checkpoint view for dashboard/CLI step drilldowns. -class WorkflowStepView { - /// Creates an immutable step view. - const WorkflowStepView({ +/// Uniform workflow checkpoint view for dashboard/CLI drilldowns. +class WorkflowCheckpointView { + /// Creates an immutable checkpoint view. + const WorkflowCheckpointView({ required this.runId, required this.workflow, - required this.stepName, - required this.baseStepName, - this.iteration, + required this.checkpointName, + required this.baseCheckpointName, required this.position, + this.iteration, this.completedAt, this.value, }); - /// Creates a step view from a [WorkflowStepEntry]. - factory WorkflowStepView.fromEntry({ + /// Creates a checkpoint view from a [WorkflowStepEntry]. 
+  factory WorkflowCheckpointView.fromEntry({
     required String runId,
     required String workflow,
     required WorkflowStepEntry entry,
   }) {
-    return WorkflowStepView(
+    return WorkflowCheckpointView(
       runId: runId,
       workflow: workflow,
-      stepName: entry.name,
-      baseStepName: entry.baseName,
+      checkpointName: entry.name,
+      baseCheckpointName: entry.baseName,
       iteration: entry.iteration,
       position: entry.position,
       completedAt: entry.completedAt,
@@ -126,10 +315,10 @@ class WorkflowStepView {
   final String workflow;
 
   /// Persisted checkpoint name.
-  final String stepName;
+  final String checkpointName;
 
   /// Base step name without iteration suffix.
-  final String baseStepName;
+  final String baseCheckpointName;
 
   /// Optional iteration suffix.
   final int? iteration;
@@ -143,13 +332,50 @@ class WorkflowStepView {
   /// Persisted checkpoint value.
   final Object? value;
 
+  /// Decodes the persisted checkpoint value with [codec].
+  TValue? valueAs<TValue>({required PayloadCodec<TValue> codec}) {
+    final stored = value;
+    if (stored == null) return null;
+    return codec.decode(stored);
+  }
+
+  /// Decodes the persisted checkpoint value with a JSON decoder.
+  TValue? valueJson<TValue>({
+    required TValue Function(Map<String, Object?> payload) decode,
+    String? typeName,
+  }) {
+    final stored = value;
+    if (stored == null) return null;
+    return PayloadCodec<TValue>.json(
+      decode: decode,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
+  /// Decodes the persisted checkpoint value with a version-aware JSON decoder.
+  TValue? valueVersionedJson<TValue>({
+    required int version,
+    required TValue Function(Map<String, Object?> payload, int version) decode,
+    int? defaultDecodeVersion,
+    String? typeName,
+  }) {
+    final stored = value;
+    if (stored == null) return null;
+    return PayloadCodec<TValue>.versionedJson(
+      version: version,
+      decode: decode,
+      defaultDecodeVersion: defaultDecodeVersion,
+      typeName: typeName,
+    ).decode(stored);
+  }
+
   /// Serializes this view into JSON.
   Map<String, Object?> toJson() {
     return {
       'runId': runId,
       'workflow': workflow,
-      'stepName': stepName,
-      'baseStepName': baseStepName,
+      'checkpointName': checkpointName,
+      'baseCheckpointName': baseCheckpointName,
       if (iteration != null) 'iteration': iteration,
       'position': position,
       if (completedAt != null) 'completedAt': completedAt!.toIso8601String(),
@@ -158,20 +384,20 @@ class WorkflowStepView {
   }
 }
 
-/// Combined run + step drilldown view.
+/// Combined run + checkpoint drilldown view.
 class WorkflowRunDetailView {
   /// Creates an immutable run detail view.
-  const WorkflowRunDetailView({required this.run, required this.steps});
+  const WorkflowRunDetailView({required this.run, required this.checkpoints});
 
   /// Run summary view.
   final WorkflowRunView run;
 
-  /// Persisted step views.
-  final List<WorkflowStepView> steps;
+  /// Persisted checkpoint views.
+  final List<WorkflowCheckpointView> checkpoints;
 
   /// Serializes this detail view into JSON.
   Map<String, Object?> toJson() => {
     'run': run.toJson(),
-    'steps': steps.map((step) => step.toJson()).toList(),
+    'checkpoints': checkpoints.map((step) => step.toJson()).toList(),
   };
 }
diff --git a/packages/stem/lib/src/workflow/workflow.dart b/packages/stem/lib/src/workflow/workflow.dart
index 15d1f11c..b1d1931a 100644
--- a/packages/stem/lib/src/workflow/workflow.dart
+++ b/packages/stem/lib/src/workflow/workflow.dart
@@ -7,12 +7,16 @@ export 'core/flow_context.dart';
 export 'core/flow_step.dart';
 export 'core/run_state.dart';
 export 'core/workflow_cancellation_policy.dart';
+export 'core/workflow_checkpoint.dart';
 export 'core/workflow_clock.dart';
 export 'core/workflow_definition.dart';
+export 'core/workflow_event_ref.dart';
+export 'core/workflow_execution_context.dart';
 export 'core/workflow_ref.dart';
 export 'core/workflow_result.dart';
-export 'core/workflow_runtime_metadata.dart';
 export 'core/workflow_resume.dart';
+export 'core/workflow_resume_context.dart';
+export 'core/workflow_runtime_metadata.dart';
 export 'core/workflow_script.dart';
 export 'core/workflow_script_context.dart';
 export 'core/workflow_status.dart';
diff --git a/packages/stem/lib/stem.dart b/packages/stem/lib/stem.dart
index 7fd64d28..b0cd219c 100644
--- a/packages/stem/lib/stem.dart
+++ b/packages/stem/lib/stem.dart
@@ -59,7 +59,7 @@
 /// final taskId = await addDefinition.call({
 ///   'a': 10,
 ///   'b': 20,
-/// }).enqueueWith(stem);
+/// }).enqueue(stem);
 /// final result = await stem.waitForTask(taskId);
 ///
 /// print('Sum is: ${result?.value}'); // Sum is: 30
@@ -67,13 +67,12 @@
 /// ```
 library;
 
-export 'package:contextual/contextual.dart' show Context, Level, Logger;
-
 import 'package:stem/src/core/contracts.dart';
 import 'package:stem/src/core/stem.dart';
 import 'package:stem/src/scheduler/beat.dart';
 import 'package:stem/src/worker/worker.dart';
+export 'package:contextual/contextual.dart' show Context, Level, Logger;
 
 export 'package:stem_memory/stem_memory.dart'
     show
       InMemoryBroker,
@@ -100,6 +99,7 @@ export 'src/core/encoder_keys.dart';
 export 'src/core/envelope.dart';
 export 'src/core/function_task_handler.dart';
 export 'src/core/payload_codec.dart';
+export 'src/core/payload_map.dart';
 export 'src/core/queue_events.dart';
 export 'src/core/retry.dart';
 export 'src/core/stem.dart';
diff --git a/packages/stem/pubspec.yaml b/packages/stem/pubspec.yaml
index a7851561..cd3ff58b 100644
--- a/packages/stem/pubspec.yaml
+++ b/packages/stem/pubspec.yaml
@@ -1,6 +1,6 @@
 name: stem
 description: "Stem is a Dart-native background job platform with Redis Streams, retries, scheduling, observability, and security tooling."
-version: 0.1.1
+version: 0.2.0
 repository: https://github.com/kingwill101/stem
 resolution: workspace
 environment:
@@ -8,6 +8,7 @@
 # Add regular dependencies here.
dependencies: + ansicolor: ^2.0.3 collection: ^1.19.1 contextual: ^2.2.0 crypto: ^3.0.7 diff --git a/packages/stem/test/bootstrap/module_bootstrap_test.dart b/packages/stem/test/bootstrap/module_bootstrap_test.dart new file mode 100644 index 00000000..7aa44fd9 --- /dev/null +++ b/packages/stem/test/bootstrap/module_bootstrap_test.dart @@ -0,0 +1,444 @@ +import 'package:stem/stem.dart'; +import 'package:test/test.dart'; + +void main() { + group('StemModule.merge', () { + test('combine returns null, a single module, or a merged module', () { + final taskA = FunctionTaskHandler( + name: 'module.combine.task.a', + entrypoint: (context, args) async => 'a', + runInIsolate: false, + ); + final taskB = FunctionTaskHandler( + name: 'module.combine.task.b', + entrypoint: (context, args) async => 'b', + runInIsolate: false, + ); + final moduleA = StemModule(tasks: [taskA]); + final moduleB = StemModule(tasks: [taskB]); + + expect(StemModule.combine(), isNull); + expect(StemModule.combine(module: moduleA), same(moduleA)); + expect( + StemModule.combine(modules: [moduleA, moduleB])?.tasks, + [taskA, taskB], + ); + }); + + test('combines distinct task and workflow definitions', () async { + final taskA = FunctionTaskHandler( + name: 'module.merge.task.a', + entrypoint: (context, args) async => 'a', + runInIsolate: false, + ); + final taskB = FunctionTaskHandler( + name: 'module.merge.task.b', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'b', + runInIsolate: false, + ); + final flow = Flow( + name: 'module.merge.flow', + build: (builder) { + builder.step('hello', (ctx) async => 'ok'); + }, + ); + final merged = StemModule.merge([ + StemModule(tasks: [taskA]), + StemModule(flows: [flow], tasks: [taskB]), + ]); + + expect(merged.tasks, [taskA, taskB]); + expect( + merged.workflowDefinitions.map((definition) => definition.name), + ['module.merge.flow'], + ); + expect(merged.workflowManifest.map((entry) => entry.name), [ + 
'module.merge.flow', + ]); + + final app = await StemWorkflowApp.inMemory(module: merged); + try { + await app.start(); + + final runId = await app.startWorkflow('module.merge.flow'); + final flowResult = await app.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + expect(flowResult?.value, 'ok'); + } finally { + await app.close(); + } + }); + + test('deduplicates identical modules and manifest entries', () { + final flow = Flow( + name: 'module.merge.duplicate.flow', + build: (builder) { + builder.step('hello', (ctx) async => 'ok'); + }, + ); + final task = FunctionTaskHandler( + name: 'module.merge.duplicate.task', + entrypoint: (context, args) async => 'ok', + runInIsolate: false, + ); + final module = StemModule(flows: [flow], tasks: [task]); + + final merged = StemModule.merge([module, module]); + + expect(merged.tasks, [task]); + expect( + merged.workflowDefinitions.map((definition) => definition.name), + ['module.merge.duplicate.flow'], + ); + expect(merged.workflowManifest.map((entry) => entry.name), [ + 'module.merge.duplicate.flow', + ]); + }); + + test('fails fast on conflicting task or workflow names', () { + final taskA = FunctionTaskHandler( + name: 'module.merge.conflict.task', + entrypoint: (context, args) async => 'a', + runInIsolate: false, + ); + final taskB = FunctionTaskHandler( + name: 'module.merge.conflict.task', + entrypoint: (context, args) async => 'b', + runInIsolate: false, + ); + final flowA = Flow( + name: 'module.merge.conflict.workflow', + build: (builder) { + builder.step('hello', (ctx) async => 'a'); + }, + ); + final flowB = Flow( + name: 'module.merge.conflict.workflow', + build: (builder) { + builder.step('hello', (ctx) async => 'b'); + }, + ); + + expect( + () => StemModule.merge([ + StemModule(tasks: [taskA]), + StemModule(tasks: [taskB]), + ]), + throwsA( + isA().having( + (error) => error.message, + 'message', + contains('module.merge.conflict.task'), + ), + ), + ); + expect( + () => 
StemModule.merge([ + StemModule(flows: [flowA]), + StemModule(flows: [flowB]), + ]), + throwsA( + isA().having( + (error) => error.message, + 'message', + contains('module.merge.conflict.workflow'), + ), + ), + ); + }); + + test('exposes explicit queue inspection helpers', () { + final taskA = FunctionTaskHandler( + name: 'module.queues.task.a', + entrypoint: (context, args) async => 'a', + runInIsolate: false, + ); + final taskB = FunctionTaskHandler( + name: 'module.queues.task.b', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'b', + runInIsolate: false, + ); + final module = StemModule(tasks: [taskA, taskB]); + + expect(module.requiredTaskQueues(), ['default', 'priority']); + expect( + module.requiredTaskSubscription().queues, + ['default', 'priority'], + ); + expect( + module.requiredWorkflowQueues( + continuationQueue: 'workflow-continue', + executionQueue: 'workflow-step', + ), + [ + 'default', + 'priority', + 'workflow', + 'workflow-continue', + 'workflow-step', + ], + ); + expect( + module + .requiredWorkflowSubscription( + continuationQueue: 'workflow-continue', + executionQueue: 'workflow-step', + ) + .queues, + [ + 'default', + 'priority', + 'workflow', + 'workflow-continue', + 'workflow-step', + ], + ); + }); + }); + + group('module bootstrap', () { + test('StemApp.inMemory registers module tasks and infers queues', () async { + final moduleTask = FunctionTaskHandler( + name: 'module.bootstrap.task', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ); + final moduleDefinition = TaskDefinition.noArgs( + name: 'module.bootstrap.task', + defaultOptions: const TaskOptions(queue: 'priority'), + ); + + final app = await StemApp.inMemory( + module: StemModule(tasks: [moduleTask]), + ); + await app.start(); + try { + expect(app.registry.resolve('module.bootstrap.task'), same(moduleTask)); + expect(app.worker.subscription.queues, ['priority']); + 
+ final result = await moduleDefinition.enqueueAndWait( + app, + timeout: const Duration(seconds: 2), + ); + + expect(result?.value, 'task-ok'); + } finally { + await app.close(); + } + }); + + test('StemClient.createApp reuses its default module', () async { + final moduleTask = FunctionTaskHandler( + name: 'module.client.task', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ); + final client = await StemClient.inMemory( + module: StemModule(tasks: [moduleTask]), + ); + + final app = await client.createApp(); + await app.start(); + try { + expect(app.registry.resolve('module.client.task'), same(moduleTask)); + expect(app.worker.subscription.queues, ['priority']); + } finally { + await app.close(); + await client.close(); + } + }); + + test('StemApp.inMemory merges plural modules during bootstrap', () async { + final taskA = FunctionTaskHandler( + name: 'module.bootstrap.modules.task.a', + entrypoint: (context, args) async => 'a', + runInIsolate: false, + ); + final taskB = FunctionTaskHandler( + name: 'module.bootstrap.modules.task.b', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'b', + runInIsolate: false, + ); + + final app = await StemApp.inMemory( + modules: [ + StemModule(tasks: [taskA]), + StemModule(tasks: [taskB]), + ], + ); + await app.start(); + try { + expect(app.registry.resolve(taskA.name), same(taskA)); + expect(app.registry.resolve(taskB.name), same(taskB)); + expect( + app.worker.subscription.queues, + unorderedEquals(['default', 'priority']), + ); + } finally { + await app.close(); + } + }); + + test('StemClient.createWorkflowApp reuses its default module', () async { + final moduleTask = FunctionTaskHandler( + name: 'module.client.workflow-task', + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ); + final moduleFlow = Flow( + name: 'module.client.workflow', + build: (builder) { + builder.step('hello', 
(ctx) async => 'module-ok'); + }, + ); + final client = await StemClient.inMemory( + module: StemModule(flows: [moduleFlow], tasks: [moduleTask]), + ); + + final app = await client.createWorkflowApp(); + await app.start(); + try { + expect( + app.app.registry.resolve('module.client.workflow-task'), + same(moduleTask), + ); + + final runId = await app.startWorkflow('module.client.workflow'); + final result = await app.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + + expect(result?.value, 'module-ok'); + } finally { + await app.close(); + await client.close(); + } + }); + + test('StemApp.createWorkflowApp reuses its default module', () async { + final moduleTask = FunctionTaskHandler( + name: 'module.app.workflow-task', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ); + final moduleFlow = Flow( + name: 'module.app.workflow', + build: (builder) { + builder.step('hello', (ctx) async => 'module-ok'); + }, + ); + final stemApp = await StemApp.inMemory( + module: StemModule(flows: [moduleFlow], tasks: [moduleTask]), + workerConfig: StemWorkerConfig( + queue: 'workflow', + subscription: RoutingSubscription( + queues: ['workflow', 'priority'], + ), + ), + ); + + final workflowApp = await stemApp.createWorkflowApp(); + await workflowApp.start(); + try { + expect( + workflowApp.app.registry.resolve('module.app.workflow-task'), + same(moduleTask), + ); + + final runId = await workflowApp.startWorkflow('module.app.workflow'); + final result = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + + expect(result?.value, 'module-ok'); + } finally { + await workflowApp.close(); + } + }); + + test('StemApp.createWorkflowApp registers plural modules', () async { + final flow = Flow( + name: 'module.app.modules.workflow', + build: (builder) { + builder.step('hello', (ctx) async => 'module-ok'); + }, + ); + final task = FunctionTaskHandler( + 
name: 'module.app.modules.task', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ); + final stemApp = await StemApp.inMemory( + workerConfig: StemWorkerConfig( + queue: 'workflow', + subscription: RoutingSubscription( + queues: ['workflow', 'priority'], + ), + ), + ); + + final workflowApp = await stemApp.createWorkflowApp( + modules: [ + StemModule(flows: [flow]), + StemModule(tasks: [task]), + ], + ); + await workflowApp.start(); + try { + expect(workflowApp.app.registry.resolve(task.name), same(task)); + + final runId = await workflowApp.startWorkflow(flow.definition.name); + final result = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + + expect(result?.value, 'module-ok'); + } finally { + await workflowApp.close(); + } + }); + + test( + 'StemWorkflowApp.create rejects reused StemApp without workflow queue ' + 'coverage', + () async { + final moduleFlow = Flow( + name: 'module.app.missing-workflow-queue', + build: (builder) { + builder.step('hello', (ctx) async => 'module-ok'); + }, + ); + final stemApp = await StemApp.inMemory( + module: StemModule(flows: [moduleFlow]), + ); + + try { + await expectLater( + stemApp.createWorkflowApp, + throwsA( + isA().having( + (error) => error.message, + 'message', + contains('reused StemApp worker'), + ), + ), + ); + } finally { + await stemApp.close(); + } + }, + ); + }); +} diff --git a/packages/stem/test/bootstrap/shortcut_allow_auto_start_test.dart b/packages/stem/test/bootstrap/shortcut_allow_auto_start_test.dart new file mode 100644 index 00000000..ce95793f --- /dev/null +++ b/packages/stem/test/bootstrap/shortcut_allow_auto_start_test.dart @@ -0,0 +1,86 @@ +import 'package:stem/stem.dart'; +import 'package:test/test.dart'; + +void main() { + group('shortcut allowWorkerAutoStart', () { + test('StemApp can enqueue without starting the worker', () async { + final app = await StemApp.inMemory( + 
allowWorkerAutoStart: false, + tasks: [ + FunctionTaskHandler( + name: 'shortcut.echo', + entrypoint: (context, args) async => 'done', + ), + ], + ); + + try { + final taskId = await app.enqueue('shortcut.echo'); + expect(app.isStarted, isFalse); + + final pending = await app.waitForTask( + taskId, + timeout: const Duration(milliseconds: 10), + ); + expect(pending, isNotNull); + expect(pending!.timedOut, isTrue); + expect(pending.status.state, TaskState.queued); + + await app.start(); + expect(app.isStarted, isTrue); + + final completed = await app.waitForTask( + taskId, + timeout: const Duration(seconds: 1), + ); + expect(completed?.isSucceeded, isTrue); + expect(completed?.value, 'done'); + } finally { + await app.shutdown(); + } + }); + + test('StemWorkflowApp can create runs without starting the worker', () async { + final flow = Flow( + name: 'shortcut.workflow', + build: (builder) { + builder.step('done', (context) async => 'workflow-done'); + }, + ); + + final app = await StemWorkflowApp.inMemory( + flows: [flow], + allowWorkerAutoStart: false, + ); + + try { + final runId = await flow.start(app); + expect(app.isRuntimeStarted, isTrue); + expect(app.isWorkerStarted, isFalse); + + final pending = await app.waitForCompletion( + runId, + timeout: const Duration(milliseconds: 10), + ); + expect(pending, isNotNull); + expect(pending!.timedOut, isTrue); + expect(pending.status, WorkflowStatus.running); + + await app.startWorker(); + expect(app.isRuntimeStarted, isTrue); + expect(app.isWorkerStarted, isTrue); + expect(app.isStarted, isTrue); + + final completed = await flow.waitFor( + app, + runId, + timeout: const Duration(seconds: 1), + ); + expect(completed?.isCompleted, isTrue); + expect(completed?.value, 'workflow-done'); + } finally { + await app.shutdown(); + } + }); + }); +} diff --git a/packages/stem/test/bootstrap/stem_app_test.dart b/packages/stem/test/bootstrap/stem_app_test.dart index d6c977e9..74c1ddd6 100644 --- 
a/packages/stem/test/bootstrap/stem_app_test.dart +++ b/packages/stem/test/bootstrap/stem_app_test.dart @@ -14,9 +14,7 @@ void main() { final app = await StemApp.inMemory(tasks: [handler]); try { - await app.start(); - - final taskId = await app.stem.enqueue('test.echo'); + final taskId = await app.enqueue('test.echo'); final completed = await app.backend .watch(taskId) .firstWhere((status) => status.state == TaskState.succeeded) @@ -27,6 +25,191 @@ void main() { } }); + test('inMemory lazy-starts on first enqueue', () async { + final handler = FunctionTaskHandler( + name: 'test.lazy-start', + entrypoint: (context, args) async => 'started', + runInIsolate: false, + ); + + final app = await StemApp.inMemory(tasks: [handler]); + try { + final taskId = await app.enqueue('test.lazy-start'); + final completed = await app.waitForTask( + taskId, + timeout: const Duration(seconds: 2), + ); + expect(completed?.value, 'started'); + } finally { + await app.shutdown(); + } + }); + + test('inMemory exposes task and group status helpers', () async { + final taskHandler = FunctionTaskHandler( + name: 'test.status.task', + entrypoint: (context, args) async => 'status-ok', + runInIsolate: false, + ); + + final app = await StemApp.inMemory(tasks: [taskHandler]); + try { + final taskId = await app.enqueue('test.status.task'); + final taskStatus = await app.waitForTask( + taskId, + timeout: const Duration(seconds: 2), + ); + expect(taskStatus?.value, 'status-ok'); + expect((await app.getTaskStatus(taskId))?.state, TaskState.succeeded); + + final dispatch = await app.canvas.group([ + task('test.status.task'), + ]); + try { + final groupStatus = await _waitForGroupStatus( + () => app.getGroupStatus(dispatch.groupId), + ); + expect(groupStatus?.completed, 1); + } finally { + await dispatch.dispose(); + } + } finally { + await app.shutdown(); + } + }); + + test( + 'inMemory registers module tasks and infers queued subscriptions', + () async { + final handler = FunctionTaskHandler( + name: 
'test.module.queue', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'module-ok', + runInIsolate: false, + ); + + final app = await StemApp.inMemory( + module: StemModule(tasks: [handler]), + ); + try { + expect(app.registry.resolve('test.module.queue'), same(handler)); + expect(app.worker.subscription.queues, ['priority']); + + final taskId = await app.enqueue( + 'test.module.queue', + enqueueOptions: const TaskEnqueueOptions(queue: 'priority'), + ); + final completed = await app.waitForTask( + taskId, + timeout: const Duration(seconds: 2), + ); + expect(completed?.value, 'module-ok'); + } finally { + await app.shutdown(); + } + }, + ); + + test('inMemory infers queued subscriptions from explicit tasks', () async { + final handler = FunctionTaskHandler( + name: 'test.explicit.queue', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'explicit-ok', + runInIsolate: false, + ); + + final app = await StemApp.inMemory(tasks: [handler]); + try { + expect(app.worker.subscription.queues, ['priority']); + + final taskId = await app.enqueue( + 'test.explicit.queue', + enqueueOptions: const TaskEnqueueOptions(queue: 'priority'), + ); + final completed = await app.waitForTask( + taskId, + timeout: const Duration(seconds: 2), + ); + expect(completed?.value, 'explicit-ok'); + } finally { + await app.shutdown(); + } + }); + + test('inMemory lazy-starts for canvas dispatch', () async { + final handler = FunctionTaskHandler( + name: 'test.canvas.double', + entrypoint: (context, args) async { + final value = args['value'] as int? ?? 
0; + return value * 2; + }, + runInIsolate: false, + ); + + final app = await StemApp.inMemory(tasks: [handler]); + try { + final result = await app.canvas.chain([ + task('test.canvas.double', args: {'value': 21}), + ]); + + expect(result.isCompleted, isTrue); + expect(result.value, 42); + } finally { + await app.shutdown(); + } + }); + + test('StemApp exposes task registration helpers', () async { + final directHandler = FunctionTaskHandler( + name: 'test.register.direct', + entrypoint: (context, args) async => 'direct-ok', + runInIsolate: false, + ); + final moduleHandler = FunctionTaskHandler( + name: 'test.register.module', + entrypoint: (context, args) async => 'module-ok', + runInIsolate: false, + ); + final extraHandler = FunctionTaskHandler( + name: 'test.register.extra', + entrypoint: (context, args) async => 'extra-ok', + runInIsolate: false, + ); + + final app = await StemApp.inMemory(); + try { + app + ..registerTask(directHandler) + ..registerModule(StemModule(tasks: [moduleHandler])) + ..registerModules([ + StemModule(tasks: [extraHandler]), + ]); + + final directTaskId = await app.enqueue('test.register.direct'); + final directResult = await app.waitForTask( + directTaskId, + timeout: const Duration(seconds: 2), + ); + expect(directResult?.value, 'direct-ok'); + + final moduleTaskId = await app.enqueue('test.register.module'); + final moduleResult = await app.waitForTask( + moduleTaskId, + timeout: const Duration(seconds: 2), + ); + expect(moduleResult?.value, 'module-ok'); + + final extraTaskId = await app.enqueue('test.register.extra'); + final extraResult = await app.waitForTask( + extraTaskId, + timeout: const Duration(seconds: 2), + ); + expect(extraResult?.value, 'extra-ok'); + } finally { + await app.shutdown(); + } + }); + test('inMemory applies worker config overrides', () async { final handler = FunctionTaskHandler( name: 'test.worker-config', @@ -142,8 +325,7 @@ void main() { tasks: [handler], ); try { - await app.start(); - final taskId = 
await app.stem.enqueue('test.from-url'); + final taskId = await app.enqueue('test.from-url'); final completed = await app.backend .watch(taskId) .firstWhere((status) => status.state == TaskState.succeeded) @@ -284,8 +466,7 @@ void main() { final runId = await workflowApp.startWorkflow('workflow.typed'); final run = await workflowApp.waitForCompletion<_DemoPayload>( runId, - decode: (payload) => - _DemoPayload.fromJson(payload! as Map), + decodeJson: _DemoPayload.fromJson, ); expect(run, isNotNull); @@ -297,6 +478,251 @@ void main() { } }); + test( + 'waitForCompletion decodes versioned custom types on success', + () async { + final flow = Flow>( + name: 'workflow.typed.versioned', + build: (builder) { + builder.step( + 'payload', + (ctx) async => { + PayloadCodec.versionKey: 2, + 'foo': 'bar', + }, + ); + }, + ); + + final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]); + try { + final runId = await workflowApp.startWorkflow( + 'workflow.typed.versioned', + ); + final run = await workflowApp.waitForCompletion<_DemoPayload>( + runId, + decodeVersionedJson: _DemoPayload.fromVersionedJson, + ); + + expect(run, isNotNull); + expect(run!.value, isA<_DemoPayload>()); + expect(run.value!.foo, 'bar-v2'); + expect(run.state.result, { + PayloadCodec.versionKey: 2, + 'foo': 'bar', + }); + } finally { + await workflowApp.shutdown(); + } + }, + ); + + test('startWorkflowJson encodes DTO params without a manual map', () async { + final flow = Flow( + name: 'workflow.json.start', + build: (builder) { + builder.step( + 'payload', + (ctx) async => ctx.requiredParam('foo'), + ); + }, + ); + + final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]); + try { + final runId = await workflowApp.startWorkflowJson( + 'workflow.json.start', + const _DemoPayload('bar'), + ); + final run = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + + expect(runId, isNotEmpty); + expect(run?.requiredValue(), 'bar'); + } finally { + await 
workflowApp.shutdown(); + } + }); + + test( + 'startWorkflowValue encodes typed params through the supplied codec', + () async { + final flow = Flow( + name: 'workflow.codec.start', + build: (builder) { + builder.step( + 'payload', + (ctx) async => + '${ctx.requiredParam('foo')}:' + '${ctx.requiredParam('kind')}', + ); + }, + ); + + final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]); + try { + final runId = await workflowApp.startWorkflowValue( + 'workflow.codec.start', + const _DemoPayload('bar'), + codec: const PayloadCodec<_DemoPayload>.map( + encode: _encodeDemoPayloadMap, + decode: _DemoPayload.fromJson, + typeName: '_DemoPayload', + ), + ); + final runState = await workflowApp.getRun(runId); + final run = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + + expect(runId, isNotEmpty); + expect(runState?.params, containsPair('foo', 'bar')); + expect(runState?.params, containsPair('kind', 'custom')); + expect(run?.requiredValue(), 'bar:custom'); + } finally { + await workflowApp.shutdown(); + } + }); + + test( + 'startWorkflowVersionedJson encodes DTO params with a persisted ' + 'schema version', + () async { + final flow = Flow( + name: 'workflow.versioned.json.start', + build: (builder) { + builder.step( + 'payload', + (ctx) async => ctx.requiredParam('foo'), + ); + }, + ); + + final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]); + try { + final runId = await workflowApp.startWorkflowVersionedJson( + 'workflow.versioned.json.start', + const _DemoPayload('bar'), + version: 2, + ); + final runState = await workflowApp.getRun(runId); + final run = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + + expect(runId, isNotEmpty); + expect(runState?.params, containsPair(PayloadCodec.versionKey, 2)); + expect(runState?.params, containsPair('foo', 'bar')); + expect(run?.requiredValue(), 'bar'); + } finally { + await workflowApp.shutdown(); + } + }, + ); + + 
test( + 'emitJson resumes runs with DTO payloads without a manual map', + () async { + const demoPayloadCodec = PayloadCodec<_DemoPayload>.json( + decode: _DemoPayload.fromJson, + ); + final flow = Flow( + name: 'workflow.json.emit', + build: (builder) { + builder.step( + 'wait', + (ctx) async { + final resume = ctx.takeResumeValue<_DemoPayload>( + codec: demoPayloadCodec, + ); + if (resume == null) { + ctx.awaitEvent('workflow.json.emit.topic'); + return null; + } + return resume.foo; + }, + ); + }, + ); + + final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]); + try { + final runId = await workflowApp.startWorkflow('workflow.json.emit'); + await workflowApp.executeRun(runId); + + await workflowApp.emitJson( + 'workflow.json.emit.topic', + const _DemoPayload('baz'), + ); + + final run = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + + expect(runId, isNotEmpty); + expect(run?.requiredValue(), 'baz'); + } finally { + await workflowApp.shutdown(); + } + }, + ); + + test( + 'emitVersionedJson resumes runs with versioned DTO payloads', + () async { + const demoPayloadCodec = PayloadCodec<_DemoPayload>.json( + decode: _DemoPayload.fromJson, + ); + final flow = Flow( + name: 'workflow.versioned.json.emit', + build: (builder) { + builder.step( + 'wait', + (ctx) async { + final resume = ctx.takeResumeValue<_DemoPayload>( + codec: demoPayloadCodec, + ); + if (resume == null) { + ctx.awaitEvent('workflow.versioned.json.emit.topic'); + return null; + } + return resume.foo; + }, + ); + }, + ); + + final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]); + try { + final runId = await workflowApp.startWorkflow( + 'workflow.versioned.json.emit', + ); + await workflowApp.executeRun(runId); + + await workflowApp.emitVersionedJson( + 'workflow.versioned.json.emit.topic', + const _DemoPayload('qux'), + version: 2, + ); + + final run = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 
2), + ); + + expect(runId, isNotEmpty); + expect(run?.requiredValue(), 'qux'); + } finally { + await workflowApp.shutdown(); + } + }, + ); + test( 'waitForCompletion does not decode when workflow is cancelled', () async { @@ -464,6 +890,61 @@ void main() { } }); + test( + 'inMemory infers worker subscription from module task queues', + () async { + final helperTask = FunctionTaskHandler( + name: 'workflow.module.queue-helper', + entrypoint: (context, args) async => 'queued-ok', + runInIsolate: false, + ); + final helperDefinition = TaskDefinition.noArgs( + name: 'workflow.module.queue-helper', + ); + final workflowApp = await StemWorkflowApp.inMemory( + module: StemModule(tasks: [helperTask]), + ); + try { + expect( + workflowApp.app.worker.subscription.queues, + unorderedEquals(['workflow', 'default']), + ); + + await workflowApp.start(); + final result = await helperDefinition.enqueueAndWait( + workflowApp, + timeout: const Duration(seconds: 2), + ); + expect(result?.value, 'queued-ok'); + } finally { + await workflowApp.shutdown(); + } + }, + ); + + test( + 'explicit workflow subscription overrides inferred module queues', + () async { + final helperTask = FunctionTaskHandler( + name: 'workflow.module.explicit-subscription', + entrypoint: (context, args) async => 'ignored', + runInIsolate: false, + ); + final workflowApp = await StemWorkflowApp.inMemory( + module: StemModule(tasks: [helperTask]), + workerConfig: StemWorkerConfig( + queue: 'workflow', + subscription: RoutingSubscription.singleQueue('workflow'), + ), + ); + try { + expect(workflowApp.app.worker.subscription.queues, ['workflow']); + } finally { + await workflowApp.shutdown(); + } + }, + ); + test('workflow refs start and decode runs through app helpers', () async { final moduleFlow = Flow( name: 'workflow.ref.flow', @@ -474,17 +955,17 @@ void main() { }); }, ); - final workflowRef = - WorkflowRef, String>( - name: 'workflow.ref.flow', - encodeParams: (params) => params, - ); + final workflowRef = 
WorkflowRef<Map<String, Object?>, String>(
+      name: 'workflow.ref.flow',
+      encodeParams: (params) => params,
+    );
 
     final workflowApp = await StemWorkflowApp.inMemory(flows: [moduleFlow]);
     try {
-      final runId = await workflowRef.call(
-        const {'name': 'stem'},
-      ).startWithApp(workflowApp);
+      final runId = await workflowRef.start(
+        workflowApp,
+        params: const {'name': 'stem'},
+      );
       final result = await workflowRef.waitFor(
         workflowApp,
         runId,
@@ -497,6 +978,400 @@ void main() {
     }
   });
 
+  test('StemWorkflowApp exposes run detail helper', () async {
+    final flow = Flow(
+      name: 'workflow.detail.helper',
+      build: (builder) {
+        builder.step('hello', (ctx) async => 'detail-ok');
+      },
+    );
+
+    final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+    try {
+      final runId = await workflowApp.startWorkflow('workflow.detail.helper');
+      final result = await workflowApp.waitForCompletion(
+        runId,
+        timeout: const Duration(seconds: 2),
+      );
+      expect(result?.value, 'detail-ok');
+
+      final detail = await workflowApp.viewRunDetail(runId);
+      expect(detail, isNotNull);
+      expect(detail!.run.runId, equals(runId));
+    } finally {
+      await workflowApp.shutdown();
+    }
+  });
+
+  test('StemWorkflowApp exposes workflow manifest helper', () async {
+    final flow = Flow(
+      name: 'workflow.manifest.helper',
+      build: (builder) {
+        builder.step('hello', (ctx) async => 'manifest-ok');
+      },
+    );
+
+    final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+    try {
+      final manifest = workflowApp.workflowManifest();
+      final entry = manifest.singleWhere(
+        (item) => item.name == 'workflow.manifest.helper',
+      );
+      expect(entry.kind, equals(WorkflowDefinitionKind.flow));
+      expect(entry.steps.single.name, equals('hello'));
+    } finally {
+      await workflowApp.shutdown();
+    }
+  });
+
+  test(
+    'StemWorkflowApp registers module definitions after bootstrap',
+    () async {
+      final taskHandler = FunctionTaskHandler.inline(
+        name: 'workflow.module.task',
+        entrypoint: (context, args) async => 'module-task-ok',
+      );
+      
final flow = Flow( + name: 'workflow.module.flow', + build: (builder) { + builder.step('hello', (ctx) async => 'module-flow-ok'); + }, + ); + final module = StemModule(flows: [flow], tasks: [taskHandler]); + + final workflowApp = await StemWorkflowApp.inMemory(); + try { + workflowApp.registerModule(module); + + expect( + workflowApp.app.registry.resolve('workflow.module.task'), + isNotNull, + ); + expect( + workflowApp.runtime.registry.lookup('workflow.module.flow'), + isNotNull, + ); + + final runId = await workflowApp.startWorkflow('workflow.module.flow'); + final workflowResult = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + expect(workflowResult?.value, equals('module-flow-ok')); + } finally { + await workflowApp.shutdown(); + } + }, + ); + + test('StemWorkflowApp exposes workflow registration helper', () async { + final flow = Flow( + name: 'workflow.register.helper', + build: (builder) { + builder.step('hello', (ctx) async => 'register-ok'); + }, + ); + + final workflowApp = await StemWorkflowApp.inMemory(); + try { + workflowApp.registerWorkflow(flow.definition); + + final runId = await workflowApp.startWorkflow( + 'workflow.register.helper', + ); + final result = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + expect(result?.value, equals('register-ok')); + } finally { + await workflowApp.shutdown(); + } + }); + + test( + 'StemWorkflowApp exposes flow and script registration helpers', + () async { + final flow = Flow( + name: 'workflow.register.flow.helper', + build: (builder) { + builder.step('hello', (ctx) async => 'flow-register-ok'); + }, + ); + final script = WorkflowScript( + name: 'workflow.register.script.helper', + run: (script) => script.step( + 'hello', + (step) async => 'script-register-ok', + ), + ); + + final workflowApp = await StemWorkflowApp.inMemory(); + try { + workflowApp + ..registerFlow(flow) + ..registerScript(script); + + final flowRunId = 
await workflowApp.startWorkflow( + 'workflow.register.flow.helper', + ); + final flowResult = await workflowApp.waitForCompletion( + flowRunId, + timeout: const Duration(seconds: 2), + ); + expect(flowResult?.value, equals('flow-register-ok')); + + final scriptRunId = await workflowApp.startWorkflow( + 'workflow.register.script.helper', + ); + final scriptResult = await workflowApp.waitForCompletion( + scriptRunId, + timeout: const Duration(seconds: 2), + ); + expect(scriptResult?.value, equals('script-register-ok')); + } finally { + await workflowApp.shutdown(); + } + }, + ); + + test( + 'StemWorkflowApp exposes bulk flow and script registration helpers', + () async { + final flow = Flow( + name: 'workflow.register.flows.helper', + build: (builder) { + builder.step('hello', (ctx) async => 'flows-register-ok'); + }, + ); + final script = WorkflowScript( + name: 'workflow.register.scripts.helper', + run: (script) => script.step( + 'hello', + (step) async => 'scripts-register-ok', + ), + ); + + final workflowApp = await StemWorkflowApp.inMemory(); + try { + workflowApp + ..registerFlows([flow]) + ..registerScripts([script]); + + final flowRunId = await workflowApp.startWorkflow( + 'workflow.register.flows.helper', + ); + final flowResult = await workflowApp.waitForCompletion( + flowRunId, + timeout: const Duration(seconds: 2), + ); + expect(flowResult?.value, equals('flows-register-ok')); + + final scriptRunId = await workflowApp.startWorkflow( + 'workflow.register.scripts.helper', + ); + final scriptResult = await workflowApp.waitForCompletion( + scriptRunId, + timeout: const Duration(seconds: 2), + ); + expect(scriptResult?.value, equals('scripts-register-ok')); + } finally { + await workflowApp.shutdown(); + } + }, + ); + + test('StemWorkflowApp exposes bulk workflow registration helper', () async { + final definition = WorkflowDefinition.flow( + name: 'workflow.register.definitions.helper', + build: (builder) { + builder.step('hello', (ctx) async => 
'definitions-register-ok'); + }, + ); + + final workflowApp = await StemWorkflowApp.inMemory(); + try { + workflowApp.registerWorkflows([definition]); + + final runId = await workflowApp.startWorkflow( + 'workflow.register.definitions.helper', + ); + final result = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + expect(result?.value, equals('definitions-register-ok')); + } finally { + await workflowApp.shutdown(); + } + }); + + test('StemWorkflowApp exposes run view helpers', () async { + final flow = Flow( + name: 'workflow.views.helper', + build: (builder) { + builder.step('hello', (ctx) async => 'views-ok'); + }, + ); + + final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]); + try { + final runId = await workflowApp.startWorkflow('workflow.views.helper'); + final result = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + expect(result?.value, 'views-ok'); + + final runView = await workflowApp.viewRun(runId); + expect(runView, isNotNull); + expect(runView!.runId, equals(runId)); + + final checkpoints = await workflowApp.viewCheckpoints(runId); + expect(checkpoints, hasLength(1)); + expect(checkpoints.single.baseCheckpointName, equals('hello')); + + final runViews = await workflowApp.listRunViews( + workflow: 'workflow.views.helper', + ); + expect(runViews.map((view) => view.runId), contains(runId)); + } finally { + await workflowApp.shutdown(); + } + }); + + test('StemWorkflowApp exposes executeRun helper', () async { + final flow = Flow( + name: 'workflow.execute.helper', + build: (builder) { + builder.step('hello', (ctx) async => 'execute-ok'); + }, + ); + + final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]); + try { + final runId = await workflowApp.startWorkflow( + 'workflow.execute.helper', + ); + await workflowApp.executeRun(runId); + + final result = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + 
expect(result?.value, 'execute-ok'); + } finally { + await workflowApp.shutdown(); + } + }); + + test('StemWorkflowApp exposes rewind helper', () async { + final iterations = []; + final flow = Flow( + name: 'workflow.rewind.helper', + build: (builder) { + builder + ..step('repeat', (ctx) async { + iterations.add(ctx.iteration); + return 'iteration-${ctx.iteration}'; + }, autoVersion: true) + ..step('tail', (ctx) async => ctx.previousResult! as String); + }, + ); + + final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]); + try { + final runId = await workflowApp.startWorkflow('workflow.rewind.helper'); + await workflowApp.executeRun(runId); + + await workflowApp.rewindToCheckpoint(runId, 'repeat'); + await workflowApp.executeRun(runId); + + final checkpoints = await workflowApp.viewCheckpoints(runId); + expect( + checkpoints.map((checkpoint) => checkpoint.checkpointName), + containsAll(['repeat#0', 'tail']), + ); + expect(iterations, equals([0, 0])); + } finally { + await workflowApp.shutdown(); + } + }); + + test('StemWorkflowApp exposes watcher helper', () async { + final script = WorkflowScript( + name: 'workflow.watchers.helper', + run: (script) async { + final payload = await script.step('wait', (step) async { + await step.awaitEvent( + 'watchers.helper.topic', + deadline: DateTime.now().add(const Duration(minutes: 5)), + ); + return 'waiting'; + }); + return payload; + }, + ); + + final workflowApp = await StemWorkflowApp.inMemory(scripts: [script]); + try { + final runId = await workflowApp.startWorkflow( + 'workflow.watchers.helper', + ); + await workflowApp.executeRun(runId); + + final watchers = await workflowApp.listWatchers( + 'watchers.helper.topic', + ); + expect(watchers, hasLength(1)); + expect(watchers.single.runId, equals(runId)); + expect(watchers.single.stepName, equals('wait')); + } finally { + await workflowApp.shutdown(); + } + }); + + test('StemWorkflowApp exposes due-run resume helper', () async { + var iterations = 0; + final 
flow = Flow( + name: 'workflow.resume.due.helper', + build: (builder) { + builder.step('loop', (ctx) async { + iterations += 1; + if (iterations == 1) { + ctx.sleep(const Duration(milliseconds: 25)); + return 'waiting'; + } + return 'resumed'; + }); + }, + ); + + final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]); + try { + final runId = await workflowApp.startWorkflow( + 'workflow.resume.due.helper', + ); + await workflowApp.executeRun(runId); + + await Future.delayed(const Duration(milliseconds: 35)); + final resumed = await workflowApp.resumeDueRuns(DateTime.now()); + expect(resumed, contains(runId)); + + for (final id in resumed) { + await workflowApp.executeRun(id); + } + + final result = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + expect(result?.value, equals('resumed')); + } finally { + await workflowApp.shutdown(); + } + }); + test( 'workflow codecs persist encoded checkpoints and decode typed results', () async { @@ -520,18 +1395,11 @@ void main() { ); }, ); - final workflowRef = - WorkflowRef, _DemoPayload>( - name: 'workflow.codec.flow', - encodeParams: (params) => params, - decodeResult: _demoPayloadCodec.decode, - ); + final workflowRef = flow.ref0(); final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]); try { - final runId = await workflowRef.call(const {}).startWithApp( - workflowApp, - ); + final runId = await workflowRef.start(workflowApp); final result = await workflowRef.waitFor( workflowApp, runId, @@ -562,20 +1430,19 @@ void main() { ); test( - 'script workflow codecs persist encoded checkpoints and decode typed results', + 'script workflow codecs persist encoded checkpoints ' + 'and decode typed results', () async { final script = WorkflowScript<_DemoPayload>( name: 'workflow.codec.script', resultCodec: _demoPayloadCodec, checkpoints: [ - FlowStep.typed<_DemoPayload>( + WorkflowCheckpoint.typed<_DemoPayload>( name: 'build', - handler: (_) async => null, valueCodec: 
_demoPayloadCodec,
+          ),
-          FlowStep.typed<_DemoPayload>(
+          WorkflowCheckpoint.typed<_DemoPayload>(
             name: 'finish',
-            handler: (_) async => null,
             valueCodec: _demoPayloadCodec,
           ),
         ],
@@ -590,18 +1457,11 @@ void main() {
           );
         },
       );
-      final workflowRef =
-          WorkflowRef<Map<String, Object?>, _DemoPayload>(
-            name: 'workflow.codec.script',
-            encodeParams: (params) => params,
-            decodeResult: _demoPayloadCodec.decode,
-          );
+      final workflowRef = script.ref0();
 
       final workflowApp = await StemWorkflowApp.inMemory(scripts: [script]);
       try {
-        final runId = await workflowRef.call(const {}).startWithApp(
-          workflowApp,
-        );
+        final runId = await workflowRef.start(workflowApp);
         final result = await workflowRef.waitFor(
           workflowApp,
           runId,
@@ -682,15 +1542,43 @@ void main() {
   });
 }
 
+Future _waitForGroupStatus(
+  Future Function() lookup, {
+  Duration timeout = const Duration(seconds: 2),
+  Duration pollInterval = const Duration(milliseconds: 25),
+}) async {
+  final deadline = DateTime.now().add(timeout);
+  while (DateTime.now().isBefore(deadline)) {
+    final status = await lookup();
+    if (status != null && status.completed == status.expected) {
+      return status;
+    }
+    await Future.delayed(pollInterval);
+  }
+  return lookup();
+}
+
 class _DemoPayload {
   const _DemoPayload(this.foo);
 
   factory _DemoPayload.fromJson(Map<String, Object?> json) =>
       _DemoPayload(json['foo']! as String);
 
+  factory _DemoPayload.fromVersionedJson(
+    Map<String, Object?> json,
+    int version,
+  ) => _DemoPayload('${json['foo']! as String}-v$version');
+
   final String foo;
+
+  Map<String, Object?> toJson() => {'foo': foo};
 }
 
+Object? 
_encodeDemoPayloadMap(_DemoPayload value) => { + 'foo': value.foo, + 'kind': 'custom', +}; + const _demoPayloadCodec = PayloadCodec<_DemoPayload>( encode: _encodeDemoPayload, decode: _decodeDemoPayload, diff --git a/packages/stem/test/bootstrap/stem_client_test.dart b/packages/stem/test/bootstrap/stem_client_test.dart index 6d17758d..a4a96b3b 100644 --- a/packages/stem/test/bootstrap/stem_client_test.dart +++ b/packages/stem/test/bootstrap/stem_client_test.dart @@ -60,6 +60,295 @@ void main() { await client.close(); }); + test( + 'StemClient remembers its default module for createApp', + () async { + final moduleTask = FunctionTaskHandler( + name: 'client.default-module.app-task', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ); + final client = await StemClient.inMemory( + module: StemModule(tasks: [moduleTask]), + ); + + final app = await client.createApp(); + + expect( + app.registry.resolve('client.default-module.app-task'), + same(moduleTask), + ); + expect(app.worker.subscription.queues, ['priority']); + + final taskId = await app.enqueue( + 'client.default-module.app-task', + enqueueOptions: const TaskEnqueueOptions(queue: 'priority'), + ); + final result = await app.waitForTask( + taskId, + timeout: const Duration(seconds: 2), + ); + + expect(result?.value, 'task-ok'); + + await app.close(); + await client.close(); + }, + ); + + test('StemClient createApp lazy-starts on first enqueue', () async { + final client = await StemClient.inMemory( + tasks: [ + FunctionTaskHandler( + name: 'client.lazy-start', + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ), + ], + ); + + final app = await client.createApp(); + + final taskId = await app.enqueue('client.lazy-start'); + final result = await app.waitForTask( + taskId, + timeout: const Duration(seconds: 2), + ); + + expect(result?.value, 'task-ok'); + + await app.close(); + await client.close(); + }); + + 
test('StemClient exposes task and group status helpers', () async { + final taskHandler = FunctionTaskHandler( + name: 'client.status.task', + entrypoint: (context, args) async => 'status-ok', + runInIsolate: false, + ); + final client = await StemClient.inMemory(tasks: [taskHandler]); + final worker = await client.createWorker(); + await worker.start(); + + final taskId = await client.enqueue('client.status.task'); + final taskStatus = await client.waitForTask( + taskId, + timeout: const Duration(seconds: 2), + ); + expect(taskStatus?.value, 'status-ok'); + expect((await client.getTaskStatus(taskId))?.state, TaskState.succeeded); + + final dispatch = await client.createCanvas().group([ + task('client.status.task'), + ]); + try { + final groupStatus = await _waitForClientGroupStatus( + () => client.getGroupStatus(dispatch.groupId), + ); + expect(groupStatus?.completed, 1); + } finally { + await dispatch.dispose(); + await worker.shutdown(); + await client.close(); + } + }); + + test('StemClient createCanvas reuses shared registry and backend', () async { + final client = await StemClient.inMemory( + tasks: [ + FunctionTaskHandler( + name: 'client.canvas.task', + entrypoint: (context, args) async => 'canvas-ok', + runInIsolate: false, + ), + ], + ); + final worker = await client.createWorker(); + await worker.start(); + + final canvas = client.createCanvas(); + final taskId = await canvas.send( + task('client.canvas.task'), + ); + final result = await client.waitForTask( + taskId, + timeout: const Duration(seconds: 2), + ); + + expect(result?.value, 'canvas-ok'); + + await worker.shutdown(); + await client.close(); + }); + + test('StemClient createCanvas registers additional tasks', () async { + final extraTask = FunctionTaskHandler( + name: 'client.canvas.extra', + entrypoint: (context, args) async => 'extra-ok', + runInIsolate: false, + ); + final client = await StemClient.inMemory(); + final worker = await client.createWorker(tasks: [extraTask]); + await 
worker.start(); + + final canvas = client.createCanvas(tasks: [extraTask]); + final taskId = await canvas.send( + task('client.canvas.extra'), + ); + final result = await client.waitForTask( + taskId, + timeout: const Duration(seconds: 2), + ); + + expect(result?.value, 'extra-ok'); + + await worker.shutdown(); + await client.close(); + }); + + test('StemClient createApp infers queues from explicit tasks', () async { + final client = await StemClient.inMemory(); + final app = await client.createApp( + tasks: [ + FunctionTaskHandler( + name: 'client.explicit.queue', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ), + ], + ); + + expect(app.worker.subscription.queues, ['priority']); + + final taskId = await app.enqueue( + 'client.explicit.queue', + enqueueOptions: const TaskEnqueueOptions(queue: 'priority'), + ); + final result = await app.waitForTask( + taskId, + timeout: const Duration(seconds: 2), + ); + + expect(result?.value, 'task-ok'); + + await app.close(); + await client.close(); + }); + + test( + 'StemClient createApp registers module tasks and infers queues', + () async { + final client = await StemClient.inMemory(); + final moduleTask = FunctionTaskHandler( + name: 'client.module.app-task', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ); + + final app = await client.createApp( + module: StemModule(tasks: [moduleTask]), + ); + + expect(app.registry.resolve('client.module.app-task'), same(moduleTask)); + expect(app.worker.subscription.queues, ['priority']); + + final taskId = await app.enqueue( + 'client.module.app-task', + enqueueOptions: const TaskEnqueueOptions(queue: 'priority'), + ); + final result = await app.waitForTask( + taskId, + timeout: const Duration(seconds: 2), + ); + + expect(result?.value, 'task-ok'); + + await app.close(); + await client.close(); + }, + ); + + test( + 'StemClient 
createWorkflowApp infers module task queue subscriptions', + () async { + final client = await StemClient.inMemory(); + final moduleTask = FunctionTaskHandler( + name: 'client.module.queued-task', + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ); + final taskDefinition = TaskDefinition.noArgs( + name: 'client.module.queued-task', + ); + final app = await client.createWorkflowApp( + module: StemModule(tasks: [moduleTask]), + ); + + expect( + app.app.worker.subscription.queues, + unorderedEquals(['workflow', 'default']), + ); + + await app.start(); + final result = await taskDefinition.enqueueAndWait( + app, + timeout: const Duration(seconds: 2), + ); + + expect(result?.value, 'task-ok'); + + await app.close(); + await client.close(); + }, + ); + + test( + 'StemClient remembers its default module for createWorkflowApp', + () async { + final moduleTask = FunctionTaskHandler( + name: 'client.default-module.workflow-task', + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ); + final moduleFlow = Flow( + name: 'client.default-module.workflow', + build: (builder) { + builder.step('hello', (ctx) async => 'module-ok'); + }, + ); + final client = await StemClient.inMemory( + module: StemModule(flows: [moduleFlow], tasks: [moduleTask]), + ); + + final app = await client.createWorkflowApp(); + await app.start(); + + expect( + app.app.registry.resolve('client.default-module.workflow-task'), + same(moduleTask), + ); + expect( + app.app.worker.subscription.queues, + unorderedEquals(['workflow', 'default']), + ); + + final runId = await app.startWorkflow('client.default-module.workflow'); + final result = await app.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + + expect(result?.value, 'module-ok'); + + await app.close(); + await client.close(); + }, + ); + test('StemClient workflow app supports typed workflow refs', () async { final client = await StemClient.inMemory(); final flow = Flow( @@ -79,8 
+368,9 @@ void main() { final app = await client.createWorkflowApp(flows: [flow]); await app.start(); - final runId = await app.startWorkflowCall( - workflowRef.call(const {'name': 'ref'}), + final runId = await workflowRef.start( + app, + params: const {'name': 'ref'}, ); final result = await app.waitForWorkflowRef( runId, @@ -94,7 +384,48 @@ void main() { await client.close(); }); - test('StemClient workflow app supports startAndWaitWithApp', () async { + test('StemClient.inMemory merges plural default modules', () async { + final flow = Flow( + name: 'client.modules.workflow', + build: (builder) { + builder.step('hello', (ctx) async => 'module-ok'); + }, + ); + final task = FunctionTaskHandler( + name: 'client.modules.task', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ); + final client = await StemClient.inMemory( + modules: [ + StemModule(flows: [flow]), + StemModule(tasks: [task]), + ], + ); + + final app = await client.createWorkflowApp(); + await app.start(); + + expect(app.app.registry.resolve(task.name), same(task)); + expect( + app.app.worker.subscription.queues, + unorderedEquals(['workflow', 'priority']), + ); + + final runId = await app.startWorkflow(flow.definition.name); + final result = await app.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + + expect(result?.value, 'module-ok'); + + await app.close(); + await client.close(); + }); + + test('StemClient workflow app supports startAndWaitWith', () async { final client = await StemClient.inMemory(); final flow = Flow( name: 'client.workflow.start-and-wait', @@ -113,9 +444,11 @@ void main() { final app = await client.createWorkflowApp(flows: [flow]); await app.start(); - final result = await workflowRef.call( - const {'name': 'one-shot'}, - ).startAndWaitWithApp(app, timeout: const Duration(seconds: 2)); + final result = await workflowRef.startAndWait( + app, + params: const {'name': 'one-shot'}, + 
timeout: const Duration(seconds: 2), + ); expect(result?.value, 'ok:one-shot'); @@ -128,6 +461,7 @@ void main() { name: 'client.from-url', entrypoint: (context, args) async => 'ok', ); + final definition = TaskDefinition.noArgs(name: 'client.from-url'); final client = await StemClient.fromUrl( 'test://localhost', adapters: [ @@ -146,9 +480,85 @@ void main() { final worker = await client.createWorker(); await worker.start(); try { - final taskId = await client.stem.enqueue('client.from-url'); - final result = await client.stem.waitForTask( - taskId, + final result = await definition.enqueueAndWait( + client, + timeout: const Duration(seconds: 2), + ); + expect(result?.value, 'ok'); + } finally { + await worker.shutdown(); + await client.close(); + } + }); + + test('StemClient fromUrl reuses a pre-resolved stack', () async { + final handler = FunctionTaskHandler( + name: 'client.from-url.stack', + entrypoint: (context, args) async => 'ok', + ); + final definition = TaskDefinition.noArgs( + name: 'client.from-url.stack', + ); + final stack = StemStack.fromUrl( + 'test://localhost', + adapters: [ + TestStoreAdapter( + scheme: 'test', + adapterName: 'client-test-adapter', + broker: StemBrokerFactory(create: () async => InMemoryBroker()), + backend: StemBackendFactory( + create: () async => InMemoryResultBackend(), + ), + ), + ], + ); + final client = await StemClient.fromStack( + stack, + tasks: [handler], + ); + + final worker = await client.createWorker(); + await worker.start(); + try { + final result = await definition.enqueueAndWait( + client, + timeout: const Duration(seconds: 2), + ); + expect(result?.value, 'ok'); + } finally { + await worker.shutdown(); + await client.close(); + } + }); + + test('StemClient fromUrl delegates to the same stack-backed path', () async { + final handler = FunctionTaskHandler( + name: 'client.from-url.delegates', + entrypoint: (context, args) async => 'ok', + ); + final definition = TaskDefinition.noArgs( + name: 
'client.from-url.delegates', + ); + final client = await StemClient.fromUrl( + 'test://localhost', + adapters: [ + TestStoreAdapter( + scheme: 'test', + adapterName: 'client-test-adapter', + broker: StemBrokerFactory(create: () async => InMemoryBroker()), + backend: StemBackendFactory( + create: () async => InMemoryResultBackend(), + ), + ), + ], + tasks: [handler], + ); + + final worker = await client.createWorker(); + await worker.start(); + try { + final result = await definition.enqueueAndWait( + client, timeout: const Duration(seconds: 2), ); expect(result?.value, 'ok'); @@ -157,4 +567,86 @@ void main() { await client.close(); } }); + + test('StemClient createWorker infers queues from explicit tasks', () async { + final client = await StemClient.inMemory(); + final worker = await client.createWorker( + tasks: [ + FunctionTaskHandler( + name: 'client.worker.explicit.queue', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ), + ], + ); + + expect(worker.subscription.queues, ['priority']); + + await worker.start(); + try { + final taskId = await client.enqueue( + 'client.worker.explicit.queue', + enqueueOptions: const TaskEnqueueOptions(queue: 'priority'), + ); + final result = await client.waitForTask( + taskId, + timeout: const Duration(seconds: 2), + ); + expect(result?.value, 'task-ok'); + } finally { + await worker.shutdown(); + await client.close(); + } + }); + + test('StemClient createWorker infers queues from default module', () async { + final client = await StemClient.inMemory( + module: StemModule( + tasks: [ + FunctionTaskHandler( + name: 'client.worker.default-module.queue', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ), + ], + ), + ); + final worker = await client.createWorker(); + + expect(worker.subscription.queues, ['priority']); + + await worker.start(); + try { + final taskId = await 
client.enqueue( + 'client.worker.default-module.queue', + enqueueOptions: const TaskEnqueueOptions(queue: 'priority'), + ); + final result = await client.waitForTask( + taskId, + timeout: const Duration(seconds: 2), + ); + expect(result?.value, 'task-ok'); + } finally { + await worker.shutdown(); + await client.close(); + } + }); +} + +Future _waitForClientGroupStatus( + Future Function() lookup, { + Duration timeout = const Duration(seconds: 2), + Duration pollInterval = const Duration(milliseconds: 25), +}) async { + final deadline = DateTime.now().add(timeout); + while (DateTime.now().isBefore(deadline)) { + final status = await lookup(); + if (status != null && status.completed == status.expected) { + return status; + } + await Future.delayed(pollInterval); + } + return lookup(); } diff --git a/packages/stem/test/bootstrap/stem_stack_test.dart b/packages/stem/test/bootstrap/stem_stack_test.dart index 54fd37a6..a94c6400 100644 --- a/packages/stem/test/bootstrap/stem_stack_test.dart +++ b/packages/stem/test/bootstrap/stem_stack_test.dart @@ -45,6 +45,30 @@ void main() { expect(workflowStore, isA()); }); + test('can create a client from a resolved stack', () async { + final stack = StemStack.fromUrl('memory://'); + final handler = FunctionTaskHandler( + name: 'stack.client.task', + entrypoint: (context, args) async => 'ok', + ); + final definition = TaskDefinition.noArgs( + name: 'stack.client.task', + ); + final client = await stack.createClient(tasks: [handler]); + final worker = await client.createWorker(); + await worker.start(); + try { + final result = await definition.enqueueAndWait( + client, + timeout: const Duration(seconds: 2), + ); + expect(result?.value, 'ok'); + } finally { + await worker.shutdown(); + await client.close(); + } + }); + test('honors overrides for specific stores', () { final fooBroker = StemBrokerFactory( create: () async => InMemoryBroker(), diff --git a/packages/stem/test/bootstrap/workflow_module_bootstrap_test.dart 
b/packages/stem/test/bootstrap/workflow_module_bootstrap_test.dart new file mode 100644 index 00000000..e4ffdc2d --- /dev/null +++ b/packages/stem/test/bootstrap/workflow_module_bootstrap_test.dart @@ -0,0 +1,310 @@ +import 'package:stem/stem.dart'; +import 'package:test/test.dart'; + +void main() { + group('workflow module bootstrap', () { + test('StemWorkflowApp.inMemory merges plural modules', () async { + final flow = Flow( + name: 'workflow.module.modules.flow', + build: (builder) { + builder.step('hello', (ctx) async => 'flow-ok'); + }, + ); + final task = FunctionTaskHandler( + name: 'workflow.module.modules.task', + entrypoint: (context, args) async => 'task-ok', + runInIsolate: false, + ); + final workflowApp = await StemWorkflowApp.inMemory( + modules: [ + StemModule(flows: [flow]), + StemModule(tasks: [task]), + ], + ); + try { + expect( + workflowApp.app.worker.subscription.queues, + unorderedEquals(['workflow', 'default']), + ); + + await workflowApp.start(); + final runId = await workflowApp.startWorkflow(flow.definition.name); + final result = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + expect(result?.value, 'flow-ok'); + } finally { + await workflowApp.shutdown(); + } + }); + + test('StemWorkflowApp.inMemory infers workflow and task queues', () async { + final helperTask = FunctionTaskHandler( + name: 'workflow.module.queue-helper', + entrypoint: (context, args) async => 'queued-ok', + runInIsolate: false, + ); + final helperDefinition = TaskDefinition.noArgs( + name: 'workflow.module.queue-helper', + ); + final workflowApp = await StemWorkflowApp.inMemory( + module: StemModule(tasks: [helperTask]), + ); + try { + expect( + workflowApp.app.worker.subscription.queues, + unorderedEquals(['workflow', 'default']), + ); + + await workflowApp.start(); + final result = await helperDefinition.enqueueAndWait( + workflowApp, + timeout: const Duration(seconds: 2), + ); + expect(result?.value, 'queued-ok'); + } 
finally { + await workflowApp.shutdown(); + } + }); + + test( + 'StemWorkflowApp.inMemory infers continuation and execution queues', + () async { + final helperTask = FunctionTaskHandler( + name: 'workflow.module.custom-queues-helper', + entrypoint: (context, args) async => 'queued-ok', + runInIsolate: false, + ); + final workflowApp = await StemWorkflowApp.inMemory( + module: StemModule(tasks: [helperTask]), + continuationQueue: 'workflow-continue', + executionQueue: 'workflow-step', + ); + try { + expect( + workflowApp.app.worker.subscription.queues, + unorderedEquals([ + 'workflow', + 'workflow-continue', + 'workflow-step', + 'default', + ]), + ); + } finally { + await workflowApp.shutdown(); + } + }, + ); + + test( + 'StemClient.createWorkflowApp forwards continuation and execution ' + 'queues', + () async { + final helperTask = FunctionTaskHandler( + name: 'workflow.module.client-custom-queues-helper', + entrypoint: (context, args) async => 'queued-ok', + runInIsolate: false, + ); + final client = await StemClient.inMemory( + module: StemModule(tasks: [helperTask]), + ); + + final workflowApp = await client.createWorkflowApp( + continuationQueue: 'workflow-continue', + executionQueue: 'workflow-step', + ); + try { + expect( + workflowApp.app.worker.subscription.queues, + unorderedEquals([ + 'workflow', + 'workflow-continue', + 'workflow-step', + 'default', + ]), + ); + } finally { + await workflowApp.shutdown(); + await client.close(); + } + }, + ); + + test( + 'StemWorkflowApp.fromClient falls back to client module for ' + 'subscription inference', + () async { + final queuedTask = FunctionTaskHandler( + name: 'workflow.module.from-client-task', + options: const TaskOptions(queue: 'priority'), + entrypoint: (context, args) async => 'queued-ok', + runInIsolate: false, + ); + final client = await StemClient.inMemory( + module: StemModule(tasks: [queuedTask]), + ); + + final workflowApp = await StemWorkflowApp.fromClient(client: client); + try { + expect( + 
workflowApp.app.worker.subscription.queues, + unorderedEquals(['workflow', 'priority']), + ); + } finally { + await workflowApp.shutdown(); + await client.close(); + } + }, + ); + + test( + 'explicit workflow subscription overrides inferred module queues', + () async { + final helperTask = FunctionTaskHandler( + name: 'workflow.module.explicit-subscription', + entrypoint: (context, args) async => 'ignored', + runInIsolate: false, + ); + final workflowApp = await StemWorkflowApp.inMemory( + module: StemModule(tasks: [helperTask]), + workerConfig: StemWorkerConfig( + queue: 'workflow', + subscription: RoutingSubscription.singleQueue('workflow'), + ), + ); + try { + expect(workflowApp.app.worker.subscription.queues, ['workflow']); + } finally { + await workflowApp.shutdown(); + } + }, + ); + + test( + 'StemApp.createWorkflowApp rejects missing continuation/execution ' + 'coverage', + () async { + final helperTask = FunctionTaskHandler( + name: 'workflow.module.missing-custom-queues-helper', + entrypoint: (context, args) async => 'queued-ok', + runInIsolate: false, + ); + final stemApp = await StemApp.inMemory( + module: StemModule(tasks: [helperTask]), + workerConfig: StemWorkerConfig( + queue: 'workflow', + subscription: RoutingSubscription( + queues: ['workflow', 'default'], + ), + ), + ); + + try { + await expectLater( + () => stemApp.createWorkflowApp( + continuationQueue: 'workflow-continue', + executionQueue: 'workflow-step', + ), + throwsA( + isA().having( + (error) => error.message, + 'message', + allOf( + contains('workflow-continue'), + contains('workflow-step'), + ), + ), + ), + ); + } finally { + await stemApp.close(); + } + }, + ); + + test('registerModules registers flows and tasks together', () async { + final flow = Flow( + name: 'workflow.module.register-modules.flow', + build: (builder) { + builder.step('hello', (ctx) async => 'flow-ok'); + }, + ); + final task = FunctionTaskHandler( + name: 'workflow.module.register-modules.task', + entrypoint: 
(context, args) async => 'task-ok', + runInIsolate: false, + ); + final taskDefinition = TaskDefinition.noArgs(name: task.name); + final workflowApp = await StemWorkflowApp.inMemory( + workerConfig: StemWorkerConfig( + queue: 'workflow', + subscription: RoutingSubscription( + queues: ['workflow', 'default'], + ), + ), + ); + try { + workflowApp.registerModules([ + StemModule(flows: [flow]), + StemModule(tasks: [task]), + ]); + + await workflowApp.start(); + final runId = await workflowApp.startWorkflow(flow.definition.name); + final workflowResult = await workflowApp.waitForCompletion( + runId, + timeout: const Duration(seconds: 2), + ); + final taskResult = await taskDefinition.enqueueAndWait( + workflowApp, + timeout: const Duration(seconds: 2), + ); + + expect(workflowResult?.value, 'flow-ok'); + expect(taskResult?.value, 'task-ok'); + } finally { + await workflowApp.shutdown(); + } + }); + + test('shutdown preserves a borrowed StemApp', () async { + final hostTask = FunctionTaskHandler( + name: 'workflow.module.host-task', + entrypoint: (context, args) async => 'host-ok', + runInIsolate: false, + ); + final flow = Flow( + name: 'workflow.module.borrowed-app', + build: (builder) { + builder.step('hello', (ctx) async => 'flow-ok'); + }, + ); + final hostApp = await StemApp.inMemory( + tasks: [hostTask], + workerConfig: StemWorkerConfig( + subscription: RoutingSubscription( + queues: ['default', 'workflow'], + ), + ), + ); + final hostTaskDef = TaskDefinition.noArgs(name: hostTask.name); + + await hostApp.start(); + final workflowApp = await hostApp.createWorkflowApp(flows: [flow]); + try { + await workflowApp.shutdown(); + + expect(hostApp.isStarted, isTrue); + + final result = await hostTaskDef.enqueueAndWait( + hostApp, + timeout: const Duration(seconds: 2), + ); + expect(result?.value, 'host-ok'); + } finally { + await hostApp.shutdown(); + } + }); + }); +} diff --git a/packages/stem/test/unit/canvas/canvas_test.dart 
b/packages/stem/test/unit/canvas/canvas_test.dart index 09b904da..42023d26 100644 --- a/packages/stem/test/unit/canvas/canvas_test.dart +++ b/packages/stem/test/unit/canvas/canvas_test.dart @@ -56,6 +56,27 @@ void main() { await dispatch.dispose(); }); + test('group auto-initializes an explicit groupId when missing', () async { + const groupId = 'group-explicit-id'; + final dispatch = await canvas.group([ + task('echo', args: {'value': 4}), + task('echo', args: {'value': 6}), + ], groupId: groupId); + + final received = await dispatch.results + .map((result) => result.value) + .toList(); + final typed = received.whereType().toList(); + expect(typed, containsAll([4, 6])); + + final group = await backend.getGroup(groupId); + expect(group, isNotNull); + expect(group!.expected, equals(2)); + expect(group.results.length, equals(2)); + + await dispatch.dispose(); + }); + test('chain returns typed payload', () async { final result = await canvas.chain([ task('echo', args: {'value': 1}), diff --git a/packages/stem/test/unit/control/control_messages_test.dart b/packages/stem/test/unit/control/control_messages_test.dart new file mode 100644 index 00000000..6423643b --- /dev/null +++ b/packages/stem/test/unit/control/control_messages_test.dart @@ -0,0 +1,136 @@ +import 'package:stem/src/control/control_messages.dart'; +import 'package:stem/src/core/payload_codec.dart'; +import 'package:test/test.dart'; + +void main() { + test('ControlCommandMessage exposes typed payload helpers', () { + final command = ControlCommandMessage( + requestId: 'req-1', + type: 'pause', + targets: const ['*'], + payload: const { + PayloadCodec.versionKey: 2, + 'queue': 'priority', + 'paused': true, + }, + ); + + expect(command.payloadValue('queue'), 'priority'); + expect(command.payloadValueOr('missing', 'fallback'), 'fallback'); + expect(command.requiredPayloadValue('paused'), isTrue); + expect( + command.payloadJson<_ControlPayload>(decode: _ControlPayload.fromJson), + isA<_ControlPayload>() + 
.having((value) => value.queue, 'queue', 'priority') + .having((value) => value.paused, 'paused', isTrue), + ); + expect( + command.payloadVersionedJson<_ControlPayload>( + version: 2, + decode: _ControlPayload.fromVersionedJson, + ), + isA<_ControlPayload>() + .having((value) => value.queue, 'queue', 'priority') + .having((value) => value.paused, 'paused', isTrue), + ); + }); + + test('ControlReplyMessage exposes typed payload and error helpers', () { + final reply = ControlReplyMessage( + requestId: 'req-2', + workerId: 'worker-1', + status: 'error', + payload: const { + PayloadCodec.versionKey: 2, + 'queue': 'priority', + 'paused': true, + }, + error: const { + PayloadCodec.versionKey: 2, + 'code': 'pause_failed', + 'message': 'already paused', + }, + ); + + expect(reply.payloadValue('queue'), 'priority'); + expect(reply.payloadValueOr('missing', 'fallback'), 'fallback'); + expect(reply.requiredPayloadValue('paused'), isTrue); + expect( + reply.payloadJson<_ControlPayload>(decode: _ControlPayload.fromJson), + isA<_ControlPayload>() + .having((value) => value.queue, 'queue', 'priority') + .having((value) => value.paused, 'paused', isTrue), + ); + expect( + reply.payloadVersionedJson<_ControlPayload>( + version: 2, + decode: _ControlPayload.fromVersionedJson, + ), + isA<_ControlPayload>() + .having((value) => value.queue, 'queue', 'priority') + .having((value) => value.paused, 'paused', isTrue), + ); + expect(reply.errorValue('code'), 'pause_failed'); + expect(reply.errorValueOr('missing', 'fallback'), 'fallback'); + expect(reply.requiredErrorValue('message'), 'already paused'); + expect( + reply.errorJson<_ControlError>(decode: _ControlError.fromJson), + isA<_ControlError>() + .having((value) => value.code, 'code', 'pause_failed') + .having((value) => value.message, 'message', 'already paused'), + ); + expect( + reply.errorVersionedJson<_ControlError>( + version: 2, + decode: _ControlError.fromVersionedJson, + ), + isA<_ControlError>() + .having((value) => 
value.code, 'code', 'pause_failed') + .having((value) => value.message, 'message', 'already paused'), + ); + }); +} + +class _ControlPayload { + const _ControlPayload({required this.queue, required this.paused}); + + factory _ControlPayload.fromJson(Map json) { + return _ControlPayload( + queue: json['queue'] as String, + paused: json['paused'] as bool, + ); + } + + factory _ControlPayload.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _ControlPayload.fromJson(json); + } + + final String queue; + final bool paused; +} + +class _ControlError { + const _ControlError({required this.code, required this.message}); + + factory _ControlError.fromJson(Map json) { + return _ControlError( + code: json['code'] as String, + message: json['message'] as String, + ); + } + + factory _ControlError.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _ControlError.fromJson(json); + } + + final String code; + final String message; +} diff --git a/packages/stem/test/unit/core/contracts_test.dart b/packages/stem/test/unit/core/contracts_test.dart index 3e6c6a3a..402c3c8b 100644 --- a/packages/stem/test/unit/core/contracts_test.dart +++ b/packages/stem/test/unit/core/contracts_test.dart @@ -1,5 +1,6 @@ import 'package:stem/src/core/contracts.dart'; import 'package:stem/src/core/envelope.dart'; +import 'package:stem/src/core/payload_codec.dart'; import 'package:stem/src/scheduler/schedule_spec.dart'; import 'package:test/test.dart'; @@ -89,6 +90,7 @@ void main() { state: TaskState.failed, attempt: 1, meta: const { + PayloadCodec.versionKey: 2, 'task': 'email.send', 'queue': 'critical', 'namespace': 'acme', @@ -121,6 +123,22 @@ void main() { }, ); + expect( + status.metaJson<_TaskStatusMeta>(decode: _TaskStatusMeta.fromJson), + isA<_TaskStatusMeta>() + .having((value) => value.task, 'task', 'email.send') + .having((value) => value.queue, 'queue', 'critical'), + ); + expect( + status.metaVersionedJson<_TaskStatusMeta>( + 
version: 2, + decode: _TaskStatusMeta.fromVersionedJson, + ), + isA<_TaskStatusMeta>() + .having((value) => value.task, 'task', 'email.send') + .having((value) => value.queue, 'queue', 'critical'), + ); + expect(status.taskName, equals('email.send')); expect(status.queueName, equals('critical')); expect(status.namespace, equals('acme')); @@ -151,6 +169,194 @@ void main() { expect(status.workflowSerializationVersion, equals('1')); expect(status.workflowStreamId, equals('invoice_run-123')); }); + + test('payload helpers decode stored values', () { + final status = TaskStatus( + id: 'task-4', + state: TaskState.succeeded, + attempt: 0, + payload: const {'id': 'receipt-1'}, + ); + final codec = PayloadCodec>.map( + encode: (value) => value, + decode: (json) => json, + typeName: 'ReceiptMap', + ); + + expect( + status.payloadValue>(), + equals(const {'id': 'receipt-1'}), + ); + expect( + status.payloadValue>(codec: codec), + equals(const {'id': 'receipt-1'}), + ); + expect( + status.payloadValueOr>( + const {'id': 'fallback'}, + codec: codec, + ), + equals(const {'id': 'receipt-1'}), + ); + expect( + status.requiredPayloadValue>(codec: codec), + equals(const {'id': 'receipt-1'}), + ); + expect( + status.payloadAs>(codec: codec), + equals(const {'id': 'receipt-1'}), + ); + expect( + status.payloadJson<_ReceiptPayload>( + decode: _ReceiptPayload.fromJson, + ), + isA<_ReceiptPayload>().having((value) => value.id, 'id', 'receipt-1'), + ); + expect( + status.payloadVersionedJson<_ReceiptPayload>( + version: 2, + decode: _ReceiptPayload.fromVersionedJson, + ), + isA<_ReceiptPayload>().having((value) => value.id, 'id', 'receipt-1'), + ); + }); + + test('error metadata helpers decode structured values', () { + const error = TaskError( + type: 'Boom', + message: 'fail', + meta: { + PayloadCodec.versionKey: 2, + 'queue': 'default', + }, + ); + + expect( + error.metaJson<_ErrorMeta>(decode: _ErrorMeta.fromJson), + isA<_ErrorMeta>().having((value) => value.queue, 'queue', 
'default'), + ); + expect( + error.metaVersionedJson<_ErrorMeta>( + version: 2, + decode: _ErrorMeta.fromVersionedJson, + ), + isA<_ErrorMeta>().having((value) => value.queue, 'queue', 'default'), + ); + }); + + test('requiredPayloadValue throws when payload is absent', () { + final status = TaskStatus( + id: 'task-5', + state: TaskState.failed, + attempt: 1, + ); + + expect( + status.requiredPayloadValue>, + throwsA( + isA().having( + (error) => error.message, + 'message', + contains('task-5'), + ), + ), + ); + expect( + status.payloadValueOr>( + const {'id': 'fallback'}, + ), + equals(const {'id': 'fallback'}), + ); + expect( + status.payloadAs>( + codec: PayloadCodec>.map( + encode: (value) => value, + decode: (json) => json, + typeName: 'ReceiptMap', + ), + ), + isNull, + ); + expect( + status.payloadJson<_ReceiptPayload>( + decode: _ReceiptPayload.fromJson, + ), + isNull, + ); + }); + }); + + group('GroupStatus', () { + test('exposes typed child-result decode helpers', () { + final codec = PayloadCodec>.map( + encode: (value) => value, + decode: (json) => json, + typeName: 'ReceiptMap', + ); + final scalarStatus = GroupStatus( + id: 'grp-1', + expected: 2, + results: { + 'task-1': TaskStatus( + id: 'task-1', + state: TaskState.succeeded, + attempt: 0, + payload: 7, + ), + 'task-2': TaskStatus( + id: 'task-2', + state: TaskState.succeeded, + attempt: 0, + payload: 9, + ), + }, + ); + final dtoStatus = GroupStatus( + id: 'grp-2', + expected: 1, + results: { + 'task-1': TaskStatus( + id: 'task-1', + state: TaskState.succeeded, + attempt: 0, + payload: const {'id': 'receipt-1'}, + ), + }, + ); + + expect( + scalarStatus.resultValues(), + equals({ + 'task-1': 7, + 'task-2': 9, + }), + ); + expect( + dtoStatus.resultAs>(codec: codec), + equals({ + 'task-1': const {'id': 'receipt-1'}, + }), + ); + expect( + dtoStatus.resultJson<_GroupReceipt>( + decode: _GroupReceipt.fromJson, + ), + { + 'task-1': isA<_GroupReceipt>() + .having((value) => value.id, 'id', 
'receipt-1'), + }, + ); + expect( + dtoStatus.resultVersionedJson<_GroupReceipt>( + version: 2, + decode: _GroupReceipt.fromVersionedJson, + ), + { + 'task-1': isA<_GroupReceipt>() + .having((value) => value.id, 'id', 'receipt-1'), + }, + ); + }); }); group('DeadLetterEntry', () { @@ -169,6 +375,29 @@ void main() { expect(decoded.meta['trace'], equals('abc')); expect(decoded.deadAt, equals(DateTime.utc(2025))); }); + + test('exposes typed metadata helpers', () { + final entry = DeadLetterEntry( + envelope: Envelope(name: 'task', args: const {}), + deadAt: DateTime.utc(2025), + meta: const { + PayloadCodec.versionKey: 2, + 'trace': 'abc', + }, + ); + + expect( + entry.metaJson<_TraceMeta>(decode: _TraceMeta.fromJson), + isA<_TraceMeta>().having((value) => value.trace, 'trace', 'abc'), + ); + expect( + entry.metaVersionedJson<_TraceMeta>( + version: 2, + decode: _TraceMeta.fromVersionedJson, + ), + isA<_TraceMeta>().having((value) => value.trace, 'trace', 'abc'), + ); + }); }); group('DeadLetterPage/ReplayResult', () { @@ -300,6 +529,71 @@ void main() { expect(updated.lastError, isNull); expect(updated.enabled, isFalse); }); + + test('exposes typed args, kwargs, and metadata helpers', () { + final entry = ScheduleEntry( + id: 'schedule-typed', + taskName: 'task', + queue: 'default', + spec: IntervalScheduleSpec(every: const Duration(minutes: 1)), + args: const { + PayloadCodec.versionKey: 2, + 'value': 1, + }, + kwargs: const { + PayloadCodec.versionKey: 2, + 'label': 'nightly', + }, + meta: const { + PayloadCodec.versionKey: 2, + 'source': 'scheduler', + }, + ); + + expect( + entry.argsJson<_ScheduleArgs>(decode: _ScheduleArgs.fromJson), + isA<_ScheduleArgs>().having((value) => value.value, 'value', 1), + ); + expect( + entry.argsVersionedJson<_ScheduleArgs>( + version: 2, + decode: _ScheduleArgs.fromVersionedJson, + ), + isA<_ScheduleArgs>().having((value) => value.value, 'value', 1), + ); + expect( + entry.kwargsJson<_ScheduleKwargs>(decode: 
_ScheduleKwargs.fromJson), + isA<_ScheduleKwargs>().having( + (value) => value.label, + 'label', + 'nightly', + ), + ); + expect( + entry.kwargsVersionedJson<_ScheduleKwargs>( + version: 2, + decode: _ScheduleKwargs.fromVersionedJson, + ), + isA<_ScheduleKwargs>().having( + (value) => value.label, + 'label', + 'nightly', + ), + ); + expect( + entry.metaJson<_ScheduleMeta>(decode: _ScheduleMeta.fromJson), + isA<_ScheduleMeta>() + .having((value) => value.source, 'source', 'scheduler'), + ); + expect( + entry.metaVersionedJson<_ScheduleMeta>( + version: 2, + decode: _ScheduleMeta.fromVersionedJson, + ), + isA<_ScheduleMeta>() + .having((value) => value.source, 'source', 'scheduler'), + ); + }); }); test('ScheduleConflictException string includes metadata', () { @@ -314,3 +608,151 @@ void main() { expect(error.toString(), contains('actual: 2')); }); } + +class _GroupReceipt { + const _GroupReceipt({required this.id}); + + factory _GroupReceipt.fromJson(Map json) { + return _GroupReceipt(id: json['id'] as String); + } + + factory _GroupReceipt.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _GroupReceipt(id: json['id'] as String); + } + + final String id; +} + +class _ReceiptPayload { + const _ReceiptPayload({required this.id}); + + factory _ReceiptPayload.fromJson(Map json) { + return _ReceiptPayload(id: json['id'] as String); + } + + factory _ReceiptPayload.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _ReceiptPayload(id: json['id'] as String); + } + + final String id; +} + +class _TraceMeta { + const _TraceMeta({required this.trace}); + + factory _TraceMeta.fromJson(Map json) { + return _TraceMeta(trace: json['trace'] as String); + } + + factory _TraceMeta.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _TraceMeta.fromJson(json); + } + + final String trace; +} + +class _ScheduleArgs { + const _ScheduleArgs({required this.value}); + + factory 
_ScheduleArgs.fromJson(Map json) { + return _ScheduleArgs(value: json['value'] as int); + } + + factory _ScheduleArgs.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _ScheduleArgs.fromJson(json); + } + + final int value; +} + +class _ScheduleKwargs { + const _ScheduleKwargs({required this.label}); + + factory _ScheduleKwargs.fromJson(Map json) { + return _ScheduleKwargs(label: json['label'] as String); + } + + factory _ScheduleKwargs.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _ScheduleKwargs.fromJson(json); + } + + final String label; +} + +class _ScheduleMeta { + const _ScheduleMeta({required this.source}); + + factory _ScheduleMeta.fromJson(Map json) { + return _ScheduleMeta(source: json['source'] as String); + } + + factory _ScheduleMeta.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _ScheduleMeta.fromJson(json); + } + + final String source; +} + +class _ErrorMeta { + const _ErrorMeta({required this.queue}); + + factory _ErrorMeta.fromJson(Map json) { + return _ErrorMeta(queue: json['queue'] as String); + } + + factory _ErrorMeta.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _ErrorMeta.fromJson(json); + } + + final String queue; +} + +class _TaskStatusMeta { + const _TaskStatusMeta({required this.task, required this.queue}); + + factory _TaskStatusMeta.fromJson(Map json) { + return _TaskStatusMeta( + task: json['task'] as String, + queue: json['queue'] as String, + ); + } + + factory _TaskStatusMeta.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _TaskStatusMeta.fromJson(json); + } + + final String task; + final String queue; +} diff --git a/packages/stem/test/unit/core/fake_stem_test.dart b/packages/stem/test/unit/core/fake_stem_test.dart index c57a5cb8..d170b467 100644 --- a/packages/stem/test/unit/core/fake_stem_test.dart +++ b/packages/stem/test/unit/core/fake_stem_test.dart @@ 
-15,7 +15,7 @@ void main() { encodeArgs: (args) => {'value': args.value}, ); - final call = definition(const _Args(42)); + final call = definition.buildCall(const _Args(42)); final id = await fake.enqueueCall(call); expect(id, isNotEmpty); diff --git a/packages/stem/test/unit/core/function_task_handler_test.dart b/packages/stem/test/unit/core/function_task_handler_test.dart index 2ddeeb79..f62db6ef 100644 --- a/packages/stem/test/unit/core/function_task_handler_test.dart +++ b/packages/stem/test/unit/core/function_task_handler_test.dart @@ -9,6 +9,7 @@ void main() { Duration? extended; double? progressValue; Map? progressData; + String? argValue; final handler = FunctionTaskHandler( name: 'math.add', @@ -16,6 +17,7 @@ void main() { invocation.heartbeat(); await invocation.extendLease(const Duration(seconds: 3)); await invocation.progress(0.5, data: {'stage': 'halfway'}); + argValue = invocation.requiredArg('name'); final a = args['a']! as int; final b = args['b']! as int; return a + b; @@ -39,10 +41,11 @@ void main() { progressData = data; }, ), - const {'a': 2, 'b': 3}, + const {'a': 2, 'b': 3, 'name': 'stem'}, ); expect(result, equals(5)); + expect(argValue, equals('stem')); expect(handler.isolateEntrypoint, isNotNull); expect(heartbeats, equals(1)); expect(extended, equals(const Duration(seconds: 3))); diff --git a/packages/stem/test/unit/core/payload_codec_test.dart b/packages/stem/test/unit/core/payload_codec_test.dart new file mode 100644 index 00000000..b26fcacb --- /dev/null +++ b/packages/stem/test/unit/core/payload_codec_test.dart @@ -0,0 +1,460 @@ +import 'package:stem/stem.dart'; +import 'package:test/test.dart'; + +void main() { + group('PayloadCodec.json', () { + test('encodes and decodes DTOs via toJson/fromJson', () { + const codec = PayloadCodec<_CodecPayload>.json( + decode: _CodecPayload.fromJson, + typeName: '_CodecPayload', + ); + + final payload = codec.encode( + const _CodecPayload(id: 'payload-0', count: 1), + ); + final decoded = 
codec.decode(payload); + + expect(payload, { + 'id': 'payload-0', + 'count': 1, + }); + expect(decoded.id, 'payload-0'); + expect(decoded.count, 1); + }); + + test('accepts DTO decoders that use Map', () { + const codec = PayloadCodec<_DynamicCodecPayload>.json( + decode: _DynamicCodecPayload.fromJson, + typeName: '_DynamicCodecPayload', + ); + + final payload = codec.encode( + const _DynamicCodecPayload(id: 'payload-dyn', count: 9), + ); + final decoded = codec.decode(payload); + + expect(payload, { + 'id': 'payload-dyn', + 'count': 9, + }); + expect(decoded.id, 'payload-dyn'); + expect(decoded.count, 9); + }); + + test('rejects values without toJson with a clear error', () { + const codec = PayloadCodec<_NoJsonPayload>.json( + decode: _NoJsonPayload.fromJson, + typeName: '_NoJsonPayload', + ); + + expect( + () => codec.encode(const _NoJsonPayload(id: 'missing')), + throwsA( + isA().having( + (error) => error.message, + 'message', + contains('_NoJsonPayload must expose toJson()'), + ), + ), + ); + }); + }); + + group('PayloadCodec.versionedJson', () { + test('encodes DTOs to versioned JSON maps without a codec instance', () { + final payload = PayloadCodec.encodeVersionedJsonMap( + const _VersionedCodecPayload(id: 'payload-v-encode', count: 6), + version: 4, + typeName: '_VersionedCodecPayload', + ); + + expect(payload, { + PayloadCodec.versionKey: 4, + 'id': 'payload-v-encode', + 'count': 6, + }); + }); + + test('encodes DTOs with a persisted schema version', () { + const codec = PayloadCodec<_VersionedCodecPayload>.versionedJson( + version: 2, + decode: _VersionedCodecPayload.fromVersionedJson, + typeName: '_VersionedCodecPayload', + ); + + final payload = codec.encode( + const _VersionedCodecPayload(id: 'payload-v0', count: 4), + ); + + expect(payload, { + PayloadCodec.versionKey: 2, + 'id': 'payload-v0', + 'count': 4, + }); + }); + + test('passes the persisted schema version to the decoder', () { + const codec = 
PayloadCodec<_VersionedCodecPayload>.versionedJson( + version: 2, + decode: _VersionedCodecPayload.fromVersionedJson, + typeName: '_VersionedCodecPayload', + ); + + final decoded = codec.decode({ + PayloadCodec.versionKey: 3, + 'id': 'payload-v1', + 'count': 8, + }); + + expect(decoded.id, 'payload-v1'); + expect(decoded.count, 8); + expect(decoded.decodedVersion, 3); + }); + + test('falls back to the configured default decode version', () { + const codec = PayloadCodec<_VersionedCodecPayload>.versionedJson( + version: 3, + defaultDecodeVersion: 1, + decode: _VersionedCodecPayload.fromVersionedJson, + typeName: '_VersionedCodecPayload', + ); + + final decoded = codec.decode({ + 'id': 'payload-v2', + 'count': 11, + }); + + expect(decoded.id, 'payload-v2'); + expect(decoded.count, 11); + expect(decoded.decodedVersion, 1); + }); + + test('rejects invalid persisted schema versions with a clear error', () { + const codec = PayloadCodec<_VersionedCodecPayload>.versionedJson( + version: 2, + decode: _VersionedCodecPayload.fromVersionedJson, + typeName: '_VersionedCodecPayload', + ); + + expect( + () => codec.decode({ + PayloadCodec.versionKey: true, + 'id': 'payload-v3', + 'count': 13, + }), + throwsA( + isA().having( + (error) => error.message, + 'message', + contains( + '_VersionedCodecPayload payload version must be an ' + 'int-compatible ' + 'value', + ), + ), + ), + ); + }); + }); + + group('PayloadCodec.map', () { + test('decodes typed DTO payloads from durable maps', () { + const codec = PayloadCodec<_CodecPayload>.map( + encode: _encodeCodecPayload, + decode: _CodecPayload.fromJson, + typeName: '_CodecPayload', + ); + + final decoded = codec.decode({ + 'id': 'payload-1', + 'count': 3, + }); + + expect(decoded.id, 'payload-1'); + expect(decoded.count, 3); + }); + + test('normalizes generic map payloads before decoding', () { + const codec = PayloadCodec<_CodecPayload>.map( + encode: _encodeCodecPayload, + decode: _CodecPayload.fromJson, + typeName: '_CodecPayload', 
+      );
+
+      final decoded = codec.decode({
+        'id': 'payload-2',
+        'count': 7,
+      });
+
+      expect(decoded.id, 'payload-2');
+      expect(decoded.count, 7);
+    });
+
+    test('rejects non-map payloads with a clear error', () {
+      const codec = PayloadCodec<_CodecPayload>.map(
+        encode: _encodeCodecPayload,
+        decode: _CodecPayload.fromJson,
+        typeName: '_CodecPayload',
+      );
+
+      expect(
+        () => codec.decode('not-a-map'),
+        throwsA(
+          isA().having(
+            (error) => error.message,
+            'message',
+            contains('_CodecPayload payload must decode to a string-keyed map'),
+          ),
+        ),
+      );
+    });
+
+    test('rejects non-string map keys with a clear error', () {
+      const codec = PayloadCodec<_CodecPayload>.map(
+        encode: _encodeCodecPayload,
+        decode: _CodecPayload.fromJson,
+        typeName: '_CodecPayload',
+      );
+
+      expect(
+        () => codec.decode({1: 'bad'}),
+        throwsA(
+          isA().having(
+            (error) => error.message,
+            'message',
+            contains('_CodecPayload payload must use string keys.'),
+          ),
+        ),
+      );
+    });
+  });
+
+  group('PayloadVersionRegistry', () {
+    const registry = PayloadVersionRegistry<_VersionedCodecPayload>(
+      decoders: <int, _VersionedCodecPayload Function(Map<String, Object?>)>{
+        1: _VersionedCodecPayload.fromV1Json,
+        2: _VersionedCodecPayload.fromV2Json,
+      },
+      defaultVersion: 1,
+    );
+
+    test('decodes versioned JSON payloads through a reusable registry', () {
+      final codec = PayloadCodec<_VersionedCodecPayload>.versionedJsonRegistry(
+        version: 2,
+        registry: registry,
+        typeName: '_VersionedCodecPayload',
+      );
+
+      final decoded = codec.decode({
+        PayloadCodec.versionKey: 2,
+        'id': 'payload-registry-v2',
+        'count': 21,
+      });
+
+      expect(decoded.id, 'payload-registry-v2');
+      expect(decoded.count, 21);
+      expect(decoded.decodedVersion, 2);
+    });
+
+    test('uses the registry default version when the payload has none', () {
+      final codec = PayloadCodec<_VersionedCodecPayload>.versionedJsonRegistry(
+        version: 2,
+        registry: registry,
+        typeName: '_VersionedCodecPayload',
+      );
+
+      final decoded = codec.decode({
+        'legacy_id': 'payload-registry-v1',
+        'amount':
18, + }); + + expect(decoded.id, 'payload-registry-v1'); + expect(decoded.count, 18); + expect(decoded.decodedVersion, 1); + }); + + test('rejects unknown payload versions with a clear error', () { + final codec = PayloadCodec<_VersionedCodecPayload>.versionedJsonRegistry( + version: 2, + registry: registry, + typeName: '_VersionedCodecPayload', + ); + + expect( + () => codec.decode({ + PayloadCodec.versionKey: 9, + 'id': 'payload-registry-unknown', + 'count': 22, + }), + throwsA( + isA().having( + (error) => error.message, + 'message', + contains('has no decoder registered for payload version 9'), + ), + ), + ); + }); + }); + + group('PayloadCodec.versionedMap', () { + test('encodes custom map payloads with a persisted schema version', () { + const codec = PayloadCodec<_VersionedCodecPayload>.versionedMap( + encode: _encodeVersionedCodecPayloadMap, + version: 4, + decode: _VersionedCodecPayload.fromVersionedJson, + typeName: '_VersionedCodecPayload', + ); + + final payload = codec.encode( + const _VersionedCodecPayload(id: 'payload-map-v0', count: 12), + ); + + expect(payload, { + PayloadCodec.versionKey: 4, + 'id': 'payload-map-v0', + 'count': 12, + 'legacy': true, + }); + }); + + test('passes the stored schema version to the custom decoder', () { + const codec = PayloadCodec<_VersionedCodecPayload>.versionedMap( + encode: _encodeVersionedCodecPayloadMap, + version: 2, + decode: _VersionedCodecPayload.fromVersionedJson, + typeName: '_VersionedCodecPayload', + ); + + final decoded = codec.decode({ + PayloadCodec.versionKey: 7, + 'id': 'payload-map-v1', + 'count': 5, + 'legacy': true, + }); + + expect(decoded.id, 'payload-map-v1'); + expect(decoded.count, 5); + expect(decoded.decodedVersion, 7); + }); + + test('falls back to the configured default decode version', () { + const codec = PayloadCodec<_VersionedCodecPayload>.versionedMap( + encode: _encodeVersionedCodecPayloadMap, + version: 3, + defaultDecodeVersion: 1, + decode: 
_VersionedCodecPayload.fromVersionedJson, + typeName: '_VersionedCodecPayload', + ); + + final decoded = codec.decode({ + 'id': 'payload-map-v2', + 'count': 14, + 'legacy': true, + }); + + expect(decoded.id, 'payload-map-v2'); + expect(decoded.count, 14); + expect(decoded.decodedVersion, 1); + }); + }); +} + +class _CodecPayload { + const _CodecPayload({required this.id, required this.count}); + + factory _CodecPayload.fromJson(Map json) { + return _CodecPayload( + id: json['id']! as String, + count: json['count']! as int, + ); + } + + final String id; + final int count; + + Map toJson() => { + 'id': id, + 'count': count, + }; +} + +class _DynamicCodecPayload { + const _DynamicCodecPayload({required this.id, required this.count}); + + factory _DynamicCodecPayload.fromJson(Map json) { + return _DynamicCodecPayload( + id: json['id']! as String, + count: json['count']! as int, + ); + } + + final String id; + final int count; + + Map toJson() => { + 'id': id, + 'count': count, + }; +} + +Object? _encodeCodecPayload(_CodecPayload value) => value.toJson(); + +Object? _encodeVersionedCodecPayloadMap(_VersionedCodecPayload value) => { + ...value.toJson(), + 'legacy': true, +}; + +class _NoJsonPayload { + const _NoJsonPayload({required this.id}); + + factory _NoJsonPayload.fromJson(Map json) { + return _NoJsonPayload(id: json['id']! as String); + } + + final String id; +} + +class _VersionedCodecPayload { + const _VersionedCodecPayload({ + required this.id, + required this.count, + this.decodedVersion, + }); + + factory _VersionedCodecPayload.fromVersionedJson( + Map json, + int version, + ) { + return _VersionedCodecPayload( + id: json['id']! as String, + count: json['count']! as int, + decodedVersion: version, + ); + } + + factory _VersionedCodecPayload.fromV1Json(Map json) { + return _VersionedCodecPayload( + id: json['legacy_id']! as String, + count: json['amount']! 
as int,
+      decodedVersion: 1,
+    );
+  }
+
+  factory _VersionedCodecPayload.fromV2Json(Map json) {
+    return _VersionedCodecPayload(
+      id: json['id']! as String,
+      count: json['count']! as int,
+      decodedVersion: 2,
+    );
+  }
+
+  final String id;
+  final int count;
+  final int? decodedVersion;
+
+  Map toJson() => {
+        'id': id,
+        'count': count,
+      };
+}
diff --git a/packages/stem/test/unit/core/payload_map_test.dart b/packages/stem/test/unit/core/payload_map_test.dart
new file mode 100644
index 00000000..8279d1cc
--- /dev/null
+++ b/packages/stem/test/unit/core/payload_map_test.dart
@@ -0,0 +1,235 @@
+import 'package:stem/stem.dart';
+import 'package:test/test.dart';
+
+void main() {
+  group('PayloadMapX', () {
+    test('value reads typed scalar values', () {
+      const payload = {'name': 'Stem'};
+
+      expect(payload.value('name'), 'Stem');
+      expect(payload.value('missing'), isNull);
+    });
+
+    test('valueOr returns fallback for missing values', () {
+      const payload = {'name': 'Stem'};
+
+      expect(payload.valueOr('name', 'fallback'), 'Stem');
+      expect(payload.valueOr('tenant', 'global'), 'global');
+    });
+
+    test('requiredValue throws for missing payload keys', () {
+      const payload = {'name': 'Stem'};
+
+      expect(
+        () => payload.requiredValue('tenant'),
+        throwsA(
+          isA().having(
+            (error) => error.message,
+            'message',
+            "Missing required payload key 'tenant'.",
+          ),
+        ),
+      );
+    });
+
+    test('requiredValue decodes codec-backed DTO values', () {
+      final payload = {
+        'draft': const {'documentId': 'doc-42'},
+      };
+
+      final draft = payload.requiredValue<_ApprovalDraft>(
+        'draft',
+        codec: _approvalDraftCodec,
+      );
+
+      expect(draft.documentId, 'doc-42');
+    });
+
+    test('valueJson decodes DTO values without a codec constant', () {
+      final payload = {
+        'draft': const {'documentId': 'doc-42'},
+      };
+
+      final draft = payload.valueJson<_ApprovalDraft>(
+        'draft',
+        decode: _ApprovalDraft.fromJson,
+      );
+
+      expect(draft?.documentId, 'doc-42');
+    });
+
+    test('valueVersionedJson decodes
DTO values without a codec constant', () { + final payload = { + 'draft': const { + PayloadCodec.versionKey: 2, + 'documentId': 'doc-42', + }, + }; + + final draft = payload.valueVersionedJson<_ApprovalDraft>( + 'draft', + defaultVersion: 2, + decode: _ApprovalDraft.fromVersionedJson, + ); + + expect(draft?.documentId, 'doc-42'); + }); + + test('requiredValueJson throws for missing payload keys', () { + const payload = {'name': 'Stem'}; + + expect( + () => payload.requiredValueJson<_ApprovalDraft>( + 'draft', + decode: _ApprovalDraft.fromJson, + ), + throwsA( + isA().having( + (error) => error.message, + 'message', + "Missing required payload key 'draft'.", + ), + ), + ); + }); + + test('valueList reads typed scalar lists', () { + const payload = { + 'scores': [1, 2, 3], + }; + + expect(payload.valueList('scores'), [1, 2, 3]); + expect(payload.valueList('missing'), isNull); + }); + + test('valueListOr returns fallback for missing lists', () { + const payload = { + 'scores': [1, 2, 3], + }; + + expect(payload.valueListOr('scores', const [9]), [1, 2, 3]); + expect(payload.valueListOr('missing', const [9]), [9]); + }); + + test('requiredValueList throws for missing payload keys', () { + const payload = {'name': 'Stem'}; + + expect( + () => payload.requiredValueList('labels'), + throwsA( + isA().having( + (error) => error.message, + 'message', + "Missing required payload key 'labels'.", + ), + ), + ); + }); + + test('requiredValueList decodes codec-backed DTO lists', () { + final payload = { + 'drafts': const [ + {'documentId': 'doc-42'}, + {'documentId': 'doc-99'}, + ], + }; + + final drafts = payload.requiredValueList<_ApprovalDraft>( + 'drafts', + codec: _approvalDraftCodec, + ); + + expect(drafts.map((draft) => draft.documentId), ['doc-42', 'doc-99']); + }); + + test('valueListJson decodes DTO lists without a codec constant', () { + final payload = { + 'drafts': const [ + {'documentId': 'doc-42'}, + {'documentId': 'doc-99'}, + ], + }; + + final drafts = 
payload.valueListJson<_ApprovalDraft>( + 'drafts', + decode: _ApprovalDraft.fromJson, + ); + + expect( + drafts?.map((draft) => draft.documentId).toList(), + ['doc-42', 'doc-99'], + ); + }); + + test( + 'valueListVersionedJson decodes DTO lists without a codec constant', + () { + final payload = { + 'drafts': const [ + { + PayloadCodec.versionKey: 2, + 'documentId': 'doc-42', + }, + { + PayloadCodec.versionKey: 2, + 'documentId': 'doc-99', + }, + ], + }; + + final drafts = payload.valueListVersionedJson<_ApprovalDraft>( + 'drafts', + defaultVersion: 2, + decode: _ApprovalDraft.fromVersionedJson, + ); + + expect( + drafts?.map((draft) => draft.documentId).toList(), + ['doc-42', 'doc-99'], + ); + }, + ); + + test('requiredValueListJson throws for missing payload keys', () { + const payload = {'name': 'Stem'}; + + expect( + () => payload.requiredValueListJson<_ApprovalDraft>( + 'drafts', + decode: _ApprovalDraft.fromJson, + ), + throwsA( + isA().having( + (error) => error.message, + 'message', + "Missing required payload key 'drafts'.", + ), + ), + ); + }); + }); +} + +const _approvalDraftCodec = PayloadCodec<_ApprovalDraft>.json( + decode: _ApprovalDraft.fromJson, +); + +class _ApprovalDraft { + const _ApprovalDraft({required this.documentId}); + + factory _ApprovalDraft.fromJson(Map json) { + return _ApprovalDraft(documentId: json['documentId'] as String); + } + + factory _ApprovalDraft.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _ApprovalDraft(documentId: json['documentId'] as String); + } + + final String documentId; + + Map toJson() => {'documentId': documentId}; +} diff --git a/packages/stem/test/unit/core/queue_events_test.dart b/packages/stem/test/unit/core/queue_events_test.dart index 29e739ae..9fd8c7ba 100644 --- a/packages/stem/test/unit/core/queue_events_test.dart +++ b/packages/stem/test/unit/core/queue_events_test.dart @@ -40,16 +40,35 @@ void main() { 'order.created', payload: const {'orderId': 'o-1'}, headers: 
const {'x-source': 'test'}, - meta: const {'tenant': 'acme'}, + meta: const {PayloadCodec.versionKey: 2, 'tenant': 'acme'}, ); final event = await received; expect(event.id, eventId); expect(event.queue, 'orders'); expect(event.name, 'order.created'); - expect(event.payload['orderId'], 'o-1'); + expect(event.requiredPayloadValue('orderId'), 'o-1'); expect(event.headers['x-source'], 'test'); expect(event.meta['tenant'], 'acme'); + expect( + event.metaJson<_QueueEventMeta>(decode: _QueueEventMeta.fromJson), + isA<_QueueEventMeta>().having( + (value) => value.tenant, + 'tenant', + 'acme', + ), + ); + expect( + event.metaVersionedJson<_QueueEventMeta>( + version: 2, + decode: _QueueEventMeta.fromVersionedJson, + ), + isA<_QueueEventMeta>().having( + (value) => value.tenant, + 'tenant', + 'acme', + ), + ); }); test('ignores events from other queues', () async { @@ -112,10 +131,134 @@ void main() { final results = await Future.wait([firstA, firstB]); expect(results, hasLength(2)); - expect(results[0].payload['status'], 'paid'); - expect(results[1].payload['status'], 'paid'); + expect(results[0].requiredPayloadValue('status'), 'paid'); + expect(results[1].requiredPayloadValue('status'), 'paid'); + }); + + test('emitJson publishes DTO payloads without a manual map', () async { + final listener = QueueEvents( + broker: broker, + queue: 'orders', + consumerName: 'orders-listener', + ); + await listener.start(); + addTearDown(listener.close); + + final received = listener + .on('order.shipped') + .first + .timeout(const Duration(seconds: 5)); + + final eventId = await producer.emitJson( + 'orders', + 'order.shipped', + const _QueueEventPayload(orderId: 'o-2', status: 'shipped'), + ); + + final event = await received; + expect(event.id, eventId); + expect(event.requiredPayloadValue('orderId'), 'o-2'); + expect(event.payloadValueOr('status', 'pending'), 'shipped'); + expect( + event.payloadJson<_QueueEventPayload>( + decode: _QueueEventPayload.fromJson, + ), + 
isA<_QueueEventPayload>() + .having((value) => value.orderId, 'orderId', 'o-2') + .having((value) => value.status, 'status', 'shipped'), + ); }); + test( + 'emitValue publishes typed payloads through the supplied codec', + () async { + final listener = QueueEvents( + broker: broker, + queue: 'orders', + consumerName: 'orders-listener-codec', + ); + await listener.start(); + addTearDown(listener.close); + + final received = listener + .on('order.codec') + .first + .timeout(const Duration(seconds: 5)); + + final eventId = await producer.emitValue( + 'orders', + 'order.codec', + const _QueueEventPayload(orderId: 'o-2b', status: 'codec'), + codec: const PayloadCodec<_QueueEventPayload>.map( + encode: _encodeQueueEventPayloadMap, + decode: _QueueEventPayload.fromJson, + typeName: '_QueueEventPayload', + ), + ); + + final event = await received; + expect(event.id, eventId); + expect(event.requiredPayloadValue('orderId'), 'o-2b'); + expect(event.requiredPayloadValue('status'), 'codec'); + expect(event.requiredPayloadValue('kind'), 'custom'); + expect( + event.payloadAs<_QueueEventPayload>( + codec: const PayloadCodec<_QueueEventPayload>.map( + encode: _encodeQueueEventPayloadMap, + decode: _QueueEventPayload.fromJson, + typeName: '_QueueEventPayload', + ), + ), + isA<_QueueEventPayload>() + .having((value) => value.orderId, 'orderId', 'o-2b') + .having((value) => value.status, 'status', 'codec'), + ); + }, + ); + + test( + 'emitVersionedJson publishes DTO payloads with a persisted schema ' + 'version', + () async { + final listener = QueueEvents( + broker: broker, + queue: 'orders', + consumerName: 'orders-listener-versioned', + ); + await listener.start(); + addTearDown(listener.close); + + final received = listener + .on('order.versioned') + .first + .timeout(const Duration(seconds: 5)); + + final eventId = await producer.emitVersionedJson( + 'orders', + 'order.versioned', + const _QueueEventPayload(orderId: 'o-3', status: 'versioned'), + version: 2, + ); + + final 
event = await received; + expect(event.id, eventId); + expect(event.payload, { + PayloadCodec.versionKey: 2, + 'orderId': 'o-3', + 'status': 'versioned', + }); + expect( + event.payloadVersionedJson<_QueueEventPayload>( + version: 2, + decode: _QueueEventPayload.fromVersionedJson, + ), + isA<_QueueEventPayload>() + .having((value) => value.orderId, 'orderId', 'o-3') + .having((value) => value.status, 'status', 'versioned'), + ); + }, + ); + test('validates queue and event names', () async { expect( () => producer.emit('', 'evt'), @@ -134,3 +277,59 @@ void main() { }); }); } + +class _QueueEventPayload { + const _QueueEventPayload({ + required this.orderId, + required this.status, + }); + + factory _QueueEventPayload.fromJson(Map json) { + return _QueueEventPayload( + orderId: json['orderId'] as String, + status: json['status'] as String, + ); + } + + factory _QueueEventPayload.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _QueueEventPayload( + orderId: json['orderId'] as String, + status: json['status'] as String, + ); + } + + final String orderId; + final String status; + + Map toJson() => { + 'orderId': orderId, + 'status': status, + }; +} + +Object? 
_encodeQueueEventPayloadMap(_QueueEventPayload value) => {
+  ...value.toJson(),
+  'kind': 'custom',
+};
+
+class _QueueEventMeta {
+  const _QueueEventMeta({required this.tenant});
+
+  factory _QueueEventMeta.fromJson(Map json) {
+    return _QueueEventMeta(tenant: json['tenant'] as String);
+  }
+
+  factory _QueueEventMeta.fromVersionedJson(
+    Map json,
+    int version,
+  ) {
+    expect(version, 2);
+    return _QueueEventMeta.fromJson(json);
+  }
+
+  final String tenant;
+}
diff --git a/packages/stem/test/unit/core/stem_core_test.dart b/packages/stem/test/unit/core/stem_core_test.dart
index eda93ff9..a8cd359c 100644
--- a/packages/stem/test/unit/core/stem_core_test.dart
+++ b/packages/stem/test/unit/core/stem_core_test.dart
@@ -35,6 +35,47 @@ void main() {
       expect(copy.queue, equals('emails'));
       expect(copy.meta, equals({'foo': 'bar'}));
     });
+
+    test('decodes whole args and meta DTO payloads', () {
+      final envelope = Envelope(
+        name: 'example',
+        args: const {
+          PayloadCodec.versionKey: 2,
+          'value': 42,
+        },
+        meta: const {
+          PayloadCodec.versionKey: 2,
+          'label': 'queued',
+        },
+      );
+
+      expect(
+        envelope.argsJson<_EnvelopeArgs>(decode: _EnvelopeArgs.fromJson).value,
+        42,
+      );
+      expect(
+        envelope
+            .argsVersionedJson<_EnvelopeArgs>(
+              version: 2,
+              decode: _EnvelopeArgs.fromVersionedJson,
+            )
+            .value,
+        42,
+      );
+      expect(
+        envelope.metaJson<_EnvelopeMeta>(decode: _EnvelopeMeta.fromJson).label,
+        'queued',
+      );
+      expect(
+        envelope
+            .metaVersionedJson<_EnvelopeMeta>(
+              version: 2,
+              decode: _EnvelopeMeta.fromVersionedJson,
+            )
+            .label,
+        'queued',
+      );
+    });
   });
 
   group('StemConfig', () {
@@ -63,7 +104,7 @@ void main() {
       final stem = Stem(
         broker: broker,
         backend: backend,
-        tasks: [_StubTaskHandler()],
+        tasks: [const _StubTaskHandler()],
       );
 
       final id = await stem.enqueue(
@@ -77,9 +118,777 @@
       expect(backend.records.single.id, equals(id));
       expect(backend.records.single.state, equals(TaskState.queued));
     });
+
+    test(
+      'enqueueCall publishes typed calls
without requiring registry handlers', + () async { + final broker = _RecordingBroker(); + final backend = _RecordingBackend(); + final stem = Stem(broker: broker, backend: backend); + final definition = TaskDefinition<({String value}), Object?>( + name: 'sample.typed', + encodeArgs: (args) => {'value': args.value}, + defaultOptions: const TaskOptions(queue: 'typed'), + ); + + final id = await stem.enqueueCall(definition.buildCall((value: 'ok'))); + + expect(id, isNotEmpty); + expect(broker.published.single.envelope.name, 'sample.typed'); + expect(broker.published.single.envelope.queue, 'typed'); + expect(backend.records.single.id, id); + expect(backend.records.single.state, TaskState.queued); + }, + ); + + test('enqueueCall publishes codec-backed task definitions', () async { + final broker = _RecordingBroker(); + final backend = _RecordingBackend(); + final stem = Stem(broker: broker, backend: backend); + final definition = TaskDefinition<_CodecTaskArgs, Object?>.codec( + name: 'sample.codec.args', + argsCodec: _codecTaskArgsCodec, + defaultOptions: const TaskOptions(queue: 'typed'), + ); + + final id = await stem.enqueueCall( + definition.buildCall(const _CodecTaskArgs('encoded')), + ); + + expect(id, isNotEmpty); + expect(broker.published.single.envelope.name, 'sample.codec.args'); + expect(broker.published.single.envelope.queue, 'typed'); + expect(broker.published.single.envelope.args, {'value': 'encoded'}); + expect(backend.records.single.id, id); + expect(backend.records.single.state, TaskState.queued); + }); + + test('enqueueCall publishes json-backed task definitions', () async { + final broker = _RecordingBroker(); + final backend = _RecordingBackend(); + final stem = Stem(broker: broker, backend: backend); + final definition = TaskDefinition<_CodecTaskArgs, Object?>.json( + name: 'sample.json.args', + defaultOptions: const TaskOptions(queue: 'typed'), + ); + + final id = await stem.enqueueCall( + definition.buildCall(const _CodecTaskArgs('encoded')), + ); 
+ + expect(id, isNotEmpty); + expect(broker.published.single.envelope.name, 'sample.json.args'); + expect(broker.published.single.envelope.queue, 'typed'); + expect(broker.published.single.envelope.args, {'value': 'encoded'}); + expect(backend.records.single.id, id); + expect(backend.records.single.state, TaskState.queued); + }); + + test('enqueueCall publishes versioned-json task definitions', () async { + final broker = _RecordingBroker(); + final backend = _RecordingBackend(); + final stem = Stem(broker: broker, backend: backend); + final definition = TaskDefinition<_CodecTaskArgs, Object?>.versionedJson( + name: 'sample.versioned.json.args', + version: 2, + defaultOptions: const TaskOptions(queue: 'typed'), + ); + + final id = await stem.enqueueCall( + definition.buildCall(const _CodecTaskArgs('encoded')), + ); + + expect(id, isNotEmpty); + expect( + broker.published.single.envelope.name, + 'sample.versioned.json.args', + ); + expect(broker.published.single.envelope.queue, 'typed'); + expect(broker.published.single.envelope.args, { + PayloadCodec.versionKey: 2, + 'value': 'encoded', + }); + expect(backend.records.single.id, id); + expect(backend.records.single.state, TaskState.queued); + }); + + test('enqueueCall publishes versioned-map task definitions', () async { + final broker = _RecordingBroker(); + final backend = _RecordingBackend(); + final stem = Stem(broker: broker, backend: backend); + final definition = TaskDefinition<_CodecTaskArgs, Object?>.versionedMap( + name: 'sample.versioned.map.args', + version: 3, + encodeArgs: (args) => {'legacy_value': args.value}, + defaultOptions: const TaskOptions(queue: 'typed'), + ); + + final id = await stem.enqueueCall( + definition.buildCall(const _CodecTaskArgs('encoded')), + ); + + expect(id, isNotEmpty); + expect( + broker.published.single.envelope.name, + 'sample.versioned.map.args', + ); + expect(broker.published.single.envelope.queue, 'typed'); + expect(broker.published.single.envelope.args, { + 
PayloadCodec.versionKey: 3, + 'legacy_value': 'encoded', + }); + expect(backend.records.single.id, id); + expect(backend.records.single.state, TaskState.queued); + }); + + test('enqueueJson publishes DTO args without a manual map', () async { + final broker = _RecordingBroker(); + final backend = _RecordingBackend(); + final stem = Stem( + broker: broker, + backend: backend, + tasks: [const _StubTaskHandler()], + ); + + final id = await stem.enqueueJson( + 'sample.task', + const _CodecTaskArgs('encoded'), + ); + + expect(id, isNotEmpty); + expect(broker.published.single.envelope.args, {'value': 'encoded'}); + expect(backend.records.single.id, id); + expect(backend.records.single.state, TaskState.queued); + }); + + test( + 'enqueueVersionedJson publishes DTO args with a persisted schema version', + () async { + final broker = _RecordingBroker(); + final backend = _RecordingBackend(); + final stem = Stem( + broker: broker, + backend: backend, + tasks: [const _StubTaskHandler()], + ); + + final id = await stem.enqueueVersionedJson( + 'sample.task', + const _CodecTaskArgs('encoded'), + version: 2, + ); + + expect(id, isNotEmpty); + expect(broker.published.single.envelope.args, { + PayloadCodec.versionKey: 2, + 'value': 'encoded', + }); + expect(backend.records.single.id, id); + expect(backend.records.single.state, TaskState.queued); + }, + ); + + test( + 'enqueueCall uses definition encoder metadata on producer-only paths', + () async { + final broker = _RecordingBroker(); + final backend = _RecordingBackend(); + final stem = Stem( + broker: broker, + backend: backend, + encoderRegistry: ensureTaskPayloadEncoderRegistry( + null, + additionalEncoders: [_codecReceiptEncoder, _passthroughMapEncoder], + ), + ); + final definition = TaskDefinition<({String value}), _CodecReceipt>( + name: 'sample.typed.encoded', + encodeArgs: (args) => {'value': args.value}, + metadata: const TaskMetadata( + argsEncoder: _passthroughMapEncoder, + resultEncoder: _codecReceiptEncoder, + ), + 
); + + final id = await stem.enqueueCall( + definition.buildCall((value: 'encoded')), + ); + + expect( + broker.published.single.envelope.headers[stemArgsEncoderHeader], + _passthroughMapEncoder.id, + ); + expect( + backend.records.single.meta[stemResultEncoderMetaKey], + _codecReceiptEncoder.id, + ); + expect(backend.records.single.id, id); + }, + ); + + test( + 'codec-backed task definitions attach result encoder metadata by default', + () async { + final broker = _RecordingBroker(); + final backend = _RecordingBackend(); + final stem = Stem(broker: broker, backend: backend); + final definition = TaskDefinition<_CodecTaskArgs, _CodecReceipt>.codec( + name: 'sample.codec.result', + argsCodec: _codecTaskArgsCodec, + resultCodec: _codecReceiptCodec, + ); + + final id = await stem.enqueueCall( + definition.buildCall(const _CodecTaskArgs('encoded')), + ); + + expect( + backend.records.single.meta[stemResultEncoderMetaKey], + endsWith('.result.codec'), + ); + expect(backend.records.single.id, id); + }, + ); + + test( + 'versioned json task definitions can derive versioned result metadata', + () async { + final broker = _RecordingBroker(); + final backend = _RecordingBackend(); + final stem = Stem(broker: broker, backend: backend); + final definition = + TaskDefinition<_CodecTaskArgs, _CodecReceipt>.versionedJson( + name: 'sample.versioned_json.result', + version: 2, + decodeResultVersionedJson: _CodecReceipt.fromVersionedJson, + ); + + final id = await stem.enqueueCall( + definition.buildCall(const _CodecTaskArgs('encoded')), + ); + + expect( + backend.records.single.meta[stemResultEncoderMetaKey], + endsWith('.result.codec'), + ); + expect(backend.records.single.id, id); + }, + ); + + test( + 'versioned json registry task definitions can derive versioned result ' + 'metadata', + () async { + final broker = _RecordingBroker(); + final backend = _RecordingBackend(); + final stem = Stem(broker: broker, backend: backend); + final definition = + 
TaskDefinition<_CodecTaskArgs, _CodecReceipt>.versionedJsonRegistry( + name: 'sample.versioned_json.registry.result', + version: 2, + resultRegistry: _codecReceiptRegistry, + ); + + final id = await stem.enqueueCall( + definition.buildCall(const _CodecTaskArgs('encoded')), + ); + + expect( + backend.records.single.meta[stemResultEncoderMetaKey], + endsWith('.result.codec'), + ); + expect(backend.records.single.id, id); + }, + ); + + test( + 'versioned map task definitions can derive versioned result metadata', + () async { + final broker = _RecordingBroker(); + final backend = _RecordingBackend(); + final stem = Stem(broker: broker, backend: backend); + final definition = + TaskDefinition<_CodecTaskArgs, _CodecReceipt>.versionedMap( + name: 'sample.versioned_map.result', + version: 2, + encodeArgs: (args) => {'legacy_value': args.value}, + decodeResultVersionedJson: _CodecReceipt.fromVersionedJson, + ); + + final id = await stem.enqueueCall( + definition.buildCall(const _CodecTaskArgs('encoded')), + ); + + expect( + backend.records.single.meta[stemResultEncoderMetaKey], + endsWith('.result.codec'), + ); + expect(backend.records.single.id, id); + }, + ); + + test( + 'json task definitions can derive versioned result metadata', + () async { + final broker = _RecordingBroker(); + final backend = _RecordingBackend(); + final stem = Stem(broker: broker, backend: backend); + final definition = TaskDefinition<_CodecTaskArgs, _CodecReceipt>.json( + name: 'sample.json.result.versioned', + decodeResultVersionedJson: _CodecReceipt.fromVersionedJson, + defaultDecodeVersion: 2, + ); + + final id = await stem.enqueueCall( + definition.buildCall(const _CodecTaskArgs('encoded')), + ); + + expect( + backend.records.single.meta[stemResultEncoderMetaKey], + endsWith('.result.codec'), + ); + expect(backend.records.single.id, id); + }, + ); + + test( + 'enqueueCall publishes no-arg definitions without fake empty maps', + () async { + final broker = _RecordingBroker(); + final backend = 
_RecordingBackend();
+        final stem = Stem(broker: broker, backend: backend);
+        final definition = TaskDefinition.noArgs(
+          name: 'sample.no_args',
+          defaultOptions: const TaskOptions(queue: 'typed'),
+        );
+
+        final id = await definition.enqueue(stem);
+
+        expect(id, isNotEmpty);
+        expect(broker.published.single.envelope.name, 'sample.no_args');
+        expect(broker.published.single.envelope.queue, 'typed');
+        expect(broker.published.single.envelope.args, isEmpty);
+        expect(backend.records.single.id, id);
+        expect(backend.records.single.state, TaskState.queued);
+      },
+    );
+
+    test('uses handler default queue when raw enqueue omits options', () async {
+      final broker = _RecordingBroker();
+      final stem = Stem(
+        broker: broker,
+        tasks: [
+          const _StubTaskHandler(
+            options: TaskOptions(queue: 'emails'),
+          ),
+        ],
+      );
+
+      await stem.enqueue('sample.task', args: {'value': 'ok'});
+
+      expect(broker.published.single.envelope.queue, 'emails');
+    });
+
+    test(
+      'uses handler publish defaults for priority visibility and retry policy',
+      () async {
+        final broker = _RecordingBroker();
+        final backend = _RecordingBackend();
+        final stem = Stem(
+          broker: broker,
+          backend: backend,
+          tasks: [
+            const _StubTaskHandler(
+              options: TaskOptions(
+                queue: 'emails',
+                priority: 7,
+                visibilityTimeout: Duration(seconds: 45),
+                retryPolicy: TaskRetryPolicy(maxRetries: 9),
+              ),
+            ),
+          ],
+        );
+
+        await stem.enqueue('sample.task', args: {'value': 'ok'});
+
+        expect(broker.published.single.envelope.queue, 'emails');
+        expect(broker.published.single.envelope.priority, 7);
+        expect(
+          broker.published.single.envelope.visibilityTimeout,
+          const Duration(seconds: 45),
+        );
+        expect(broker.published.single.envelope.maxRetries, 9);
+        expect(
+          backend.records.single.meta['stem.retryPolicy'],
+          containsPair('maxRetries', 9),
+        );
+      },
+    );
+
+    test('explicit task options override handler defaults', () async {
+      final broker = _RecordingBroker();
+      final stem = Stem(
+        broker: broker,
+        tasks: [
+          const _StubTaskHandler(
+            options: TaskOptions(queue: 'emails', priority: 7),
+          ),
+        ],
+      );
+
+      await stem.enqueue(
+        'sample.task',
+        args: {'value': 'ok'},
+        options: const TaskOptions(queue: 'custom', priority: 3),
+      );
+
+      expect(broker.published.single.envelope.queue, 'custom');
+      expect(broker.published.single.envelope.priority, 3);
+    });
+
+    test('enqueue options override handler routing defaults', () async {
+      final broker = _RecordingBroker();
+      final stem = Stem(
+        broker: broker,
+        tasks: [
+          const _StubTaskHandler(
+            options: TaskOptions(queue: 'emails', priority: 7),
+          ),
+        ],
+      );
+
+      await stem.enqueue(
+        'sample.task',
+        args: {'value': 'ok'},
+        enqueueOptions: const TaskEnqueueOptions(queue: 'audit', priority: 5),
+      );
+
+      expect(broker.published.single.envelope.queue, 'audit');
+      expect(broker.published.single.envelope.priority, 5);
+    });
+
+    test(
+      'no-arg task definitions can attach codec-backed result metadata',
+      () async {
+        final broker = _RecordingBroker();
+        final backend = _RecordingBackend();
+        final stem = Stem(broker: broker, backend: backend);
+        final definition = TaskDefinition.noArgsCodec<_CodecReceipt>(
+          name: 'sample.no_args.codec',
+          resultCodec: _codecReceiptCodec,
+        );
+
+        final id = await definition.enqueue(stem);
+
+        expect(
+          backend.records.single.meta[stemResultEncoderMetaKey],
+          endsWith('.result.codec'),
+        );
+        expect(backend.records.single.id, id);
+      },
+    );
+
+    test(
+      'no-arg task definitions can derive json-backed result metadata',
+      () async {
+        final broker = _RecordingBroker();
+        final backend = _RecordingBackend();
+        final stem = Stem(broker: broker, backend: backend);
+        final definition = TaskDefinition.noArgsJson<_CodecReceipt>(
+          name: 'sample.no_args.json',
+          decodeResult: _CodecReceipt.fromJson,
+        );
+
+        final id = await definition.enqueue(stem);
+
+        expect(
+          backend.records.single.meta[stemResultEncoderMetaKey],
+          endsWith('.result.codec'),
+        );
+        expect(backend.records.single.id, id);
+      },
+    );
+
+    test(
+      'no-arg task definitions can derive versioned json-backed result'
+      ' metadata',
+      () async {
+        final broker = _RecordingBroker();
+        final backend = _RecordingBackend();
+        final stem = Stem(broker: broker, backend: backend);
+        final definition = TaskDefinition.noArgsVersionedJson<_CodecReceipt>(
+          name: 'sample.no_args.versioned_json',
+          version: 2,
+          decodeResult: _CodecReceipt.fromVersionedJson,
+        );
+
+        final id = await definition.enqueue(stem);
+
+        expect(
+          backend.records.single.meta[stemResultEncoderMetaKey],
+          endsWith('.result.codec'),
+        );
+        expect(backend.records.single.id, id);
+      },
+    );
+  });
+
+  group('TaskCall helpers', () {
+    test('TaskDefinition.enqueue enqueues typed args directly', () async {
+      final broker = _RecordingBroker();
+      final backend = _RecordingBackend();
+      final stem = Stem(broker: broker, backend: backend);
+      final definition = TaskDefinition<({String value}), String>(
+        name: 'sample.task_definition_enqueue',
+        encodeArgs: (args) => {'value': args.value},
+        defaultOptions: const TaskOptions(queue: 'typed'),
+      );
+
+      final taskId = await TaskEnqueueScope.run({'traceId': 'scope-1'}, () {
+        return definition.enqueue(stem, (value: 'ok'));
+      });
+
+      expect(taskId, isNotEmpty);
+      expect(
+        broker.published.single.envelope.name,
+        'sample.task_definition_enqueue',
+      );
+      expect(broker.published.single.envelope.queue, 'typed');
+      expect(
+        broker.published.single.envelope.meta,
+        containsPair('traceId', 'scope-1'),
+      );
+    });
+
+    test('enqueue enqueues typed calls with scoped metadata', () async {
+      final broker = _RecordingBroker();
+      final backend = _RecordingBackend();
+      final stem = Stem(broker: broker, backend: backend);
+      final definition = TaskDefinition<({String value}), String>(
+        name: 'sample.task_call',
+        encodeArgs: (args) => {'value': args.value},
+        defaultOptions: const TaskOptions(queue: 'typed'),
+      );
+
+      final taskId = await TaskEnqueueScope.run({'traceId': 'scope-1'}, () {
+        return definition.enqueue(stem, (value: 'ok'));
+      });
+
+      expect(taskId, isNotEmpty);
+      expect(broker.published.single.envelope.name, 'sample.task_call');
+      expect(broker.published.single.envelope.queue, 'typed');
+      expect(
+        broker.published.single.envelope.meta,
+        containsPair('traceId', 'scope-1'),
+      );
+    });
+
+    test('enqueueAndWait returns typed results', () async {
+      final broker = _RecordingBroker();
+      final backend = _RecordingBackend();
+      final stem = Stem(broker: broker, backend: backend);
+      final definition = TaskDefinition<({String value}), String>(
+        name: 'sample.task_call_wait',
+        encodeArgs: (args) => {'value': args.value},
+      );
+
+      unawaited(
+        Future(() async {
+          while (broker.published.isEmpty) {
+            await Future.delayed(Duration.zero);
+          }
+          final taskId = broker.published.single.envelope.id;
+          await backend.set(taskId, TaskState.succeeded, payload: 'done');
+        }),
+      );
+
+      final result = await definition.enqueueAndWait(
+        stem,
+        (value: 'ok'),
+        timeout: const Duration(seconds: 1),
+      );
+
+      expect(result?.isSucceeded, isTrue);
+      expect(result?.value, 'done');
+    });
+
+    test('TaskDefinition.enqueueAndWait returns typed results', () async {
+      final broker = _RecordingBroker();
+      final backend = _RecordingBackend();
+      final stem = Stem(broker: broker, backend: backend);
+      final definition = TaskDefinition<({String value}), String>(
+        name: 'sample.task_definition_wait',
+        encodeArgs: (args) => {'value': args.value},
+      );
+
+      unawaited(
+        Future(() async {
+          while (broker.published.isEmpty) {
+            await Future.delayed(Duration.zero);
+          }
+          final taskId = broker.published.single.envelope.id;
+          await backend.set(taskId, TaskState.succeeded, payload: 'done');
+        }),
+      );
+
+      final result = await definition.enqueueAndWait(
+        stem,
+        (value: 'ok'),
+        timeout: const Duration(seconds: 1),
+      );
+
+      expect(result?.isSucceeded, isTrue);
+      expect(result?.value, 'done');
+    });
+  });
+
+  group('TaskDefinition.waitFor', () {
+    test('uses definition decoding rules', () async {
+      final backend = _codecAwareBackend();
+      final stem = _codecAwareStem(backend);
+
+      await backend.set(
+        'task-definition-wait',
+        TaskState.succeeded,
+        payload: const _CodecReceipt('receipt-definition'),
+        meta: {stemResultEncoderMetaKey: _codecReceiptEncoder.id},
+      );
+
+      final result = await _codecReceiptDefinition.waitFor(
+        stem,
+        'task-definition-wait',
+      );
+
+      expect(result?.value?.id, 'receipt-definition');
+      expect(result?.rawPayload, isA<_CodecReceipt>());
+    });
+
+    test('supports no-arg task definitions', () async {
+      final backend = InMemoryResultBackend();
+      final stem = Stem(broker: _RecordingBroker(), backend: backend);
+      final definition = TaskDefinition.noArgsJson<_CodecReceipt>(
+        name: 'no-args.wait',
+        decodeResult: _CodecReceipt.fromJson,
+      );
+
+      await backend.set(
+        'task-no-args-wait',
+        TaskState.succeeded,
+        payload: const _CodecReceipt('done'),
+        meta: {stemResultEncoderMetaKey: _codecReceiptEncoder.id},
+      );
+
+      final result = await definition.waitFor(stem, 'task-no-args-wait');
+
+      expect(result?.value?.id, 'done');
+      expect(result?.rawPayload, isA<_CodecReceipt>());
+    });
+
+    test('supports versioned no-arg task definitions', () async {
+      final backend = InMemoryResultBackend();
+      final stem = Stem(broker: _RecordingBroker(), backend: backend);
+      final definition = TaskDefinition.noArgsVersionedJson<_CodecReceipt>(
+        name: 'no-args.versioned.wait',
+        version: 2,
+        decodeResult: _CodecReceipt.fromVersionedJson,
+      );
+
+      await backend.set(
+        'task-no-args-versioned-wait',
+        TaskState.succeeded,
+        payload: {'id': 'done', PayloadCodec.versionKey: 2},
+        meta: {stemResultEncoderMetaKey: _codecReceiptEncoder.id},
+      );
+
+      final result = await definition.waitFor(
+        stem,
+        'task-no-args-versioned-wait',
+      );
+
+      expect(result?.value?.id, 'done-v2');
+      expect(result?.rawPayload, isA<Map<String, Object?>>());
+    });
+
+    test('supports versioned argful task definitions', () async {
+      final backend = InMemoryResultBackend();
+      final stem = Stem(broker: _RecordingBroker(), backend: backend);
+      final definition =
+          TaskDefinition<_CodecTaskArgs, _CodecReceipt>.versionedJson(
+        name: 'args.versioned.wait',
+        version: 2,
+        decodeResultVersionedJson: _CodecReceipt.fromVersionedJson,
+      );
+
+      await backend.set(
+        'task-args-versioned-wait',
+        TaskState.succeeded,
+        payload: {'id': 'done', PayloadCodec.versionKey: 2},
+        meta: {stemResultEncoderMetaKey: _codecReceiptEncoder.id},
+      );
+
+      final result = await definition.waitFor(
+        stem,
+        'task-args-versioned-wait',
+      );
+
+      expect(result?.value?.id, 'done-v2');
+      expect(result?.rawPayload, isA<Map<String, Object?>>());
+    });
+
+    test(
+      'supports json argful task definitions with versioned results',
+      () async {
+        final backend = InMemoryResultBackend();
+        final stem = Stem(broker: _RecordingBroker(), backend: backend);
+        final definition = TaskDefinition<_CodecTaskArgs, _CodecReceipt>.json(
+          name: 'args.json.versioned.wait',
+          decodeResultVersionedJson: _CodecReceipt.fromVersionedJson,
+          defaultDecodeVersion: 2,
+        );
+
+        await backend.set(
+          'task-args-json-versioned-wait',
+          TaskState.succeeded,
+          payload: {'id': 'done', PayloadCodec.versionKey: 2},
+          meta: {stemResultEncoderMetaKey: _codecReceiptEncoder.id},
+        );
+
+        final result = await definition.waitFor(
+          stem,
+          'task-args-json-versioned-wait',
+        );
+
+        expect(result?.value?.id, 'done-v2');
+        expect(result?.rawPayload, isA<Map<String, Object?>>());
+      });
+
+    test('enqueueAndWait supports no-arg task definitions', () async {
+      final broker = _RecordingBroker();
+      final backend = _RecordingBackend();
+      final stem = Stem(broker: broker, backend: backend);
+      final definition = TaskDefinition.noArgs(name: 'no-args.enqueue');
+
+      unawaited(
+        Future(() async {
+          while (broker.published.isEmpty) {
+            await Future.delayed(Duration.zero);
+          }
+          final taskId = broker.published.single.envelope.id;
+          await backend.set(taskId, TaskState.succeeded, payload: 'done');
+        }),
+      );
+
+      final result = await definition.enqueueAndWait(
+        stem,
+        timeout: const Duration(seconds: 1),
+      );
+
+      expect(result?.value, 'done');
+      expect(result?.rawPayload, 'done');
+    });
   });
 
-  group('Stem.waitForTaskDefinition', () {
+  group('TaskDefinition.waitFor', () {
     test('does not double decode codec-backed terminal results', () async {
       final backend = _codecAwareBackend();
       final stem = _codecAwareStem(backend);
@@ -91,9 +900,9 @@ void main() {
         meta: {stemResultEncoderMetaKey: _codecReceiptEncoder.id},
       );
 
-      final result = await stem.waitForTaskDefinition(
+      final result = await _codecReceiptDefinition.waitFor(
+        stem,
         'task-terminal',
-        _codecReceiptDefinition,
       );
 
       expect(result?.value?.id, 'receipt-terminal');
@@ -115,9 +924,9 @@ void main() {
         }),
       );
 
-      final result = await stem.waitForTaskDefinition(
+      final result = await _codecReceiptDefinition.waitFor(
+        stem,
         'task-watched',
-        _codecReceiptDefinition,
         timeout: const Duration(seconds: 1),
       );
 
@@ -125,6 +934,54 @@ void main() {
       expect(result?.rawPayload, isA<_CodecReceipt>());
     });
   });
+
+  group('Stem.waitForTask', () {
+    test('supports decodeJson for low-level DTO waits', () async {
+      final backend = InMemoryResultBackend();
+      final stem = Stem(broker: _RecordingBroker(), backend: backend);
+
+      await backend.set(
+        'task-json-wait',
+        TaskState.succeeded,
+        payload: const {'id': 'receipt-json'},
+      );
+
+      final result = await stem.waitForTask<_CodecReceipt>(
+        'task-json-wait',
+        decodeJson: _CodecReceipt.fromJson,
+      );
+
+      expect(result?.isSucceeded, isTrue);
+      expect(result?.requiredValue().id, 'receipt-json');
+      expect(result?.rawPayload, const {'id': 'receipt-json'});
+    });
+
+    test('supports decodeVersionedJson for low-level DTO waits', () async {
+      final backend = InMemoryResultBackend();
+      final stem = Stem(broker: _RecordingBroker(), backend: backend);
+
+      await backend.set(
+        'task-versioned-json-wait',
+        TaskState.succeeded,
+        payload: const {
+          PayloadCodec.versionKey: 2,
+          'id': 'receipt-versioned-json',
+        },
+      );
+
+      final result = await stem.waitForTask<_CodecReceipt>(
+        'task-versioned-json-wait',
+        decodeVersionedJson: _CodecReceipt.fromVersionedJson,
+      );
+
+      expect(result?.isSucceeded, isTrue);
+      expect(result?.requiredValue().id, 'receipt-versioned-json-v2');
+      expect(result?.rawPayload, const {
+        PayloadCodec.versionKey: 2,
+        'id': 'receipt-versioned-json',
+      });
+    });
+  });
 }
 
 ResultBackend _codecAwareBackend() {
@@ -153,14 +1010,69 @@ class _CodecReceipt {
     return _CodecReceipt(json['id']! as String);
   }
 
+  factory _CodecReceipt.fromVersionedJson(
+    Map<String, Object?> json,
+    int version,
+  ) {
+    return _CodecReceipt('${json['id']! as String}-v$version');
+  }
+
+  factory _CodecReceipt.fromV2Json(Map<String, Object?> json) {
+    return _CodecReceipt('${json['id']! as String}-v2');
+  }
+
   final String id;
 
   Map<String, Object?> toJson() => {'id': id};
 }
 
-const _codecReceiptCodec = PayloadCodec<_CodecReceipt>(
-  encode: _encodeCodecReceipt,
-  decode: _decodeCodecReceipt,
+class _EnvelopeArgs {
+  const _EnvelopeArgs(this.value);
+
+  factory _EnvelopeArgs.fromJson(Map<String, Object?> json) {
+    return _EnvelopeArgs(json['value'] as int);
+  }
+
+  factory _EnvelopeArgs.fromVersionedJson(
+    Map<String, Object?> json,
+    int version,
+  ) {
+    expect(version, 2);
+    return _EnvelopeArgs.fromJson(json);
+  }
+
+  final int value;
+}
+
+class _EnvelopeMeta {
+  const _EnvelopeMeta(this.label);
+
+  factory _EnvelopeMeta.fromJson(Map<String, Object?> json) {
+    return _EnvelopeMeta(json['label'] as String);
+  }
+
+  factory _EnvelopeMeta.fromVersionedJson(
+    Map<String, Object?> json,
+    int version,
+  ) {
+    expect(version, 2);
+    return _EnvelopeMeta.fromJson(json);
+  }
+
+  final String label;
+}
+
+const _codecReceiptCodec = PayloadCodec<_CodecReceipt>.json(
+  decode: _CodecReceipt.fromJson,
+  typeName: '_CodecReceipt',
+);
+
+const _codecReceiptRegistry = PayloadVersionRegistry<_CodecReceipt>(
+  decoders: <int, _CodecReceipt Function(Map<String, Object?>)>{
+    1: _CodecReceipt.fromJson,
+    2: _CodecReceipt.fromV2Json,
+  },
+  defaultVersion: 1,
 );
 
 const _codecReceiptEncoder = CodecTaskPayloadEncoder<_CodecReceipt>(
@@ -168,6 +1080,8 @@ const _codecReceiptEncoder = CodecTaskPayloadEncoder<_CodecReceipt>(
   codec: _codecReceiptCodec,
 );
 
+const _passthroughMapEncoder = _MapPassthroughEncoder('test.args.map');
+
 final _codecReceiptDefinition =
     TaskDefinition<Map<String, Object?>, _CodecReceipt>(
   name: 'codec.receipt',
@@ -175,21 +1089,45 @@ final _codecReceiptDefinition =
   decodeResult: _codecReceiptCodec.decode,
 );
 
-Object? _encodeCodecReceipt(_CodecReceipt value) => value.toJson();
+class _CodecTaskArgs {
+  const _CodecTaskArgs(this.value);
+
+  final String value;
 
-_CodecReceipt _decodeCodecReceipt(Object? payload) {
-  return _CodecReceipt.fromJson(Map<String, Object?>.from(payload! as Map));
+  Map<String, Object?> toJson() => {'value': value};
+}
+
+const _codecTaskArgsCodec = PayloadCodec<_CodecTaskArgs>.map(
+  encode: _encodeCodecTaskArgs,
+  decode: _decodeCodecTaskArgs,
+  typeName: '_CodecTaskArgs',
+);
+
+Object? _encodeCodecTaskArgs(_CodecTaskArgs value) => value.toJson();
+
+_CodecTaskArgs _decodeCodecTaskArgs(Object? payload) {
+  final map = Map<String, Object?>.from(payload! as Map);
+  return _CodecTaskArgs(map['value']! as String);
 }
 
 class _StubTaskHandler implements TaskHandler {
+  const _StubTaskHandler({
+    TaskOptions options = const TaskOptions(),
+    TaskMetadata metadata = const TaskMetadata(),
+  })  : _taskOptions = options,
+        _taskMetadata = metadata;
+
+  final TaskOptions _taskOptions;
+  final TaskMetadata _taskMetadata;
+
   @override
   String get name => 'sample.task';
 
   @override
-  TaskOptions get options => const TaskOptions();
+  TaskOptions get options => _taskOptions;
 
   @override
-  TaskMetadata get metadata => const TaskMetadata();
+  TaskMetadata get metadata => _taskMetadata;
 
   @override
   TaskEntrypoint? get isolateEntrypoint => null;
@@ -277,6 +1215,19 @@ class _RecordingBroker implements Broker {
   Future<void> close() async {}
 }
 
+class _MapPassthroughEncoder implements TaskPayloadEncoder {
+  const _MapPassthroughEncoder(this.id);
+
+  @override
+  final String id;
+
+  @override
+  Object? encode(Object? value) => value;
+
+  @override
+  Object? decode(Object? value) => value;
+}
+
 class _RecordingBackend implements ResultBackend {
   final List records = [];
   final Map> _controllers = {};
diff --git a/packages/stem/test/unit/core/stem_event_test.dart b/packages/stem/test/unit/core/stem_event_test.dart
index 8ad552d9..8dcc0e78 100644
--- a/packages/stem/test/unit/core/stem_event_test.dart
+++ b/packages/stem/test/unit/core/stem_event_test.dart
@@ -12,6 +12,32 @@ void main() {
       expect(event.attributes, isA<Map<String, Object?>>());
     });
 
+    test('WorkerEvent exposes typed data helpers', () {
+      final event = WorkerEvent(
+        type: WorkerEventType.completed,
+        data: const {
+          'retry': {'delayMs': 250},
+          PayloadCodec.versionKey: 2,
+          'delayMs': 250,
+        },
+      );
+
+      expect(event.dataValue('delayMs'), 250);
+      expect(event.dataValueOr('missing', 'fallback'), 'fallback');
+      expect(event.requiredDataValue('delayMs'), 250);
+      expect(
+        event.dataJson<_RetryData>(decode: _RetryData.fromJson),
+        isA<_RetryData>().having((value) => value.delayMs, 'delayMs', 250),
+      );
+      expect(
+        event.dataVersionedJson<_RetryData>(
+          version: 2,
+          decode: _RetryData.fromVersionedJson,
+        ),
+        isA<_RetryData>().having((value) => value.delayMs, 'delayMs', 250),
+      );
+    });
+
     test('QueueCustomEvent implements StemEvent contract', () {
       final event = QueueCustomEvent(
         id: 'evt-1',
@@ -42,5 +68,234 @@ void main() {
       expect(event.attributes['runId'], 'run-1');
       expect(event.attributes['stepId'], 'charge');
     });
+
+    test('WorkflowStepEvent decodes DTO result payloads', () {
+      final event = WorkflowStepEvent(
+        runId: 'run-2',
+        workflow: 'checkout',
+        stepId: 'charge',
+        type: WorkflowStepEventType.completed,
+        timestamp: DateTime.utc(2026, 2, 24, 16, 30),
+        result: const {'chargeId': 'ch_123'},
+      );
+
+      expect(
+        event.resultJson<_ChargeResult>(decode: _ChargeResult.fromJson),
+        isA<_ChargeResult>().having(
+          (value) => value.chargeId,
+          'chargeId',
+          'ch_123',
+        ),
+      );
+      expect(
+        event.resultVersionedJson<_ChargeResult>(
+          version: 2,
+          decode: _ChargeResult.fromVersionedJson,
+        ),
+        isA<_ChargeResult>().having(
+          (value) => value.chargeId,
+          'chargeId',
+          'ch_123',
+        ),
+      );
+    });
+
+    test('WorkflowStepEvent exposes typed metadata helpers', () {
+      final event = WorkflowStepEvent(
+        runId: 'run-3',
+        workflow: 'checkout',
+        stepId: 'charge',
+        type: WorkflowStepEventType.completed,
+        timestamp: DateTime.utc(2026, 2, 24, 16, 45),
+        metadata: const {
+          'worker': {
+            PayloadCodec.versionKey: 2,
+            'workerId': 'worker-1',
+          },
+          PayloadCodec.versionKey: 2,
+          'workerId': 'worker-1',
+        },
+      );
+
+      expect(event.metadataValue<Map<String, Object?>>('worker'), isNotNull);
+      expect(
+        event.metadataJson<_StepMetadata>(
+          'worker',
+          decode: _StepMetadata.fromJson,
+        ),
+        isA<_StepMetadata>().having(
+          (value) => value.workerId,
+          'workerId',
+          'worker-1',
+        ),
+      );
+      expect(
+        event.metadataVersionedJson<_StepMetadata>(
+          'worker',
+          defaultVersion: 2,
+          decode: _StepMetadata.fromVersionedJson,
+        ),
+        isA<_StepMetadata>().having(
+          (value) => value.workerId,
+          'workerId',
+          'worker-1',
+        ),
+      );
+      expect(
+        event.metadataPayloadJson<_StepMetadata>(
+          decode: _StepMetadata.fromJson,
+        ),
+        isA<_StepMetadata>().having(
+          (value) => value.workerId,
+          'workerId',
+          'worker-1',
+        ),
+      );
+      expect(
+        event.metadataPayloadVersionedJson<_StepMetadata>(
+          defaultVersion: 2,
+          decode: _StepMetadata.fromVersionedJson,
+        ),
+        isA<_StepMetadata>().having(
+          (value) => value.workerId,
+          'workerId',
+          'worker-1',
+        ),
+      );
+    });
+
+    test('WorkflowRuntimeEvent exposes typed metadata helpers', () {
+      final event = WorkflowRuntimeEvent(
+        runId: 'run-4',
+        workflow: 'checkout',
+        type: WorkflowRuntimeEventType.continuationEnqueued,
+        timestamp: DateTime.utc(2026, 2, 24, 17),
+        metadata: const {
+          'detail': {
+            PayloadCodec.versionKey: 2,
+            'reason': 'resume',
+          },
+          PayloadCodec.versionKey: 2,
+          'reason': 'resume',
+        },
+      );
+
+      expect(
+        event.metadataJson<_RuntimeMetadata>(
+          'detail',
+          decode: _RuntimeMetadata.fromJson,
+        ),
+        isA<_RuntimeMetadata>().having(
+          (value) => value.reason,
+          'reason',
+          'resume',
+        ),
+      );
+      expect(
+        event.metadataVersionedJson<_RuntimeMetadata>(
+          'detail',
+          defaultVersion: 2,
+          decode: _RuntimeMetadata.fromVersionedJson,
+        ),
+        isA<_RuntimeMetadata>().having(
+          (value) => value.reason,
+          'reason',
+          'resume',
+        ),
+      );
+      expect(
+        event.metadataPayloadJson<_RuntimeMetadata>(
+          decode: _RuntimeMetadata.fromJson,
+        ),
+        isA<_RuntimeMetadata>().having(
+          (value) => value.reason,
+          'reason',
+          'resume',
+        ),
+      );
+      expect(
+        event.metadataPayloadVersionedJson<_RuntimeMetadata>(
+          defaultVersion: 2,
+          decode: _RuntimeMetadata.fromVersionedJson,
+        ),
+        isA<_RuntimeMetadata>().having(
+          (value) => value.reason,
+          'reason',
+          'resume',
+        ),
+      );
+    });
   });
 }
+
+class _ChargeResult {
+  const _ChargeResult({required this.chargeId});
+
+  factory _ChargeResult.fromJson(Map<String, Object?> json) {
+    return _ChargeResult(chargeId: json['chargeId'] as String);
+  }
+
+  factory _ChargeResult.fromVersionedJson(
+    Map<String, Object?> json,
+    int version,
+  ) {
+    expect(version, 2);
+    return _ChargeResult(chargeId: json['chargeId'] as String);
+  }
+
+  final String chargeId;
+}
+
+class _RetryData {
+  const _RetryData({required this.delayMs});
+
+  factory _RetryData.fromJson(Map<String, Object?> json) {
+    return _RetryData(delayMs: json['delayMs'] as int);
+  }
+
+  factory _RetryData.fromVersionedJson(
+    Map<String, Object?> json,
+    int version,
+  ) {
+    expect(version, 2);
+    return _RetryData.fromJson(json);
+  }
+
+  final int delayMs;
+}
+
+class _StepMetadata {
+  const _StepMetadata({required this.workerId});
+
+  factory _StepMetadata.fromJson(Map<String, Object?> json) {
+    return _StepMetadata(workerId: json['workerId'] as String);
+  }
+
+  factory _StepMetadata.fromVersionedJson(
+    Map<String, Object?> json,
+    int version,
+  ) {
+    expect(version, 2);
+    return _StepMetadata.fromJson(json);
+  }
+
+  final String workerId;
+}
+
+class _RuntimeMetadata {
+  const _RuntimeMetadata({required this.reason});
+
+  factory _RuntimeMetadata.fromJson(Map<String, Object?> json) {
+    return _RuntimeMetadata(reason: json['reason'] as String);
+  }
+
+  factory _RuntimeMetadata.fromVersionedJson(
+    Map<String, Object?> json,
+    int version,
+  ) {
+    expect(version, 2);
+    return _RuntimeMetadata.fromJson(json);
+  }
+
+  final String reason;
+}
diff --git a/packages/stem/test/unit/core/task_context_enqueue_test.dart b/packages/stem/test/unit/core/task_context_enqueue_test.dart
index f7d6aa52..e93346ea 100644
--- a/packages/stem/test/unit/core/task_context_enqueue_test.dart
+++ b/packages/stem/test/unit/core/task_context_enqueue_test.dart
@@ -9,6 +9,39 @@ const _parentAttemptKey = 'stem.parentAttempt';
 
 void main() {
   group('TaskContext.enqueue', () {
+    test('exposes typed arg readers on the context', () async {
+      final TaskExecutionContext context = TaskContext(
+        id: 'parent-0',
+        args: const {'invoiceId': 'inv-42'},
+        attempt: 0,
+        headers: const {},
+        meta: const {},
+        heartbeat: () {},
+        extendLease: (_) async {},
+        progress: (_, {data}) async {},
+      );
+
+      expect(context.requiredArg('invoiceId'), equals('inv-42'));
+      expect(context.argOr('tenant', 'global'), equals('global'));
+    });
+
+    test('reports progress with JSON DTO payloads', () async {
+      Object? progressData;
+      final TaskExecutionContext context = TaskContext(
+        id: 'parent-0b',
+        attempt: 0,
+        headers: const {},
+        meta: const {},
+        heartbeat: () {},
+        extendLease: (_) async {},
+        progress: (_, {data}) async => progressData = data,
+      );
+
+      await context.progressJson(50, const _ProgressUpdate(stage: 'warming'));
+
+      expect(progressData, equals(const {'stage': 'warming'}));
+    });
+
     test('propagates headers/meta and lineage by default', () async {
       final enqueuer = _RecordingEnqueuer();
       final context = TaskContext(
@@ -64,6 +97,25 @@ void main() {
       expect(record.meta.containsKey(_parentAttemptKey), isFalse);
     });
 
+    test('spawn forwards notBefore', () async {
+      final enqueuer = _RecordingEnqueuer();
+      final TaskExecutionContext context = TaskContext(
+        id: 'parent-2b',
+        attempt: 0,
+        headers: const {},
+        meta: const {},
+        heartbeat: () {},
+        extendLease: (_) async {},
+        progress: (_, {data}) async {},
+        enqueuer: enqueuer,
+      );
+      final scheduledAt = DateTime.now().add(const Duration(minutes: 1));
+
+      await context.spawn('tasks.child', notBefore: scheduledAt);
+
+      expect(enqueuer.last?.notBefore, scheduledAt);
+    });
+
     test('spawn delegates to enqueue semantics', () async {
       final enqueuer = _RecordingEnqueuer();
       final context = TaskContext(
@@ -84,6 +136,38 @@ void main() {
       expect(enqueuer.records.single.args, equals({'value': 42}));
     });
 
+    test(
+      'enqueueValue encodes typed payloads through the supplied codec',
+      () async {
+        final enqueuer = _RecordingEnqueuer();
+        final context = TaskContext(
+          id: 'parent-3b',
+          attempt: 1,
+          headers: const {'x-trace-id': 'trace-2'},
+          meta: const {'tenant': 'acme'},
+          heartbeat: () {},
+          extendLease: (_) async {},
+          progress: (_, {data}) async {},
+          enqueuer: enqueuer,
+        );
+
+        await context.enqueueValue(
+          'tasks.child',
+          const _InvitePayload(email: 'ops@example.com'),
+          codec: const PayloadCodec<_InvitePayload>.json(
+            decode: _InvitePayload.fromJson,
+            typeName: '_InvitePayload',
+          ),
+        );
+
+        final record = enqueuer.last!;
+        expect(record.args, equals({'email': 'ops@example.com'}));
+        expect(record.meta[_parentTaskIdKey], equals('parent-3b'));
+        expect(record.meta[_parentAttemptKey], equals(1));
+        expect(record.headers['x-trace-id'], equals('trace-2'));
+      },
+    );
+
     test('merges headers/meta overrides with defaults', () async {
       final enqueuer = _RecordingEnqueuer();
       final context = TaskContext(
@@ -178,31 +262,20 @@ void main() {
     });
   });
 
-  group('TaskInvocationContext builder', () {
-    test('supports fluent enqueue builder API', () async {
+  group('TaskInvocationContext explicit task calls', () {
+    test('supports explicit enqueue call overrides', () async {
       final enqueuer = _RecordingEnqueuer();
-      final context = TaskInvocationContext.local(
-        id: 'invocation-1',
-        headers: const {},
-        meta: const {},
-        attempt: 0,
-        heartbeat: () {},
-        extendLease: (_) async {},
-        progress: (_, {data}) async {},
-        enqueuer: enqueuer,
-      );
       final definition = TaskDefinition<_ExampleArgs, void>(
         name: 'tasks.typed',
         encodeArgs: (args) => {'value': args.value},
       );
 
-      final builder = context.enqueueBuilder(
-        definition: definition,
-        args: const _ExampleArgs('hello'),
+      final call = definition.buildCall(
+        const _ExampleArgs('hello'),
+        options: const TaskOptions(queue: 'priority', priority: 7),
       );
-
-      await builder.queue('priority').priority(7).enqueueWith(context);
+      await enqueuer.enqueueCall(call);
 
       final record = enqueuer.last!;
       expect(record.name, equals('tasks.typed'));
@@ -211,6 +284,153 @@ void main() {
       expect(record.options.priority, equals(7));
     });
   });
+
+  group('TaskContext workflows', () {
+    test(
+      'delegates typed child workflow starts to the configured caller',
+      () async {
+        final workflows = _RecordingWorkflowCaller();
+        final context = TaskContext(
+          id: 'parent-workflow-task',
+          attempt: 0,
+          headers: const {},
+          meta: const {},
+          heartbeat: () {},
+          extendLease: (_) async {},
+          progress: (_, {data}) async {},
+          workflows: workflows,
+        );
+        final definition = WorkflowRef<Map<String, Object?>, String>(
+          name: 'workflow.child',
+          encodeParams: (params) => params,
+        );
+
+        final runId = await context.startWorkflowRef(
+          definition,
+          const {'value': 'child'},
+        );
+        final result = await context.waitForWorkflowRef(
+          runId,
+          definition,
+        );
+
+        expect(runId, 'run-1');
+        expect(workflows.lastWorkflowName, 'workflow.child');
+        expect(workflows.lastWorkflowParams, {'value': 'child'});
+        expect(workflows.waitedRunId, 'run-1');
+        expect(result?.value, 'child-result');
+      },
+    );
+
+    test('throws when no workflow caller is configured', () {
+      final context = TaskContext(
+        id: 'no-workflows',
+        attempt: 0,
+        headers: const {},
+        meta: const {},
+        heartbeat: () {},
+        extendLease: (_) async {},
+        progress: (_, {data}) async {},
+      );
+      final definition = WorkflowRef<Map<String, Object?>, String>(
+        name: 'workflow.child',
+        encodeParams: (params) => params,
+      );
+
+      expect(
+        () => context.startWorkflowRef(definition, const {'value': 'child'}),
+        throwsStateError,
+      );
+      expect(
+        () => context.waitForWorkflowRef('run-1', definition),
+        throwsStateError,
+      );
+    });
+
+    test('builds child workflow starts directly from the context', () async {
+      final workflows = _RecordingWorkflowCaller();
+      final context = TaskContext(
+        id: 'workflow-builder-task',
+        attempt: 0,
+        headers: const {},
+        meta: const {},
+        heartbeat: () {},
+        extendLease: (_) async {},
+        progress: (_, {data}) async {},
+        workflows: workflows,
+      );
+      final definition = WorkflowRef<Map<String, Object?>, String>(
+        name: 'workflow.child',
+        encodeParams: (params) => params,
+      );
+
+      final call = definition.buildStart(
+        params: const {'value': 'child'},
+        parentRunId: 'parent-task',
+      );
+      final runId = await context.startWorkflowCall(call);
+      final result = await call.definition.waitFor(context, runId);
+
+      expect(workflows.lastWorkflowName, 'workflow.child');
+      expect(workflows.lastWorkflowParams, {'value': 'child'});
+      expect(workflows.lastParentRunId, 'parent-task');
+      expect(workflows.waitedRunId, 'run-1');
+      expect(result?.value, 'child-result');
+    });
+  });
+
+  group('TaskContext workflow events', () {
+    test('delegates typed workflow events to the configured emitter', () async {
+      final workflowEvents = _RecordingWorkflowEventEmitter();
+      final context = TaskContext(
+        id: 'event-task',
+        attempt: 0,
+        headers: const {},
+        meta: const {},
+        heartbeat: () {},
+        extendLease: (_) async {},
+        progress: (_, {data}) async {},
+        workflowEvents: workflowEvents,
+      );
+      const event = WorkflowEventRef<Map<String, Object?>>(
+        topic: 'workflow.ready',
+      );
+
+      await context.emitValue('workflow.inline', const {'value': 'inline'});
+      await context.emitEvent(event, const {'value': 'event'});
+
+      expect(workflowEvents.topics, ['workflow.inline', 'workflow.ready']);
+      expect(workflowEvents.payloads, [
+        {'value': 'inline'},
+        {'value': 'event'},
+      ]);
+    });
+
+    test('throws when no workflow event emitter is configured', () {
+      final context = TaskContext(
+        id: 'no-workflow-events',
+        attempt: 0,
+        headers: const {},
+        meta: const {},
+        heartbeat: () {},
+        extendLease: (_) async {},
+        progress: (_, {data}) async {},
+      );
+
+      expect(
+        () => context.emitValue('workflow.ready', const {'value': true}),
+        throwsStateError,
+      );
+    });
+  });
+}
+
+class _ProgressUpdate {
+  const _ProgressUpdate({required this.stage});
+
+  final String stage;
+
+  Map<String, Object?> toJson() => {'stage': stage};
 }
 
 class _ExampleArgs {
@@ -225,6 +445,7 @@ class _RecordedEnqueue {
     required this.headers,
     required this.meta,
     required this.options,
+    required this.notBefore,
     required this.enqueueOptions,
   });
 
@@ -233,6 +454,7 @@ class _RecordedEnqueue {
   final Map<String, Object?> headers;
   final Map<String, Object?> meta;
   final TaskOptions options;
+  final DateTime? notBefore;
   final TaskEnqueueOptions? enqueueOptions;
 }
 
@@ -247,6 +469,7 @@ class _RecordingEnqueuer implements TaskEnqueuer {
     Map<String, Object?> args = const {},
     Map<String, Object?> headers = const {},
     TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
     Map<String, Object?> meta = const {},
     TaskEnqueueOptions? enqueueOptions,
   }) async {
@@ -257,6 +480,7 @@ class _RecordingEnqueuer implements TaskEnqueuer {
         headers: Map<String, Object?>.from(headers),
         meta: Map<String, Object?>.from(meta),
         options: options,
+        notBefore: notBefore,
         enqueueOptions: enqueueOptions,
       ),
     );
@@ -273,8 +497,143 @@ class _RecordingEnqueuer implements TaskEnqueuer {
       args: call.encodeArgs(),
       headers: call.headers,
       options: call.resolveOptions(),
+      notBefore: call.notBefore,
       meta: call.meta,
       enqueueOptions: enqueueOptions,
     );
   }
+
+  @override
+  Future<String> enqueueValue<T>(
+    String name,
+    T value, {
+    PayloadCodec<T>? codec,
+    Map<String, Object?> headers = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    Map<String, Object?> meta = const {},
+    TaskEnqueueOptions? enqueueOptions,
+  }) {
+    return enqueue(
+      name,
+      args: _encodeTestTaskArgs(name, value, codec: codec),
+      headers: headers,
+      options: options,
+      notBefore: notBefore,
+      meta: meta,
+      enqueueOptions: enqueueOptions,
+    );
+  }
+}
+
+Map<String, Object?> _encodeTestTaskArgs<T>(
+  String name,
+  T value, {
+  PayloadCodec<T>? codec,
+}) {
+  final payload = codec == null ? value : codec.encode(value);
+  if (payload is Map<String, Object?>) {
+    return Map<String, Object?>.from(payload);
+  }
+  if (payload is Map) {
+    return payload.map(
+      (key, value) => MapEntry(key.toString(), value),
+    );
+  }
+  throw StateError(
+    'Task payload for $name must encode to Map<String, Object?>, got '
+    '${payload.runtimeType}.',
+  );
+}
+
+class _InvitePayload {
+  const _InvitePayload({required this.email});
+
+  factory _InvitePayload.fromJson(Map<String, Object?> json) {
+    return _InvitePayload(email: json['email']! as String);
+  }
+
+  final String email;
+
+  Map<String, Object?> toJson() => {'email': email};
+}
+
+class _RecordingWorkflowCaller implements WorkflowCaller {
+  String? lastWorkflowName;
+  Map<String, Object?>? lastWorkflowParams;
+  String? lastParentRunId;
+  String? waitedRunId;
+
+  @override
+  Future<String> startWorkflowRef<TParams, TResult>(
+    WorkflowRef<TParams, TResult> definition,
+    TParams params, {
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+  }) async {
+    lastWorkflowName = definition.name;
+    lastWorkflowParams = definition.encodeParams(params);
+    lastParentRunId = parentRunId;
+    return 'run-1';
+  }
+
+  @override
+  Future<String> startWorkflowCall<TParams, TResult>(
+    WorkflowStartCall<TParams, TResult> call,
+  ) {
+    return startWorkflowRef(
+      call.definition,
+      call.params,
+      parentRunId: call.parentRunId,
+      ttl: call.ttl,
+      cancellationPolicy: call.cancellationPolicy,
+    );
+  }
+
+  @override
+  Future<WorkflowResult<TResult>?>
+      waitForWorkflowRef<TParams, TResult>(
+    String runId,
+    WorkflowRef<TParams, TResult> definition, {
+    Duration pollInterval = const Duration(milliseconds: 100),
+    Duration? timeout,
+  }) async {
+    waitedRunId = runId;
+    return WorkflowResult(
+      runId: runId,
+      status: WorkflowStatus.completed,
+      state: RunState(
+        id: runId,
+        workflow: definition.name,
+        status: WorkflowStatus.completed,
+        cursor: 0,
+        params: const {},
+        createdAt: DateTime.utc(2026),
+        result: 'child-result',
+      ),
+      value: definition.decode('child-result'),
+      rawResult: 'child-result',
+    );
+  }
+}
+
+class _RecordingWorkflowEventEmitter implements WorkflowEventEmitter {
+  final List<String> topics = [];
+  final List<Map<String, Object?>> payloads = <Map<String, Object?>>[];
+
+  @override
+  Future<void> emitValue<T>(
+    String topic,
+    T value, {
+    PayloadCodec<T>? codec,
+  }) async {
+    topics.add(topic);
+    payloads.add(Map<String, Object?>.from(value! as Map));
+  }
+
+  @override
+  Future<void> emitEvent<T>(WorkflowEventRef<T> event, T value) {
+    return emitValue(event.topic, value, codec: event.codec);
+  }
 }
diff --git a/packages/stem/test/unit/core/task_enqueue_builder_test.dart b/packages/stem/test/unit/core/task_enqueue_builder_test.dart
index 9760d079..b79d6a79 100644
--- a/packages/stem/test/unit/core/task_enqueue_builder_test.dart
+++ b/packages/stem/test/unit/core/task_enqueue_builder_test.dart
@@ -1,85 +1,293 @@
-import 'package:stem/src/core/contracts.dart';
+import 'package:stem/stem.dart';
 import 'package:test/test.dart';
 
 void main() {
-  test('TaskEnqueueBuilder merges headers/meta and options', () {
-    final definition = TaskDefinition<Map<String, Object?>, Object?>(
-      name: 'demo.task',
-      encodeArgs: (args) => args,
-      encodeMeta: (args) => {'from': 'definition'},
-    );
+  group('TaskCall builders', () {
+    test('buildCall stores headers/meta and options', () {
+      final definition = TaskDefinition<Map<String, Object?>, Object?>(
+        name: 'demo.task',
+        encodeArgs: (args) => args,
+        encodeMeta: (args) => {'from': 'definition'},
+      );
 
-    final call =
-        TaskEnqueueBuilder(definition: definition, args: const {'a': 1})
-            .header('h1', 'v1')
-            .meta('m1', 'v1')
-            .queue('critical')
-            .priority(5)
-            .notBefore(DateTime.parse('2025-01-01T00:00:00Z'))
-            .enqueueOptions(const TaskEnqueueOptions(addToParent: false))
-            .build();
-
-    expect(call.headers['h1'], 'v1');
-    expect(call.meta['from'], 'definition');
-    expect(call.meta['m1'], 'v1');
-    expect(call.options?.queue, 'critical');
-    expect(call.options?.priority, 5);
-    expect(call.notBefore, DateTime.parse('2025-01-01T00:00:00Z'));
-    expect(call.enqueueOptions?.addToParent, isFalse);
-  });
+      final call = definition.buildCall(
+        const {'a': 1},
+        headers: const {'h1': 'v1'},
+        options: const TaskOptions(queue: 'critical', priority: 5),
+        notBefore: DateTime.parse('2025-01-01T00:00:00Z'),
+        meta: const {'m1': 'v1'},
+        enqueueOptions: const TaskEnqueueOptions(addToParent: false),
+      );
 
-  test('TaskEnqueueBuilder delay sets notBefore in the future', () {
-    final definition = TaskDefinition<Map<String, Object?>, Object?>(
-      name: 'demo.task',
-      encodeArgs: (args) => args,
-    );
+      expect(call.headers['h1'], 'v1');
+      expect(call.meta['m1'], 'v1');
+      expect(call.options?.queue, 'critical');
+      expect(call.options?.priority, 5);
+      expect(call.notBefore, DateTime.parse('2025-01-01T00:00:00Z'));
+      expect(call.enqueueOptions?.addToParent, isFalse);
+    });
 
-    final start = DateTime.now();
-    final call = TaskEnqueueBuilder(
-      definition: definition,
-      args: const {'a': 1},
-    ).delay(const Duration(seconds: 2)).build();
+    test('buildCall preserves definition metadata by default', () {
+      final definition = TaskDefinition<Map<String, Object?>, Object?>(
+        name: 'demo.task',
+        encodeArgs: (args) => args,
+        encodeMeta: (args) => {'from': 'definition'},
+      );
 
-    expect(call.notBefore, isNotNull);
-    expect(call.notBefore!.isAfter(start), isTrue);
-  });
+      final call = definition.buildCall(const {'a': 1});
 
-  test('TaskEnqueueBuilder replaces headers, metadata, and options', () {
-    final definition = TaskDefinition<Map<String, Object?>, Object?>(
-      name: 'demo.task',
-      encodeArgs: (args) => args,
-    );
+      expect(call.meta, containsPair('from', 'definition'));
+    });
 
-    final call =
-        TaskEnqueueBuilder(definition: definition, args: const {'a': 1})
-            .headers(const {'h': 'v'})
-            .metadata(const {'m': 1})
-            .options(const TaskOptions(queue: 'q', priority: 9))
-            .build();
-
-    expect(call.headers, containsPair('h', 'v'));
-    expect(call.meta, containsPair('m', 1));
-    expect(call.options?.queue, 'q');
-    expect(call.options?.priority, 9);
-  });
+    test('buildCall accepts direct headers, metadata, and options', () {
+      final definition = TaskDefinition<Map<String, Object?>, Object?>(
+        name: 'demo.task',
+        encodeArgs: (args) => args,
+      );
 
-  test('TaskCall.copyWith updates headers and meta', () {
-    final definition = TaskDefinition<Map<String, Object?>, Object?>(
-      name: 'demo.task',
-      encodeArgs: (args) => args,
-    );
-    final call = definition.call(
-      const {'a': 1},
-      headers: const {'h': 'v'},
-      meta: const {'m': 1},
+      final
call = definition.buildCall( + const {'a': 1}, + headers: const {'h': 'v'}, + meta: const {'m': 1}, + options: const TaskOptions(queue: 'q', priority: 9), + ); + + expect(call.headers, containsPair('h', 'v')); + expect(call.meta, containsPair('m', 1)); + expect(call.options?.queue, 'q'); + expect(call.options?.priority, 9); + }); + + test('buildCall creates an explicit transport object', () { + final definition = TaskDefinition, Object?>( + name: 'demo.task', + encodeArgs: (args) => args, + ); + + final call = definition.buildCall( + const {'a': 1}, + headers: const {'h1': 'v1'}, + options: const TaskOptions(priority: 7), + ); + + expect(call.name, 'demo.task'); + expect(call.resolveOptions().priority, 7); + expect(call.headers, containsPair('h1', 'v1')); + expect(call.encodeArgs(), containsPair('a', 1)); + }); + + test( + 'TaskCall from buildCall composes with enqueueCall and typed waits', + () async { + final definition = TaskDefinition, String>( + name: 'demo.task', + encodeArgs: (args) => args, + decodeResult: (payload) => 'decoded:$payload', + ); + final caller = _RecordingTaskResultCaller(); + final call = definition.buildCall( + const {'a': 1}, + headers: const {'h1': 'v1'}, + ); + final taskId = await caller.enqueueCall(call); + final result = await call.definition.waitFor(caller, taskId); + + expect(caller.lastCall, isNotNull); + expect(caller.lastCall!.name, 'demo.task'); + expect(caller.lastCall!.headers, containsPair('h1', 'v1')); + expect(caller.waitedTaskId, 'task-1'); + expect(result?.value, 'decoded:stored'); + }, ); - final updated = call.copyWith( - headers: const {'h2': 'v2'}, - meta: const {'m2': 2}, + test('buildCall preserves enqueuer dispatch semantics', () async { + final enqueuer = _RecordingTaskEnqueuer(); + final definition = TaskDefinition, String>( + name: 'demo.task', + encodeArgs: (args) => args, + decodeResult: (payload) => 'decoded:$payload', + ); + + final taskId = await enqueuer.enqueueCall( + definition.buildCall( + const {'a': 
1}, + headers: const {'h1': 'v1'}, + options: const TaskOptions(queue: 'critical'), + ), + ); + + expect(taskId, 'task-1'); + expect(enqueuer.lastCall, isNotNull); + expect(enqueuer.lastCall!.name, 'demo.task'); + expect(enqueuer.lastCall!.headers, containsPair('h1', 'v1')); + expect(enqueuer.lastCall!.resolveOptions().queue, 'critical'); + }); + + test('TaskCall composes with typed waits', () async { + final caller = _RecordingTaskResultCaller(); + final definition = TaskDefinition, String>( + name: 'demo.task', + encodeArgs: (args) => args, + decodeResult: (payload) => 'decoded:$payload', + ); + + final call = definition.buildCall( + const {'a': 1}, + headers: const {'h1': 'v1'}, + ); + final taskId = await caller.enqueueCall(call); + final result = await call.definition.waitFor(caller, taskId); + + expect(caller.lastCall, isNotNull); + expect(caller.lastCall!.name, 'demo.task'); + expect(caller.lastCall!.headers, containsPair('h1', 'v1')); + expect(caller.waitedTaskId, 'task-1'); + expect(result?.value, 'decoded:stored'); + }); + + test('buildCall can be rebuilt with updated headers and meta', () { + final definition = TaskDefinition, Object?>( + name: 'demo.task', + encodeArgs: (args) => args, + ); + final updated = definition.buildCall( + const {'a': 1}, + headers: const {'h2': 'v2'}, + meta: const {'m2': 2}, + ); + + expect(updated.headers['h2'], 'v2'); + expect(updated.meta['m2'], 2); + }); + + test( + 'NoArgsTaskDefinition.asDefinition.buildCall builds an empty call', + () { + final definition = TaskDefinition.noArgs(name: 'demo.no_args'); + + final call = definition.asDefinition.buildCall( + (), + headers: const {'h': 'v'}, + meta: const {'m': 1}, + ); + + expect(call.name, 'demo.no_args'); + expect(call.encodeArgs(), isEmpty); + expect(call.headers, containsPair('h', 'v')); + expect(call.meta, containsPair('m', 1)); + }, ); - expect(updated.headers['h2'], 'v2'); - expect(updated.meta['m2'], 2); + test('NoArgsTaskDefinition.asDefinition.buildCall accepts 
overrides', () { + final definition = TaskDefinition.noArgs(name: 'demo.no_args'); + + final call = definition.asDefinition.buildCall( + (), + options: const TaskOptions(priority: 4), + ); + + expect(call.name, 'demo.no_args'); + expect(call.resolveOptions().priority, 4); + expect(call.encodeArgs(), isEmpty); + }); + + test( + 'NoArgsTaskDefinition.enqueue uses the TaskEnqueuer surface', + () async { + final definition = TaskDefinition.noArgs(name: 'demo.no_args'); + final enqueuer = _RecordingTaskEnqueuer(); + + final taskId = await definition.enqueue( + enqueuer, + headers: const {'h': 'v'}, + meta: const {'m': 1}, + ); + + expect(taskId, 'task-1'); + expect(enqueuer.lastCall, isNotNull); + expect(enqueuer.lastCall!.name, 'demo.no_args'); + expect(enqueuer.lastCall!.encodeArgs(), isEmpty); + expect(enqueuer.lastCall!.headers, containsPair('h', 'v')); + expect(enqueuer.lastCall!.meta, containsPair('m', 1)); + }, + ); }); } + +class _RecordingTaskEnqueuer implements TaskEnqueuer { + TaskCall? lastCall; + + @override + Future enqueue( + String name, { + Map args = const {}, + Map headers = const {}, + TaskOptions options = const TaskOptions(), + DateTime? notBefore, + Map meta = const {}, + TaskEnqueueOptions? enqueueOptions, + }) { + throw UnimplementedError('enqueue is not used in this test'); + } + + @override + Future enqueueCall( + TaskCall call, { + TaskEnqueueOptions? enqueueOptions, + }) async { + lastCall = call; + return 'task-1'; + } + + @override + Future enqueueValue( + String name, + T value, { + PayloadCodec? codec, + Map headers = const {}, + TaskOptions options = const TaskOptions(), + DateTime? notBefore, + Map meta = const {}, + TaskEnqueueOptions? enqueueOptions, + }) { + throw UnimplementedError('enqueueValue is not used in this test'); + } +} + +class _RecordingTaskResultCaller extends _RecordingTaskEnqueuer + implements TaskResultCaller { + String? 
waitedTaskId; + + @override + Future getTaskStatus(String taskId) async => null; + + @override + Future getGroupStatus(String groupId) async => null; + + @override + Future?> waitForTask( + String taskId, { + Duration? timeout, + TResult Function(Object? payload)? decode, + TResult Function(Map json)? decodeJson, + TResult Function(Map json, int version)? + decodeVersionedJson, + }) async { + waitedTaskId = taskId; + final value = + decode?.call('stored') ?? + decodeJson?.call(const {'value': 'stored'}) ?? + decodeVersionedJson?.call(const {'value': 'stored'}, 1); + return TaskResult( + taskId: taskId, + status: TaskStatus( + id: taskId, + state: TaskState.succeeded, + attempt: 0, + payload: 'stored', + ), + value: value, + rawPayload: 'stored', + ); + } +} diff --git a/packages/stem/test/unit/core/task_invocation_test.dart b/packages/stem/test/unit/core/task_invocation_test.dart index 16ab5652..735ba22a 100644 --- a/packages/stem/test/unit/core/task_invocation_test.dart +++ b/packages/stem/test/unit/core/task_invocation_test.dart @@ -1,7 +1,14 @@ import 'dart:isolate'; import 'package:stem/src/core/contracts.dart'; +import 'package:stem/src/core/payload_codec.dart'; import 'package:stem/src/core/task_invocation.dart'; +import 'package:stem/src/workflow/core/run_state.dart'; +import 'package:stem/src/workflow/core/workflow_cancellation_policy.dart'; +import 'package:stem/src/workflow/core/workflow_event_ref.dart'; +import 'package:stem/src/workflow/core/workflow_ref.dart'; +import 'package:stem/src/workflow/core/workflow_result.dart'; +import 'package:stem/src/workflow/core/workflow_status.dart'; import 'package:test/test.dart'; class _CapturingEnqueuer implements TaskEnqueuer { @@ -12,6 +19,7 @@ class _CapturingEnqueuer implements TaskEnqueuer { Map? lastMeta; TaskEnqueueOptions? lastOptions; TaskCall? lastCall; + DateTime? 
lastNotBefore; @override Future enqueue( @@ -19,12 +27,14 @@ class _CapturingEnqueuer implements TaskEnqueuer { Map args = const {}, Map headers = const {}, TaskOptions options = const TaskOptions(), + DateTime? notBefore, Map meta = const {}, TaskEnqueueOptions? enqueueOptions, }) async { lastHeaders = headers; lastMeta = meta; lastOptions = enqueueOptions; + lastNotBefore = notBefore; return _taskId; } @@ -37,9 +47,289 @@ class _CapturingEnqueuer implements TaskEnqueuer { lastOptions = enqueueOptions; return _taskId; } + + @override + Future enqueueValue( + String name, + T value, { + PayloadCodec? codec, + Map headers = const {}, + TaskOptions options = const TaskOptions(), + DateTime? notBefore, + Map meta = const {}, + TaskEnqueueOptions? enqueueOptions, + }) { + return enqueue( + name, + args: _encodeInvocationTaskArgs(name, value, codec: codec), + headers: headers, + options: options, + notBefore: notBefore, + meta: meta, + enqueueOptions: enqueueOptions, + ); + } +} + +Map _encodeInvocationTaskArgs( + String name, + T value, { + PayloadCodec? codec, +}) { + final payload = codec == null ? value : codec.encode(value); + if (payload is Map) { + return Map.from(payload); + } + if (payload is Map) { + return payload.map((key, value) => MapEntry(key.toString(), value)); + } + throw StateError( + 'Task payload for $name must encode to Map, got ' + '${payload.runtimeType}.', + ); +} + +class _CapturingWorkflowCaller implements WorkflowCaller { + String? lastWorkflowName; + Map? lastWorkflowParams; + String? waitedRunId; + + @override + Future startWorkflowRef( + WorkflowRef definition, + TParams params, { + String? parentRunId, + Duration? ttl, + WorkflowCancellationPolicy? 
cancellationPolicy, + }) async { + lastWorkflowName = definition.name; + lastWorkflowParams = definition.encodeParams(params); + return 'run-1'; + } + + @override + Future startWorkflowCall( + WorkflowStartCall call, + ) { + return startWorkflowRef( + call.definition, + call.params, + parentRunId: call.parentRunId, + ttl: call.ttl, + cancellationPolicy: call.cancellationPolicy, + ); + } + + @override + Future?> + waitForWorkflowRef( + String runId, + WorkflowRef definition, { + Duration pollInterval = const Duration(milliseconds: 100), + Duration? timeout, + }) async { + waitedRunId = runId; + return WorkflowResult( + runId: runId, + status: WorkflowStatus.completed, + state: RunState( + id: runId, + workflow: definition.name, + status: WorkflowStatus.completed, + cursor: 0, + params: const {}, + createdAt: DateTime.utc(2026), + result: 'workflow-result', + ), + value: definition.decode('workflow-result'), + rawResult: 'workflow-result', + ); + } +} + +class _CapturingWorkflowEventEmitter implements WorkflowEventEmitter { + final List topics = []; + final List> payloads = >[]; + + @override + Future emitValue( + String topic, + T value, { + PayloadCodec? codec, + }) async { + final encoded = codec != null ? codec.encode(value) : value; + payloads.add(Map.from(encoded! as Map)); + topics.add(topic); + } + + @override + Future emitEvent(WorkflowEventRef event, T value) { + return emitValue(event.topic, value, codec: event.codec); + } } void main() { + test('TaskInvocationContext.local exposes typed arg readers', () { + final TaskExecutionContext context = TaskInvocationContext.local( + id: 'task-1', + args: const {'customerId': 'cus-42'}, + headers: const {}, + meta: const {}, + attempt: 0, + heartbeat: () {}, + extendLease: (_) async {}, + progress: (_, {Map? 
data}) async {}, + ); + + expect(context.requiredArg('customerId'), equals('cus-42')); + expect(context.argOr('tenant', 'global'), equals('global')); + }); + + test('TaskExecutionContext decodes whole task arg DTOs', () { + final TaskExecutionContext context = TaskInvocationContext.local( + id: 'task-1a', + args: const { + PayloadCodec.versionKey: 2, + 'stage': 'warming', + 'update': { + PayloadCodec.versionKey: 2, + 'stage': 'warming', + }, + }, + headers: const {}, + meta: const {}, + attempt: 0, + heartbeat: () {}, + extendLease: (_) async {}, + progress: (_, {Map? data}) async {}, + ); + + expect( + context.argsJson<_ProgressUpdate>(decode: _ProgressUpdate.fromJson).stage, + 'warming', + ); + expect( + context.argsAs<_ProgressUpdate>(codec: _progressUpdateCodec).stage, + 'warming', + ); + expect( + context + .argsVersionedJson<_ProgressUpdate>( + defaultVersion: 2, + decode: _ProgressUpdate.fromVersionedJson, + ) + .stage, + 'warming', + ); + expect( + context + .argVersionedJson<_ProgressUpdate>( + 'update', + defaultVersion: 2, + decode: _ProgressUpdate.fromVersionedJson, + ) + ?.stage, + 'warming', + ); + }); + + test( + 'TaskInvocationContext.local reports progress with JSON DTO payloads', + () async { + ProgressSignal? progressSignal; + final TaskExecutionContext context = TaskInvocationContext.local( + id: 'task-1b', + headers: const {}, + meta: const {}, + attempt: 0, + heartbeat: () {}, + extendLease: (_) async {}, + progress: (percent, {Map? data}) async { + progressSignal = ProgressSignal(percent, data: data); + }, + ); + + await context.progressJson(25, const _ProgressUpdate(stage: 'warming')); + + expect(progressSignal?.data, equals(const {'stage': 'warming'})); + }, + ); + + test( + 'TaskInvocationContext.local reports progress with versioned DTO payloads', + () async { + ProgressSignal? 
progressSignal; + final TaskExecutionContext context = TaskInvocationContext.local( + id: 'task-1c', + headers: const {}, + meta: const {}, + attempt: 0, + heartbeat: () {}, + extendLease: (_) async {}, + progress: (percent, {Map? data}) async { + progressSignal = ProgressSignal(percent, data: data); + }, + ); + + await context.progressVersionedJson( + 25, + const _ProgressUpdate(stage: 'warming'), + version: 2, + ); + + expect(progressSignal?.data, equals(const { + PayloadCodec.versionKey: 2, + 'stage': 'warming', + })); + }, + ); + + test('ProgressSignal exposes typed progress metadata helpers', () { + const signal = ProgressSignal( + 50, + data: { + PayloadCodec.versionKey: 2, + 'stage': 'warming', + 'step': 2, + 'update': {'stage': 'warming'}, + }, + ); + + expect(signal.dataValue('step'), 2); + expect(signal.dataValueOr('missing', 'fallback'), 'fallback'); + expect(signal.requiredDataValue('step'), 2); + expect( + signal.dataJson<_ProgressUpdate>( + 'update', + decode: _ProgressUpdate.fromJson, + ), + isA<_ProgressUpdate>().having((value) => value.stage, 'stage', 'warming'), + ); + expect( + signal.dataVersionedJson<_ProgressUpdate>( + 'update', + version: 2, + decode: _ProgressUpdate.fromVersionedJson, + ), + isA<_ProgressUpdate>().having((value) => value.stage, 'stage', 'warming'), + ); + expect( + signal.payloadAs<_ProgressUpdate>(codec: _progressUpdateCodec), + isA<_ProgressUpdate>().having((value) => value.stage, 'stage', 'warming'), + ); + expect( + signal.payloadJson<_ProgressUpdate>(decode: _ProgressUpdate.fromJson), + isA<_ProgressUpdate>().having((value) => value.stage, 'stage', 'warming'), + ); + expect( + signal.payloadVersionedJson<_ProgressUpdate>( + version: 2, + decode: _ProgressUpdate.fromVersionedJson, + ), + isA<_ProgressUpdate>().having((value) => value.stage, 'stage', 'warming'), + ); + }); + test('TaskInvocationContext.local merges headers/meta and lineage', () async { final enqueuer = _CapturingEnqueuer('task-1'); final context = 
TaskInvocationContext.local( @@ -67,6 +357,44 @@ void main() { expect(enqueuer.lastMeta, containsPair('stem.parentAttempt', 2)); }); + test('TaskInvocationContext.local forwards notBefore', () async { + final enqueuer = _CapturingEnqueuer('task-1'); + final context = TaskInvocationContext.local( + id: 'root-task', + headers: const {}, + meta: const {}, + attempt: 0, + heartbeat: () {}, + extendLease: (_) async {}, + progress: (_, {Map? data}) async {}, + enqueuer: enqueuer, + ); + final scheduledAt = DateTime.now().add(const Duration(minutes: 5)); + + await context.enqueue('child', notBefore: scheduledAt); + + expect(enqueuer.lastNotBefore, scheduledAt); + }); + + test('TaskInvocationContext.local spawn forwards notBefore', () async { + final enqueuer = _CapturingEnqueuer('task-1'); + final TaskExecutionContext context = TaskInvocationContext.local( + id: 'root-task', + headers: const {}, + meta: const {}, + attempt: 0, + heartbeat: () {}, + extendLease: (_) async {}, + progress: (_, {Map? 
data}) async {}, + enqueuer: enqueuer, + ); + final scheduledAt = DateTime.now().add(const Duration(minutes: 5)); + + await context.spawn('child', notBefore: scheduledAt); + + expect(enqueuer.lastNotBefore, scheduledAt); + }); + test('TaskInvocationContext.local throws when enqueuer missing', () async { final context = TaskInvocationContext.local( id: 'no-enqueuer', @@ -87,7 +415,7 @@ void main() { const TaskDefinition, Object?>( name: 'demo', encodeArgs: _encodeArgs, - ).call(const {'a': 1}), + ).buildCall(const {'a': 1}), ), throwsA(isA()), ); @@ -112,7 +440,7 @@ void main() { name: 'demo.call', encodeArgs: (args) => args, ); - final call = definition.call( + final call = definition.buildCall( const {'value': 1}, headers: const {'h2': 'v2'}, meta: const {'m2': 'v2'}, @@ -128,6 +456,116 @@ void main() { }, ); + test('TaskInvocationContext.local delegates typed workflow calls', () async { + final workflows = _CapturingWorkflowCaller(); + final context = TaskInvocationContext.local( + id: 'root-task', + headers: const {}, + meta: const {}, + attempt: 1, + heartbeat: () {}, + extendLease: (_) async {}, + progress: (_, {Map? 
data}) async {}, + workflows: workflows, + ); + final definition = WorkflowRef, String>( + name: 'workflow.child', + encodeParams: (params) => params, + ); + + final runId = await context.startWorkflowRef( + definition, + const {'value': 'child'}, + ); + final result = await context.waitForWorkflowRef(runId, definition); + + expect(runId, 'run-1'); + expect(workflows.lastWorkflowName, 'workflow.child'); + expect(workflows.lastWorkflowParams, {'value': 'child'}); + expect(workflows.waitedRunId, 'run-1'); + expect(result?.value, 'workflow-result'); + }); + + test('TaskInvocationContext.local delegates typed workflow events', () async { + final workflowEvents = _CapturingWorkflowEventEmitter(); + final context = TaskInvocationContext.local( + id: 'root-task', + headers: const {}, + meta: const {}, + attempt: 1, + heartbeat: () {}, + extendLease: (_) async {}, + progress: (_, {Map? data}) async {}, + workflowEvents: workflowEvents, + ); + + await context.emitValue('workflow.inline', const {'value': 'inline'}); + await context.emitEvent(_eventRef, const _WorkflowEventPayload('event')); + + expect(workflowEvents.topics, ['workflow.inline', 'workflow.ready']); + expect(workflowEvents.payloads, [ + {'value': 'inline'}, + {'value': 'event'}, + ]); + }); + + test('isolate bridge payloads expose typed DTO decode helpers', () { + const enqueue = TaskEnqueueRequest( + name: 'task.demo', + args: {PayloadCodec.versionKey: 2, 'stage': 'warming'}, + headers: {'x-trace-id': 'trace-1'}, + options: {}, + meta: {PayloadCodec.versionKey: 2, 'label': 'queued'}, + ); + const start = StartWorkflowRequest( + workflowName: 'workflow.demo', + params: {PayloadCodec.versionKey: 2, 'value': 'child'}, + ); + const wait = WaitForWorkflowResponse( + result: {PayloadCodec.versionKey: 2, 'value': 'done'}, + ); + const emit = EmitWorkflowEventRequest( + topic: 'workflow.ready', + payload: {PayloadCodec.versionKey: 2, 'value': 'event'}, + ); + + expect( + enqueue.argsVersionedJson<_ProgressUpdate>( + 
version: 2, + decode: _ProgressUpdate.fromVersionedJson, + ).stage, + 'warming', + ); + expect( + enqueue.metaVersionedJson<_QueueLabel>( + version: 2, + decode: _QueueLabel.fromVersionedJson, + ).label, + 'queued', + ); + expect( + start.paramsVersionedJson<_WorkflowStartPayload>( + version: 2, + decode: _WorkflowStartPayload.fromVersionedJson, + ).value, + 'child', + ); + expect( + wait.resultVersionedJson<_WorkflowResultPayload>( + version: 2, + decode: _WorkflowResultPayload.fromVersionedJson, + )?.value, + 'done', + ); + expect( + emit.payloadVersionedJson<_WorkflowEventPayload>( + version: 2, + decode: _WorkflowEventPayload.fromVersionedJson, + ).value, + 'event', + ); + }); + test('TaskInvocationContext.remote sends control signals', () async { final control = ReceivePort(); addTearDown(control.close); @@ -165,6 +603,92 @@ void main() { expect(signals.any((signal) => signal is EnqueueTaskSignal), isTrue); }); + test( + 'TaskInvocationContext.remote proxies workflow start and wait', + () async { + final control = ReceivePort(); + addTearDown(control.close); + + control.listen((message) { + if (message is StartWorkflowSignal) { + message.replyPort.send( + const StartWorkflowResponse(runId: 'remote-run'), + ); + } else if (message is WaitForWorkflowSignal) { + message.replyPort.send( + WaitForWorkflowResponse( + result: WorkflowResult( + runId: message.request.runId, + status: WorkflowStatus.completed, + state: RunState( + id: message.request.runId, + workflow: message.request.workflowName, + status: WorkflowStatus.completed, + cursor: 0, + params: const {}, + createdAt: DateTime.utc(2026), + result: 'workflow-result', + ), + value: 'workflow-result', + rawResult: 'workflow-result', + ).toJson(), + ), + ); + } + }); + + final context = TaskInvocationContext.remote( + id: 'remote-task', + controlPort: control.sendPort, + headers: const {}, + meta: const {}, + attempt: 0, + ); + final definition = WorkflowRef, String>( + name: 'workflow.child', + encodeParams: 
(params) => params, + ); + + final runId = await context.startWorkflowRef( + definition, + const {'value': 'child'}, + ); + final result = await context.waitForWorkflowRef(runId, definition); + + expect(runId, 'remote-run'); + expect(result?.value, 'workflow-result'); + }, + ); + + test( + 'TaskInvocationContext.remote proxies workflow event emission', + () async { + final control = ReceivePort(); + addTearDown(control.close); + + EmitWorkflowEventRequest? request; + control.listen((message) { + if (message is EmitWorkflowEventSignal) { + request = message.request; + message.replyPort.send(const EmitWorkflowEventResponse()); + } + }); + + final context = TaskInvocationContext.remote( + id: 'remote-task', + controlPort: control.sendPort, + headers: const {}, + meta: const {}, + attempt: 0, + ); + + await context.emitEvent(_eventRef, const _WorkflowEventPayload('remote')); + + expect(request?.topic, 'workflow.ready'); + expect(request?.payload, {'value': 'remote'}); + }, + ); + test('TaskInvocationContext.remote surfaces enqueue errors', () async { final control = ReceivePort(); addTearDown(control.close); @@ -189,8 +713,64 @@ void main() { ); }); + test('TaskInvocationContext.remote surfaces workflow errors', () async { + final control = ReceivePort(); + addTearDown(control.close); + + control.listen((message) { + if (message is StartWorkflowSignal) { + message.replyPort.send( + const StartWorkflowResponse(error: 'workflow nope'), + ); + } + }); + + final context = TaskInvocationContext.remote( + id: 'remote-task', + controlPort: control.sendPort, + headers: const {}, + meta: const {}, + attempt: 0, + ); + final definition = WorkflowRef, String>( + name: 'workflow.child', + encodeParams: (params) => params, + ); + + await expectLater( + () => context.startWorkflowRef(definition, const {'value': 'child'}), + throwsA(isA()), + ); + }); + + test('TaskInvocationContext.remote surfaces workflow event errors', () async { + final control = ReceivePort(); + 
addTearDown(control.close); + + control.listen((message) { + if (message is EmitWorkflowEventSignal) { + message.replyPort.send( + const EmitWorkflowEventResponse(error: 'event nope'), + ); + } + }); + + final context = TaskInvocationContext.remote( + id: 'remote-task', + controlPort: control.sendPort, + headers: const {}, + meta: const {}, + attempt: 0, + ); + + await expectLater( + () => context.emitEvent(_eventRef, const _WorkflowEventPayload('oops')), + throwsA(isA()), + ); + }); + test('TaskInvocationContext.retry throws TaskRetryRequest', () { - final context = TaskInvocationContext.local( + final TaskExecutionContext context = TaskInvocationContext.local( id: 'retry-task', headers: const {}, meta: const {}, @@ -208,4 +788,106 @@ void main() { }); } +class _WorkflowEventPayload { + const _WorkflowEventPayload(this.value); + + factory _WorkflowEventPayload.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _WorkflowEventPayload(json['value'] as String); + } + + final String value; +} + +class _QueueLabel { + const _QueueLabel(this.label); + + factory _QueueLabel.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _QueueLabel(json['label'] as String); + } + + final String label; +} + +class _WorkflowStartPayload { + const _WorkflowStartPayload(this.value); + + factory _WorkflowStartPayload.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _WorkflowStartPayload(json['value'] as String); + } + + final String value; +} + +class _WorkflowResultPayload { + const _WorkflowResultPayload(this.value); + + factory _WorkflowResultPayload.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _WorkflowResultPayload(json['value'] as String); + } + + final String value; +} + +class _ProgressUpdate { + const _ProgressUpdate({required this.stage}); + + factory _ProgressUpdate.fromJson(Map json) { + return _ProgressUpdate(stage: json['stage'] as String); + 
} + + factory _ProgressUpdate.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _ProgressUpdate(stage: json['stage'] as String); + } + + final String stage; + + Map toJson() => {'stage': stage}; +} + +const _progressUpdateCodec = PayloadCodec<_ProgressUpdate>.json( + decode: _ProgressUpdate.fromJson, +); + +const PayloadCodec<_WorkflowEventPayload> _eventPayloadCodec = + PayloadCodec<_WorkflowEventPayload>( + encode: _encodeWorkflowEventPayload, + decode: _decodeWorkflowEventPayload, + ); + +const WorkflowEventRef<_WorkflowEventPayload> _eventRef = + WorkflowEventRef<_WorkflowEventPayload>( + topic: 'workflow.ready', + codec: _eventPayloadCodec, + ); + +Map _encodeWorkflowEventPayload(_WorkflowEventPayload value) { + return {'value': value.value}; +} + +_WorkflowEventPayload _decodeWorkflowEventPayload(Object? payload) { + return _WorkflowEventPayload( + (payload! as Map)['value']! as String, + ); +} + Map _encodeArgs(Map args) => args; diff --git a/packages/stem/test/unit/core/task_registry_test.dart b/packages/stem/test/unit/core/task_registry_test.dart index 28fd54e6..bd63b56c 100644 --- a/packages/stem/test/unit/core/task_registry_test.dart +++ b/packages/stem/test/unit/core/task_registry_test.dart @@ -192,17 +192,6 @@ void main() { expect(handler.metadata.description, 'Example task'); }); - test('retains SimpleTaskRegistry as a compatibility alias', () { - // Compatibility coverage intentionally exercises the deprecated symbol. - // ignore: deprecated_member_use_from_same_package - final registry = SimpleTaskRegistry(); - // A single plain call is clearer here than forcing a one-off cascade. 
- // ignore: cascade_invocations - registry.register(_TestHandler('legacy.task')); - - expect(registry, isA()); - expect(registry.resolve('legacy.task')?.name, 'legacy.task'); - }); }); group('TaskDefinition', () { @@ -212,7 +201,7 @@ void main() { encodeArgs: (args) => {'value': args.value}, ); - final call = definition(_Args(42)); + final call = definition.buildCall(_Args(42)); expect(call.name, 'demo.task'); expect(call.encodeArgs(), {'value': 42}); expect(call.resolveOptions(), const TaskOptions()); @@ -230,7 +219,7 @@ void main() { encodeArgs: (args) => {'value': args.value}, ); - final call = definition( + final call = definition.buildCall( _Args(99), headers: {'x-id': 'abc'}, options: const TaskOptions(queue: 'custom'), @@ -248,25 +237,20 @@ void main() { }); }); - group('TaskEnqueueBuilder', () { - test('builds TaskCall with overrides', () { + group('TaskCall', () { + test('buildCall builds TaskCall with overrides', () { final definition = TaskDefinition<_Args, void>( name: 'demo.task', encodeArgs: (args) => {'value': args.value}, ); - final builder = - TaskEnqueueBuilder<_Args, void>( - definition: definition, - args: _Args(7), - ) - ..header('x-id', 'abc') - ..meta('source', 'test') - ..priority(5) - ..queue('fast') - ..delay(const Duration(seconds: 1)); - - final call = builder.build(); + final call = definition.buildCall( + _Args(7), + headers: const {'x-id': 'abc'}, + meta: const {'source': 'test'}, + options: const TaskOptions(priority: 5, queue: 'fast'), + notBefore: stemNow().add(const Duration(seconds: 1)), + ); expect(call.headers['x-id'], 'abc'); expect(call.meta['source'], 'test'); expect(call.resolveOptions().priority, 5); diff --git a/packages/stem/test/unit/core/task_result_test.dart b/packages/stem/test/unit/core/task_result_test.dart index 6dafa8f5..d128d14d 100644 --- a/packages/stem/test/unit/core/task_result_test.dart +++ b/packages/stem/test/unit/core/task_result_test.dart @@ -1,4 +1,5 @@ import 'package:stem/src/core/contracts.dart'; 
+import 'package:stem/src/core/payload_codec.dart';
 import 'package:stem/src/core/task_result.dart';
 import 'package:test/test.dart';
@@ -37,4 +38,98 @@ void main() {
 
     expect(cancelled.isCancelled, isTrue);
   });
+
+  test('TaskResult exposes typed value helpers', () {
+    final result = TaskResult(
+      taskId: 'task-1',
+      status: TaskStatus(
+        id: 'task-1',
+        state: TaskState.succeeded,
+        attempt: 0,
+        payload: 42,
+      ),
+      value: 42,
+      rawPayload: 42,
+    );
+
+    expect(result.valueOr(7), 42);
+    expect(result.requiredValue(), 42);
+  });
+
+  test('TaskResult exposes raw payload decode helpers', () {
+    final codec = PayloadCodec<Map<String, Object?>>.map(
+      encode: (value) => value,
+      decode: (json) => json,
+      typeName: 'ReceiptMap',
+    );
+    final result = TaskResult(
+      taskId: 'task-1',
+      status: TaskStatus(
+        id: 'task-1',
+        state: TaskState.succeeded,
+        attempt: 0,
+        payload: const {'id': 'receipt-1'},
+      ),
+      rawPayload: const {'id': 'receipt-1'},
+    );
+
+    expect(
+      result.payloadAs<Map<String, Object?>>(codec: codec),
+      equals(const {'id': 'receipt-1'}),
+    );
+    expect(
+      result.payloadJson<_TaskReceipt>(
+        decode: _TaskReceipt.fromJson,
+      ),
+      isA<_TaskReceipt>().having((value) => value.id, 'id', 'receipt-1'),
+    );
+    expect(
+      result.payloadVersionedJson<_TaskReceipt>(
+        version: 2,
+        decode: _TaskReceipt.fromVersionedJson,
+      ),
+      isA<_TaskReceipt>().having((value) => value.id, 'id', 'receipt-1'),
+    );
+  });
+
+  test('TaskResult.requiredValue throws when value is absent', () {
+    final result = TaskResult(
+      taskId: 'task-1',
+      status: TaskStatus(
+        id: 'task-1',
+        state: TaskState.failed,
+        attempt: 1,
+      ),
+    );
+
+    expect(
+      result.requiredValue,
+      throwsA(
+        isA().having(
+          (error) => error.message,
+          'message',
+          contains('task-1'),
+        ),
+      ),
+    );
+    expect(result.valueOr(7), 7);
+  });
+}
+
+class _TaskReceipt {
+  const _TaskReceipt({required this.id});
+
+  factory _TaskReceipt.fromJson(Map<String, Object?> json) {
+    return _TaskReceipt(id: json['id'] as String);
+  }
+
+  factory _TaskReceipt.fromVersionedJson(
+    Map<String, Object?> json,
+    int
version, + ) { + expect(version, 2); + return _TaskReceipt(id: json['id'] as String); + } + + final String id; } diff --git a/packages/stem/test/unit/observability/heartbeat_test.dart b/packages/stem/test/unit/observability/heartbeat_test.dart index fee8c8a2..c267f0a2 100644 --- a/packages/stem/test/unit/observability/heartbeat_test.dart +++ b/packages/stem/test/unit/observability/heartbeat_test.dart @@ -30,4 +30,65 @@ void main() { expect(decoded.queues.first.inflight, equals(2)); expect(decoded.extras['host'], equals('app-01')); }); + + test('worker heartbeat exposes typed extras helpers', () { + final heartbeat = WorkerHeartbeat( + workerId: 'worker-1', + timestamp: DateTime.utc(2025), + isolateCount: 3, + inflight: 2, + queues: [QueueHeartbeat(name: 'default', inflight: 2)], + extras: const { + PayloadCodec.versionKey: 2, + 'env': 'test', + 'region': 'us-east-1', + }, + ); + + expect(heartbeat.extraValue('env'), 'test'); + expect( + heartbeat.extraValueOr('missing', 'fallback'), + 'fallback', + ); + expect(heartbeat.requiredExtraValue('region'), 'us-east-1'); + expect( + heartbeat.extrasJson<_HeartbeatExtras>( + decode: _HeartbeatExtras.fromJson, + ), + isA<_HeartbeatExtras>() + .having((value) => value.env, 'env', 'test') + .having((value) => value.region, 'region', 'us-east-1'), + ); + expect( + heartbeat.extrasVersionedJson<_HeartbeatExtras>( + version: 2, + decode: _HeartbeatExtras.fromVersionedJson, + ), + isA<_HeartbeatExtras>() + .having((value) => value.env, 'env', 'test') + .having((value) => value.region, 'region', 'us-east-1'), + ); + }); +} + +class _HeartbeatExtras { + const _HeartbeatExtras({required this.env, required this.region}); + + factory _HeartbeatExtras.fromJson(Map json) { + return _HeartbeatExtras( + env: json['env'] as String, + region: json['region'] as String, + ); + } + + factory _HeartbeatExtras.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _HeartbeatExtras.fromJson(json); + } + + final String 
env; + final String region; } diff --git a/packages/stem/test/unit/observability/logging_test.dart b/packages/stem/test/unit/observability/logging_test.dart index da513331..0d4c319b 100644 --- a/packages/stem/test/unit/observability/logging_test.dart +++ b/packages/stem/test/unit/observability/logging_test.dart @@ -1,3 +1,12 @@ +import 'package:ansicolor/ansicolor.dart' show ansiColorDisabled; +import 'package:contextual/contextual.dart' + show + LogDriver, + LogEntry, + LogRecord, + LoggerChannelSelection, + PlainTextLogFormatter, + PrettyLogFormatter; import 'package:stem/stem.dart'; import 'package:test/test.dart'; @@ -21,6 +30,74 @@ void main() { configureStemLogging(level: Level.warning); }); + test('createStemLogger defaults to a silent logger', () async { + final logger = createStemLogger()..info('default silent mode'); + + final driver = _RecordingLogDriver(); + logger.addChannel('recording', driver); + logger.channel('recording').info('explicit recording mode'); + await logger.shutdown(); + + expect(driver.entries, hasLength(1)); + }); + + test('createStemLogFormatter returns the pretty formatter', () { + final originalAnsiSetting = ansiColorDisabled; + addTearDown(() => ansiColorDisabled = originalAnsiSetting); + ansiColorDisabled = false; + + final formatter = createStemLogFormatter(StemLogFormat.pretty); + final output = formatter.format( + LogRecord( + time: DateTime.utc(2026, 3, 21, 12), + level: Level.info, + message: 'hello', + ), + ); + + expect(output, contains('\x1B[38;5;12m')); + expect(output, isNot(contains('\x1B[38;5;255m'))); + expect(formatter, isNot(isA())); + }); + + test( + 'configureStemLogging can switch the shared logger to pretty mode', + () async { + final original = stemLogger; + addTearDown(() => setStemLogger(original)); + final replacement = createStemLogger(); + final driver = _RecordingLogDriver(); + replacement.addChannel('recording', driver); + setStemLogger(replacement); + + configureStemLogging(format: 
StemLogFormat.pretty); + stemLogger.channel('recording').info('pretty shared mode'); + await stemLogger.shutdown(); + + expect(driver.entries, hasLength(1)); + expect( + createStemLogFormatter(StemLogFormat.pretty), + isNot(isA()), + ); + }, + ); + + test('configureStemLogging can keep the shared logger silent', () { + final original = stemLogger; + addTearDown(() => setStemLogger(original)); + final replacement = createStemLogger(enableConsole: true); + setStemLogger(replacement); + + configureStemLogging(enableConsole: false); + }); + + test('createStemLogFormatter returns the plain formatter', () { + expect( + createStemLogFormatter(StemLogFormat.plain), + isA(), + ); + }); + test('setStemLogger replaces the shared logger', () { final original = stemLogger; addTearDown(() => setStemLogger(original)); @@ -42,3 +119,14 @@ void main() { expect(context['queue'], 'default'); }); } + +class _RecordingLogDriver extends LogDriver { + _RecordingLogDriver() : entries = [], super('recording'); + + final List entries; + + @override + Future log(LogEntry entry) async { + entries.add(entry); + } +} diff --git a/packages/stem/test/unit/signals/payloads_test.dart b/packages/stem/test/unit/signals/payloads_test.dart index 5d55b599..66432480 100644 --- a/packages/stem/test/unit/signals/payloads_test.dart +++ b/packages/stem/test/unit/signals/payloads_test.dart @@ -2,6 +2,7 @@ import 'package:stem/src/control/control_messages.dart'; import 'package:stem/src/core/clock.dart'; import 'package:stem/src/core/contracts.dart'; import 'package:stem/src/core/envelope.dart'; +import 'package:stem/src/core/payload_codec.dart'; import 'package:stem/src/signals/payloads.dart'; import 'package:test/test.dart'; @@ -51,6 +52,19 @@ void main() { expect(postrun.taskId, equals('task-1')); expect(postrun.taskName, equals('demo.task')); expect(postrun.attempt, equals(2)); + expect( + postrun.resultJson<_TaskResultPayload>( + decode: _TaskResultPayload.fromJson, + ), + 
isA<_TaskResultPayload>().having((value) => value.ok, 'ok', isTrue), + ); + expect( + postrun.resultVersionedJson<_TaskResultPayload>( + version: 2, + decode: _TaskResultPayload.fromVersionedJson, + ), + isA<_TaskResultPayload>().having((value) => value.ok, 'ok', isTrue), + ); final retry = TaskRetryPayload( envelope: envelope, @@ -67,6 +81,25 @@ void main() { retry.attributes['nextRetryAt'], equals(DateTime.utc(2025).toIso8601String()), ); + + final success = TaskSuccessPayload( + envelope: envelope, + worker: worker, + result: const {'ok': true}, + ); + expect( + success.resultJson<_TaskResultPayload>( + decode: _TaskResultPayload.fromJson, + ), + isA<_TaskResultPayload>().having((value) => value.ok, 'ok', isTrue), + ); + expect( + success.resultVersionedJson<_TaskResultPayload>( + version: 2, + decode: _TaskResultPayload.fromVersionedJson, + ), + isA<_TaskResultPayload>().having((value) => value.ok, 'ok', isTrue), + ); }); test('control command payload timestamps are frozen at creation', () { @@ -99,4 +132,230 @@ void main() { expect(completed.occurredAt, DateTime.utc(2025, 1, 1, 0, 1)); }); }); + + test('workflow run payload exposes typed metadata helpers', () { + final payload = WorkflowRunPayload( + runId: 'run-1', + workflow: 'demo.workflow', + status: WorkflowRunStatus.suspended, + metadata: const { + 'attempt': 3, + 'approval': {'approved': true}, + }, + ); + + expect(payload.metadataValue('attempt'), 3); + expect(payload.metadataValueOr('missing', 'fallback'), 'fallback'); + expect(payload.requiredMetadataValue('attempt'), 3); + expect( + payload.metadataPayloadJson<_WorkflowRunEnvelope>( + decode: _WorkflowRunEnvelope.fromJson, + ), + isA<_WorkflowRunEnvelope>() + .having((value) => value.attempt, 'attempt', 3) + .having((value) => value.approved, 'approved', isTrue), + ); + expect( + payload.metadataPayloadVersionedJson<_WorkflowRunEnvelope>( + version: 2, + decode: _WorkflowRunEnvelope.fromVersionedJson, + ), + isA<_WorkflowRunEnvelope>() + 
.having((value) => value.attempt, 'attempt', 3) + .having((value) => value.approved, 'approved', isTrue), + ); + expect( + payload.metadataJson<_WorkflowRunMetadata>( + 'approval', + decode: _WorkflowRunMetadata.fromJson, + ), + isA<_WorkflowRunMetadata>().having( + (value) => value.approved, + 'approved', + isTrue, + ), + ); + expect( + payload.metadataVersionedJson<_WorkflowRunMetadata>( + 'approval', + version: 2, + decode: _WorkflowRunMetadata.fromVersionedJson, + ), + isA<_WorkflowRunMetadata>().having( + (value) => value.approved, + 'approved', + isTrue, + ), + ); + }); + + test('control command payload exposes typed response and error helpers', () { + const worker = WorkerInfo( + id: 'worker-1', + queues: ['default'], + broadcasts: [], + ); + final command = ControlCommandMessage( + requestId: 'req-2', + type: 'pause', + targets: const ['*'], + ); + final payload = ControlCommandCompletedPayload( + worker: worker, + command: command, + status: 'error', + response: const { + PayloadCodec.versionKey: 2, + 'queue': 'priority', + 'paused': true, + }, + error: const { + PayloadCodec.versionKey: 2, + 'code': 'pause_failed', + 'message': 'already paused', + }, + ); + + expect(payload.responseValue('queue'), 'priority'); + expect(payload.responseValueOr('missing', 'fallback'), 'fallback'); + expect(payload.requiredResponseValue('paused'), isTrue); + expect( + payload.responseJson<_ControlResponse>(decode: _ControlResponse.fromJson), + isA<_ControlResponse>() + .having((value) => value.queue, 'queue', 'priority') + .having((value) => value.paused, 'paused', isTrue), + ); + expect( + payload.responseVersionedJson<_ControlResponse>( + version: 2, + decode: _ControlResponse.fromVersionedJson, + ), + isA<_ControlResponse>() + .having((value) => value.queue, 'queue', 'priority') + .having((value) => value.paused, 'paused', isTrue), + ); + expect(payload.errorValue('code'), 'pause_failed'); + expect(payload.errorValueOr('missing', 'fallback'), 'fallback'); + 
expect(payload.requiredErrorValue('message'), 'already paused'); + expect( + payload.errorJson<_ControlError>(decode: _ControlError.fromJson), + isA<_ControlError>() + .having((value) => value.code, 'code', 'pause_failed') + .having((value) => value.message, 'message', 'already paused'), + ); + expect( + payload.errorVersionedJson<_ControlError>( + version: 2, + decode: _ControlError.fromVersionedJson, + ), + isA<_ControlError>() + .having((value) => value.code, 'code', 'pause_failed') + .having((value) => value.message, 'message', 'already paused'), + ); + }); +} + +class _TaskResultPayload { + const _TaskResultPayload({required this.ok}); + + factory _TaskResultPayload.fromJson(Map json) { + return _TaskResultPayload(ok: json['ok'] as bool); + } + + factory _TaskResultPayload.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _TaskResultPayload(ok: json['ok'] as bool); + } + + final bool ok; +} + +class _WorkflowRunMetadata { + const _WorkflowRunMetadata({required this.approved}); + + factory _WorkflowRunMetadata.fromJson(Map json) { + return _WorkflowRunMetadata(approved: json['approved'] as bool); + } + + factory _WorkflowRunMetadata.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _WorkflowRunMetadata(approved: json['approved'] as bool); + } + + final bool approved; +} + +class _WorkflowRunEnvelope { + const _WorkflowRunEnvelope({required this.attempt, required this.approved}); + + factory _WorkflowRunEnvelope.fromJson(Map json) { + final approval = Map.from( + json['approval']! 
as Map, + ); + return _WorkflowRunEnvelope( + attempt: json['attempt'] as int, + approved: approval['approved'] as bool, + ); + } + + factory _WorkflowRunEnvelope.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _WorkflowRunEnvelope.fromJson(json); + } + + final int attempt; + final bool approved; +} + +class _ControlResponse { + const _ControlResponse({required this.queue, required this.paused}); + + factory _ControlResponse.fromJson(Map json) { + return _ControlResponse( + queue: json['queue'] as String, + paused: json['paused'] as bool, + ); + } + + factory _ControlResponse.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _ControlResponse.fromJson(json); + } + + final String queue; + final bool paused; +} + +class _ControlError { + const _ControlError({required this.code, required this.message}); + + factory _ControlError.fromJson(Map json) { + return _ControlError( + code: json['code'] as String, + message: json['message'] as String, + ); + } + + factory _ControlError.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _ControlError.fromJson(json); + } + + final String code; + final String message; } diff --git a/packages/stem/test/unit/worker/task_context_enqueue_integration_test.dart b/packages/stem/test/unit/worker/task_context_enqueue_integration_test.dart index 4e442ba6..e8b2f532 100644 --- a/packages/stem/test/unit/worker/task_context_enqueue_integration_test.dart +++ b/packages/stem/test/unit/worker/task_context_enqueue_integration_test.dart @@ -13,6 +13,45 @@ final _childDefinition = TaskDefinition<_ChildArgs, void>( encodeArgs: (args) => {'value': args.value}, ); +final Flow _childWorkflow = Flow( + name: 'tasks.child.workflow', + build: (flow) { + flow.step('complete', (context) async => 'workflow-child'); + }, +); + +final NoArgsWorkflowRef _childWorkflowRef = _childWorkflow.ref0(); + +const PayloadCodec<_WorkflowEventPayload> _workflowEventPayloadCodec 
= + PayloadCodec<_WorkflowEventPayload>( + encode: _encodeWorkflowEventPayload, + decode: _decodeWorkflowEventPayload, + ); + +const WorkflowEventRef<_WorkflowEventPayload> _workflowEventRef = + WorkflowEventRef<_WorkflowEventPayload>( + topic: 'tasks.workflow.ready', + codec: _workflowEventPayloadCodec, + ); + +final Flow _waitingWorkflow = Flow( + name: 'tasks.waiting.workflow', + build: (flow) { + flow.step('wait-for-event', (context) async { + final event = context.waitForEventValue<_WorkflowEventPayload>( + _workflowEventRef.topic, + codec: _workflowEventPayloadCodec, + ); + if (event == null) { + return null; + } + return event.value; + }); + }, +); + +final NoArgsWorkflowRef _waitingWorkflowRef = _waitingWorkflow.ref0(); + void main() { group('TaskInvocationContext enqueue', () { test('enqueues from isolate entrypoint using builder', () async { @@ -56,6 +95,49 @@ void main() { await worker.shutdown(); broker.dispose(); }); + + test('starts child workflows from isolate entrypoints', () async { + final app = await StemWorkflowApp.inMemory( + tasks: [_IsolateStartWorkflowTask()], + flows: [_childWorkflow], + ); + + final taskId = await app.enqueue('tasks.isolate.start.workflow'); + final result = await app.waitForTask( + taskId, + timeout: const Duration(seconds: 3), + ); + + expect(result?.isSucceeded, isTrue); + expect(result?.value, equals('workflow-child')); + + await app.close(); + }); + + test('emits workflow events from isolate entrypoints', () async { + final app = await StemWorkflowApp.inMemory( + tasks: [_IsolateEmitWorkflowEventTask()], + flows: [_waitingWorkflow], + ); + + final runId = await _waitingWorkflowRef.start(app); + final taskId = await app.enqueue('tasks.isolate.emit.workflow.event'); + + final taskResult = await app.waitForTask( + taskId, + timeout: const Duration(seconds: 3), + ); + final workflowResult = await _waitingWorkflowRef.waitFor( + app, + runId, + timeout: const Duration(seconds: 3), + ); + + expect(taskResult?.isSucceeded, 
isTrue); + expect(workflowResult?.value, equals('workflow-ready')); + + await app.close(); + }); }); test('enqueue + execute round-trip is stable', () async { @@ -139,7 +221,9 @@ void main() { await stem.enqueue( 'tasks.primary.success', enqueueOptions: TaskEnqueueOptions( - link: [linkDefinition(const _ChildArgs('linked'))], + link: [ + linkDefinition.buildCall(const _ChildArgs('linked')), + ], ), ); @@ -196,7 +280,9 @@ void main() { await stem.enqueue( 'tasks.primary.fail', enqueueOptions: TaskEnqueueOptions( - linkError: [linkDefinition(const _ChildArgs('linked'))], + linkError: [ + linkDefinition.buildCall(const _ChildArgs('linked')), + ], ), ); @@ -426,6 +512,12 @@ class _ChildArgs { final String value; } +class _WorkflowEventPayload { + const _WorkflowEventPayload(this.value); + + final String value; +} + class _IsolateEnqueueTask implements TaskHandler { @override String get name => 'tasks.isolate.enqueue'; @@ -447,10 +539,77 @@ FutureOr _isolateEnqueueEntrypoint( TaskInvocationContext context, Map args, ) async { - final builder = context.enqueueBuilder( - definition: _childDefinition, - args: const _ChildArgs('from-isolate'), + final call = _childDefinition.buildCall( + const _ChildArgs('from-isolate'), ); - await builder.enqueueWith(context); + await context.enqueueCall(call); return null; } + +class _IsolateStartWorkflowTask implements TaskHandler { + @override + String get name => 'tasks.isolate.start.workflow'; + + @override + TaskOptions get options => const TaskOptions(); + + @override + TaskMetadata get metadata => const TaskMetadata(); + + @override + TaskEntrypoint? 
get isolateEntrypoint => _isolateStartWorkflowEntrypoint; + + @override + Future call(TaskContext context, Map args) async { + return ''; + } +} + +FutureOr _isolateStartWorkflowEntrypoint( + TaskInvocationContext context, + Map args, +) async { + final result = await _childWorkflowRef.startAndWait( + context, + timeout: const Duration(seconds: 2), + ); + return result?.value; +} + +class _IsolateEmitWorkflowEventTask implements TaskHandler { + @override + String get name => 'tasks.isolate.emit.workflow.event'; + + @override + TaskOptions get options => const TaskOptions(); + + @override + TaskMetadata get metadata => const TaskMetadata(); + + @override + TaskEntrypoint? get isolateEntrypoint => _isolateEmitWorkflowEventEntrypoint; + + @override + Future call(TaskContext context, Map args) async {} +} + +FutureOr _isolateEmitWorkflowEventEntrypoint( + TaskInvocationContext context, + Map args, +) async { + await context.emitEvent( + _workflowEventRef, + const _WorkflowEventPayload('workflow-ready'), + ); + return null; +} + +Map _encodeWorkflowEventPayload(_WorkflowEventPayload value) { + return {'value': value.value}; +} + +_WorkflowEventPayload _decodeWorkflowEventPayload(Object? payload) { + return _WorkflowEventPayload( + (payload! as Map)['value']! 
as String, + ); +} diff --git a/packages/stem/test/unit/workflow/flow_context_test.dart b/packages/stem/test/unit/workflow/flow_context_test.dart index cbee779e..af2379a8 100644 --- a/packages/stem/test/unit/workflow/flow_context_test.dart +++ b/packages/stem/test/unit/workflow/flow_context_test.dart @@ -1,6 +1,10 @@ +import 'package:stem/src/core/payload_codec.dart'; import 'package:stem/src/workflow/core/flow_context.dart'; import 'package:stem/src/workflow/core/flow_step.dart'; import 'package:stem/src/workflow/core/workflow_clock.dart'; +import 'package:stem/src/workflow/core/workflow_ref.dart'; +import 'package:stem/stem.dart' + show TaskCall, TaskEnqueueOptions, TaskEnqueuer, TaskOptions; import 'package:test/test.dart'; void main() { @@ -48,6 +52,50 @@ void main() { expect(second, isNull); }); + test('FlowContext JSON suspension helpers encode DTO payloads', () { + final context = FlowContext( + workflow: 'demo', + runId: 'run-2b', + stepName: 'wait', + params: const {}, + previousResult: null, + stepIndex: 1, + ); + + final sleep = context.sleepJson( + const Duration(seconds: 3), + const _SuspensionPayload(stage: 'sleeping'), + ); + final wait = context.awaitEventJson( + 'topic', + const _SuspensionPayload(stage: 'waiting'), + deadline: DateTime.parse('2025-01-01T00:00:00Z'), + ); + final versionedSleep = context.sleepVersionedJson( + const Duration(seconds: 4), + const _SuspensionPayload(stage: 'versioned-sleep'), + version: 2, + ); + final versionedWait = context.awaitEventVersionedJson( + 'topic.versioned', + const _SuspensionPayload(stage: 'versioned-wait'), + version: 2, + deadline: DateTime.parse('2025-01-01T00:00:01Z'), + ); + + expect(sleep.data, equals(const {'stage': 'sleeping'})); + expect(wait.data, equals(const {'stage': 'waiting'})); + expect(wait.deadline, DateTime.parse('2025-01-01T00:00:00Z')); + expect(versionedSleep.data, { + PayloadCodec.versionKey: 2, + 'stage': 'versioned-sleep', + }); + expect(versionedWait.data, { + 
PayloadCodec.versionKey: 2,
+      'stage': 'versioned-wait',
+    });
+  });
+
   test(
     'FlowContext resume data is consumed and idempotency key derives scope',
     () {
@@ -72,4 +120,174 @@ void main() {
       expect(context.idempotencyKey('custom'), 'demo/run-3/custom');
     },
   );
+
+  test('startWith throws when workflow caller support is unavailable', () {
+    final context = FlowContext(
+      workflow: 'demo',
+      runId: 'run-4',
+      stepName: 'spawn',
+      params: const {},
+      previousResult: null,
+      stepIndex: 0,
+    );
+    final childRef = WorkflowRef<Map<String, Object?>, String>(
+      name: 'child.flow',
+      encodeParams: (params) => params,
+    );
+
+    expect(
+      () => childRef.start(context, params: const {'value': 'x'}),
+      throwsStateError,
+    );
+  },
+  );
+
+  test(
+    'startAndWaitWith throws when workflow caller support is unavailable',
+    () {
+      final context = FlowContext(
+        workflow: 'demo',
+        runId: 'run-5',
+        stepName: 'spawn',
+        params: const {},
+        previousResult: null,
+        stepIndex: 0,
+      );
+      final childRef = WorkflowRef<Map<String, Object?>, String>(
+        name: 'child.flow',
+        encodeParams: (params) => params,
+      );
+
+      expect(
+        () => childRef.startAndWait(
+          context,
+          params: const {'value': 'x'},
+        ),
+        throwsStateError,
+      );
+    },
+  );
+
+  test('FlowContext.enqueue delegates to the configured enqueuer', () async {
+    final enqueuer = _RecordingEnqueuer();
+    final context = FlowContext(
+      workflow: 'demo',
+      runId: 'run-6',
+      stepName: 'dispatch',
+      params: const {},
+      previousResult: null,
+      stepIndex: 0,
+      enqueuer: enqueuer,
+    );
+
+    final taskId = await context.enqueue(
+      'tasks.child',
+      args: const {'value': 42},
+      meta: const {'source': 'flow'},
+    );
+
+    expect(taskId, equals('recorded-1'));
+    expect(enqueuer.lastName, equals('tasks.child'));
+    expect(enqueuer.lastArgs, equals({'value': 42}));
+    expect(enqueuer.lastMeta, containsPair('source', 'flow'));
+  });
+
+  test('FlowContext.enqueue throws when no enqueuer is configured', () {
+    final context = FlowContext(
+      workflow: 'demo',
+      runId: 'run-7',
+      stepName: 'dispatch',
+      params:
const {}, + previousResult: null, + stepIndex: 0, + ); + + expect(() => context.enqueue('tasks.child'), throwsStateError); + }); +} + +class _SuspensionPayload { + const _SuspensionPayload({required this.stage}); + + final String stage; + + Map toJson() => {'stage': stage}; +} + +class _RecordingEnqueuer implements TaskEnqueuer { + String? lastName; + Map? lastArgs; + Map? lastMeta; + + @override + Future enqueue( + String name, { + Map args = const {}, + Map headers = const {}, + Map meta = const {}, + TaskOptions options = const TaskOptions(), + DateTime? notBefore, + TaskEnqueueOptions? enqueueOptions, + }) async { + lastName = name; + lastArgs = Map.from(args); + lastMeta = Map.from(meta); + return 'recorded-1'; + } + + @override + Future enqueueCall( + TaskCall call, { + TaskEnqueueOptions? enqueueOptions, + }) { + return enqueue( + call.name, + args: call.encodeArgs(), + headers: call.headers, + meta: call.meta, + options: call.resolveOptions(), + notBefore: call.notBefore, + enqueueOptions: enqueueOptions ?? call.enqueueOptions, + ); + } + + @override + Future enqueueValue( + String name, + T value, { + PayloadCodec? codec, + Map headers = const {}, + Map meta = const {}, + TaskOptions options = const TaskOptions(), + DateTime? notBefore, + TaskEnqueueOptions? enqueueOptions, + }) { + return enqueue( + name, + args: _encodeFlowTaskArgs(name, value, codec: codec), + headers: headers, + meta: meta, + options: options, + notBefore: notBefore, + enqueueOptions: enqueueOptions, + ); + } +} + +Map _encodeFlowTaskArgs( + String name, + T value, { + PayloadCodec? codec, +}) { + final payload = codec == null ? 
value : codec.encode(value); + if (payload is Map) { + return Map.from(payload); + } + if (payload is Map) { + return payload.map((key, value) => MapEntry(key.toString(), value)); + } + throw StateError( + 'Task payload for $name must encode to Map, got ' + '${payload.runtimeType}.', + ); } diff --git a/packages/stem/test/unit/workflow/flow_step_test.dart b/packages/stem/test/unit/workflow/flow_step_test.dart index 2952f39a..729d0f38 100644 --- a/packages/stem/test/unit/workflow/flow_step_test.dart +++ b/packages/stem/test/unit/workflow/flow_step_test.dart @@ -1,3 +1,4 @@ +import 'package:stem/src/core/payload_codec.dart'; import 'package:stem/src/workflow/core/flow_step.dart'; import 'package:test/test.dart'; @@ -23,4 +24,78 @@ void main() { final cont = FlowStepControl.continueRun(); expect(cont.type, FlowControlType.continueRun); }); + + test('FlowStepControl JSON factories encode DTO payloads', () { + final sleep = FlowStepControl.sleepJson( + const Duration(seconds: 5), + const _SuspensionPayload(stage: 'sleeping'), + ); + final wait = FlowStepControl.awaitTopicJson( + 'topic', + const _SuspensionPayload(stage: 'waiting'), + deadline: DateTime.parse('2025-01-01T00:00:00Z'), + ); + final versionedSleep = FlowStepControl.sleepVersionedJson( + const Duration(seconds: 6), + const _SuspensionPayload(stage: 'versioned-sleep'), + version: 2, + ); + final versionedWait = FlowStepControl.awaitTopicVersionedJson( + 'versioned-topic', + const _SuspensionPayload(stage: 'versioned-wait'), + version: 2, + deadline: DateTime.parse('2025-01-01T00:00:01Z'), + ); + + expect(sleep.data, equals(const {'stage': 'sleeping'})); + expect( + sleep.dataJson<_SuspensionPayload>(decode: _SuspensionPayload.fromJson), + isA<_SuspensionPayload>().having( + (value) => value.stage, + 'stage', + 'sleeping', + ), + ); + expect( + sleep.dataVersionedJson<_SuspensionPayload>( + version: 2, + decode: _SuspensionPayload.fromVersionedJson, + ), + isA<_SuspensionPayload>().having( + (value) => 
value.stage, + 'stage', + 'sleeping', + ), + ); + expect(wait.data, equals(const {'stage': 'waiting'})); + expect(wait.deadline, DateTime.parse('2025-01-01T00:00:00Z')); + expect(versionedSleep.data, { + PayloadCodec.versionKey: 2, + 'stage': 'versioned-sleep', + }); + expect(versionedWait.data, { + PayloadCodec.versionKey: 2, + 'stage': 'versioned-wait', + }); + }); +} + +class _SuspensionPayload { + const _SuspensionPayload({required this.stage}); + + factory _SuspensionPayload.fromJson(Map json) { + return _SuspensionPayload(stage: json['stage'] as String); + } + + factory _SuspensionPayload.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _SuspensionPayload(stage: json['stage'] as String); + } + + final String stage; + + Map toJson() => {'stage': stage}; } diff --git a/packages/stem/test/unit/workflow/in_memory_event_bus_test.dart b/packages/stem/test/unit/workflow/in_memory_event_bus_test.dart index cbcaed1c..d3a4b8a4 100644 --- a/packages/stem/test/unit/workflow/in_memory_event_bus_test.dart +++ b/packages/stem/test/unit/workflow/in_memory_event_bus_test.dart @@ -14,9 +14,9 @@ class _NoopWorkflowStore implements WorkflowStore { @override Future createRun({ - String? runId, required String workflow, required Map params, + String? runId, String? parentRunId, Duration? ttl, WorkflowCancellationPolicy? 
cancellationPolicy,
diff --git a/packages/stem/test/unit/workflow/workflow_manifest_test.dart b/packages/stem/test/unit/workflow/workflow_manifest_test.dart
index e1559327..d7a7ecd2 100644
--- a/packages/stem/test/unit/workflow/workflow_manifest_test.dart
+++ b/packages/stem/test/unit/workflow/workflow_manifest_test.dart
@@ -21,15 +21,10 @@ void main() {
     expect(manifest.id, equals(firstId));
     expect(manifest.name, equals('manifest.flow'));
     expect(manifest.kind, equals(WorkflowDefinitionKind.flow));
-    expect(manifest.stepCollectionLabel, equals('steps'));
-    expect(manifest.checkpoints, hasLength(2));
     expect(manifest.steps, hasLength(2));
+    expect(manifest.checkpoints, isEmpty);
     expect(manifest.steps.first.position, equals(0));
     expect(manifest.steps.first.name, equals('first'));
-    expect(
-      manifest.steps.first.role,
-      equals(WorkflowManifestStepRole.flowStep),
-    );
     expect(manifest.steps.first.id, isNotEmpty);
     expect(manifest.steps.first.id, isNot(equals(manifest.steps.last.id)));
   });
@@ -38,41 +33,38 @@ void main() {
     final definition = WorkflowScript<Map<String, Object?>>(
       name: 'manifest.script',
       run: (script) async {
-        final email = script.params['email'] as String;
+        final email = script.params['email']!
as String; return {'email': email, 'status': 'done'}; }, checkpoints: [ - FlowStep( + WorkflowCheckpoint( name: 'create-user', title: 'Create user', - kind: WorkflowStepKind.task, taskNames: const ['user.create'], - handler: (context) async => {'id': '1'}, ), - FlowStep( + WorkflowCheckpoint( name: 'send-welcome-email', title: 'Send welcome email', - kind: WorkflowStepKind.task, taskNames: const ['email.send'], - handler: (context) async => null, ), ], ).definition; final manifest = definition.toManifestEntry(); expect(manifest.kind, equals(WorkflowDefinitionKind.script)); - expect(manifest.stepCollectionLabel, equals('checkpoints')); expect(manifest.checkpoints, hasLength(2)); - expect(manifest.steps, hasLength(2)); - expect(manifest.steps.first.name, equals('create-user')); - expect(manifest.steps.first.position, equals(0)); + expect(manifest.steps, isEmpty); + expect(manifest.checkpoints.first.name, equals('create-user')); + expect(manifest.checkpoints.first.position, equals(0)); + expect( + manifest.checkpoints.first.taskNames, + equals(const ['user.create']), + ); + expect(manifest.checkpoints.last.name, equals('send-welcome-email')); + expect(manifest.checkpoints.last.position, equals(1)); expect( - manifest.steps.first.role, - equals(WorkflowManifestStepRole.scriptCheckpoint), + manifest.checkpoints.last.taskNames, + equals(const ['email.send']), ); - expect(manifest.steps.first.taskNames, equals(const ['user.create'])); - expect(manifest.steps.last.name, equals('send-welcome-email')); - expect(manifest.steps.last.position, equals(1)); - expect(manifest.steps.last.taskNames, equals(const ['email.send'])); }); } diff --git a/packages/stem/test/unit/workflow/workflow_metadata_views_test.dart b/packages/stem/test/unit/workflow/workflow_metadata_views_test.dart index 2302703f..5a4f9f23 100644 --- a/packages/stem/test/unit/workflow/workflow_metadata_views_test.dart +++ b/packages/stem/test/unit/workflow/workflow_metadata_views_test.dart @@ -49,6 +49,27 @@ void 
main() { state.suspensionPayload, equals(const {'invoiceId': 'inv-1'}), ); + expect( + state.suspensionPayloadJson<_InvoicePayload>( + decode: _InvoicePayload.fromJson, + ), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-1', + ), + ); + expect( + state.suspensionPayloadVersionedJson<_InvoicePayload>( + version: 2, + decode: _InvoicePayload.fromVersionedJson, + ), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-1', + ), + ); }); test('exposes runtime queue and serialization metadata', () { @@ -77,6 +98,17 @@ void main() { ); expect(state.workflowParams, equals(const {'tenant': 'acme'})); + expect( + state.paramsJson<_TenantPayload>(decode: _TenantPayload.fromJson), + isA<_TenantPayload>().having((value) => value.tenant, 'tenant', 'acme'), + ); + expect( + state.paramsVersionedJson<_TenantPayload>( + version: 2, + decode: _TenantPayload.fromVersionedJson, + ), + isA<_TenantPayload>().having((value) => value.tenant, 'tenant', 'acme'), + ); expect(state.orchestrationQueue, equals('workflow')); expect(state.continuationQueue, equals('workflow-continue')); expect(state.executionQueue, equals('workflow-step')); @@ -87,6 +119,113 @@ void main() { expect(state.encryptionScope, equals('signed-envelope')); expect(state.encryptionEnabled, isTrue); expect(state.streamId, equals('invoice_run-2')); + expect( + state.runtimeJson<_RuntimePayload>(decode: _RuntimePayload.fromJson), + isA<_RuntimePayload>() + .having( + (value) => value.orchestrationQueue, + 'orchestrationQueue', + 'workflow', + ) + .having((value) => value.streamId, 'streamId', 'invoice_run-2'), + ); + expect( + state.runtimeVersionedJson<_RuntimePayload>( + version: 2, + decode: _RuntimePayload.fromVersionedJson, + ), + isA<_RuntimePayload>() + .having( + (value) => value.orchestrationQueue, + 'orchestrationQueue', + 'workflow', + ) + .having((value) => value.streamId, 'streamId', 'invoice_run-2'), + ); + }); + + test('decodes raw result 
payloads as DTOs', () { + final state = RunState( + id: 'run-3', + workflow: 'invoice', + status: WorkflowStatus.completed, + cursor: 2, + params: const {'tenant': 'acme'}, + createdAt: DateTime.utc(2026, 2, 25), + result: const {'invoiceId': 'inv-2'}, + ); + + expect( + state.resultJson<_InvoicePayload>( + decode: _InvoicePayload.fromJson, + ), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-2', + ), + ); + expect( + state.resultVersionedJson<_InvoicePayload>( + version: 2, + decode: _InvoicePayload.fromVersionedJson, + ), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-2', + ), + ); + }); + + test('decodes last-error and cancellation payloads as DTOs', () { + final state = RunState( + id: 'run-4', + workflow: 'invoice', + status: WorkflowStatus.cancelled, + cursor: 2, + params: const {'tenant': 'acme'}, + createdAt: DateTime.utc(2026, 2, 25), + lastError: const { + PayloadCodec.versionKey: 2, + 'message': 'boom', + }, + cancellationData: const { + PayloadCodec.versionKey: 2, + 'reason': 'manual', + }, + ); + + expect( + state.lastErrorJson<_WorkflowErrorPayload>( + decode: _WorkflowErrorPayload.fromJson, + ), + isA<_WorkflowErrorPayload>() + .having((value) => value.message, 'message', 'boom'), + ); + expect( + state.lastErrorVersionedJson<_WorkflowErrorPayload>( + version: 2, + decode: _WorkflowErrorPayload.fromVersionedJson, + ), + isA<_WorkflowErrorPayload>() + .having((value) => value.message, 'message', 'boom'), + ); + expect( + state.cancellationDataJson<_CancellationPayload>( + decode: _CancellationPayload.fromJson, + ), + isA<_CancellationPayload>() + .having((value) => value.reason, 'reason', 'manual'), + ); + expect( + state.cancellationDataVersionedJson<_CancellationPayload>( + version: 2, + decode: _CancellationPayload.fromVersionedJson, + ), + isA<_CancellationPayload>() + .having((value) => value.reason, 'reason', 'manual'), + ); }); }); @@ -99,10 +238,12 @@ void main() { 
createdAt: DateTime.utc(2026, 2, 25), deadline: DateTime.utc(2026, 2, 25, 0, 15), data: const { + PayloadCodec.versionKey: 2, 'type': 'event', 'iteration': 2, 'iterationStep': 'approval#2', 'payload': {'invoiceId': 'inv-1'}, + 'topic': 'invoice.approved', 'suspendedAt': '2026-02-25T00:01:00Z', 'requestedResumeAt': '2026-02-25T00:02:00Z', 'policyDeadline': '2026-02-25T00:15:00Z', @@ -113,10 +254,12 @@ void main() { stepName: 'awaitApproval', topic: 'invoice.approved', resumeData: const { + PayloadCodec.versionKey: 2, 'type': 'event', 'iteration': 2, 'iterationStep': 'approval#2', 'payload': {'invoiceId': 'inv-1'}, + 'topic': 'invoice.approved', 'deliveredAt': '2026-02-25T00:01:30Z', }, ); @@ -125,6 +268,42 @@ void main() { expect(watcher.iteration, equals(2)); expect(watcher.iterationStep, equals('approval#2')); expect(watcher.payload, equals(const {'invoiceId': 'inv-1'})); + expect( + watcher.payloadJson<_InvoicePayload>( + decode: _InvoicePayload.fromJson, + ), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-1', + ), + ); + expect( + watcher.payloadVersionedJson<_InvoicePayload>( + version: 2, + decode: _InvoicePayload.fromVersionedJson, + ), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-1', + ), + ); + expect( + watcher.dataJson<_WatcherMetadata>(decode: _WatcherMetadata.fromJson), + isA<_WatcherMetadata>() + .having((value) => value.topic, 'topic', 'invoice.approved') + .having((value) => value.invoiceId, 'invoiceId', 'inv-1'), + ); + expect( + watcher.dataVersionedJson<_WatcherMetadata>( + version: 2, + decode: _WatcherMetadata.fromVersionedJson, + ), + isA<_WatcherMetadata>() + .having((value) => value.topic, 'topic', 'invoice.approved') + .having((value) => value.invoiceId, 'invoiceId', 'inv-1'), + ); expect(watcher.suspendedAt, equals(DateTime.utc(2026, 2, 25, 0, 1))); expect( watcher.requestedResumeAt, @@ -139,6 +318,44 @@ void main() { expect(resolution.iteration, equals(2)); 
expect(resolution.iterationStep, equals('approval#2')); expect(resolution.payload, equals(const {'invoiceId': 'inv-1'})); + expect( + resolution.payloadJson<_InvoicePayload>( + decode: _InvoicePayload.fromJson, + ), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-1', + ), + ); + expect( + resolution.payloadVersionedJson<_InvoicePayload>( + version: 2, + decode: _InvoicePayload.fromVersionedJson, + ), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-1', + ), + ); + expect( + resolution.resumeDataJson<_WatcherMetadata>( + decode: _WatcherMetadata.fromJson, + ), + isA<_WatcherMetadata>() + .having((value) => value.topic, 'topic', 'invoice.approved') + .having((value) => value.invoiceId, 'invoiceId', 'inv-1'), + ); + expect( + resolution.resumeDataVersionedJson<_WatcherMetadata>( + version: 2, + decode: _WatcherMetadata.fromVersionedJson, + ), + isA<_WatcherMetadata>() + .having((value) => value.topic, 'topic', 'invoice.approved') + .having((value) => value.invoiceId, 'invoiceId', 'inv-1'), + ); expect( resolution.deliveredAt, equals(DateTime.utc(2026, 2, 25, 0, 1, 30)), @@ -150,15 +367,332 @@ void main() { test('parses base name and iteration suffix', () { const step = WorkflowStepEntry( name: 'approval#3', - value: 'ok', + value: {'invoiceId': 'inv-3'}, position: 2, ); - const plain = WorkflowStepEntry(name: 'finalize', value: null, position: 3); + const plain = WorkflowStepEntry( + name: 'finalize', + value: null, + position: 3, + ); expect(step.baseName, equals('approval')); expect(step.iteration, equals(3)); + expect( + step.valueJson<_InvoicePayload>( + decode: _InvoicePayload.fromJson, + ), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-3', + ), + ); + expect( + step.valueVersionedJson<_InvoicePayload>( + version: 2, + decode: _InvoicePayload.fromVersionedJson, + ), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-3', 
+ ), + ); expect(plain.baseName, equals('finalize')); expect(plain.iteration, isNull); }); }); + + group('Workflow view decode helpers', () { + test('decodes run results and suspension payloads as DTOs', () { + final state = RunState( + id: 'run-view-1', + workflow: 'invoice', + status: WorkflowStatus.suspended, + cursor: 2, + params: const { + PayloadCodec.versionKey: 2, + 'invoiceId': 'inv-params', + 'tenant': 'acme', + '__stem.workflow.runtime': { + PayloadCodec.versionKey: 2, + 'workflowId': 'abc123', + 'orchestrationQueue': 'workflow', + 'continuationQueue': 'workflow-continue', + 'executionQueue': 'workflow-step', + 'serializationFormat': 'json', + 'serializationVersion': '1', + 'frameFormat': 'stem-envelope', + 'frameVersion': '1', + 'encryptionScope': 'signed-envelope', + 'encryptionEnabled': true, + 'streamId': 'invoice_run-2', + }, + }, + createdAt: DateTime.utc(2026, 2, 25), + result: const {'invoiceId': 'inv-4'}, + lastError: const { + PayloadCodec.versionKey: 2, + 'message': 'boom', + }, + suspensionData: const { + 'type': 'event', + 'payload': {'invoiceId': 'inv-5'}, + }, + ); + final view = WorkflowRunView.fromState(state); + + expect( + view.paramsJson<_InvoicePayload>(decode: _InvoicePayload.fromJson), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-params', + ), + ); + expect( + view.paramsVersionedJson<_InvoicePayload>( + version: 2, + decode: _InvoicePayload.fromVersionedJson, + ), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-params', + ), + ); + expect( + view.resultJson<_InvoicePayload>(decode: _InvoicePayload.fromJson), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-4', + ), + ); + expect( + view.resultVersionedJson<_InvoicePayload>( + version: 2, + decode: _InvoicePayload.fromVersionedJson, + ), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-4', + ), + ); + expect( + 
view.suspensionPayloadJson<_InvoicePayload>( + decode: _InvoicePayload.fromJson, + ), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-5', + ), + ); + expect( + view.suspensionPayloadVersionedJson<_InvoicePayload>( + version: 2, + decode: _InvoicePayload.fromVersionedJson, + ), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-5', + ), + ); + expect( + view.lastErrorJson<_WorkflowErrorPayload>( + decode: _WorkflowErrorPayload.fromJson, + ), + isA<_WorkflowErrorPayload>() + .having((value) => value.message, 'message', 'boom'), + ); + expect( + view.lastErrorVersionedJson<_WorkflowErrorPayload>( + version: 2, + decode: _WorkflowErrorPayload.fromVersionedJson, + ), + isA<_WorkflowErrorPayload>() + .having((value) => value.message, 'message', 'boom'), + ); + expect( + view.runtimeJson<_RuntimePayload>(decode: _RuntimePayload.fromJson), + isA<_RuntimePayload>() + .having( + (value) => value.orchestrationQueue, + 'orchestrationQueue', + 'workflow', + ) + .having((value) => value.streamId, 'streamId', 'invoice_run-2'), + ); + expect( + view.runtimeVersionedJson<_RuntimePayload>( + version: 2, + decode: _RuntimePayload.fromVersionedJson, + ), + isA<_RuntimePayload>() + .having( + (value) => value.orchestrationQueue, + 'orchestrationQueue', + 'workflow', + ) + .having((value) => value.streamId, 'streamId', 'invoice_run-2'), + ); + }); + + test('decodes checkpoint values as DTOs', () { + const entry = WorkflowStepEntry( + name: 'approval#1', + value: {'invoiceId': 'inv-6'}, + position: 1, + ); + final view = WorkflowCheckpointView.fromEntry( + runId: 'run-view-2', + workflow: 'invoice', + entry: entry, + ); + + expect( + view.valueJson<_InvoicePayload>(decode: _InvoicePayload.fromJson), + isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-6', + ), + ); + expect( + view.valueVersionedJson<_InvoicePayload>( + version: 2, + decode: _InvoicePayload.fromVersionedJson, + ), + 
isA<_InvoicePayload>().having( + (value) => value.invoiceId, + 'invoiceId', + 'inv-6', + ), + ); + }); + }); +} + +class _InvoicePayload { + const _InvoicePayload({required this.invoiceId}); + + factory _InvoicePayload.fromJson(Map json) { + return _InvoicePayload(invoiceId: json['invoiceId'] as String); + } + + factory _InvoicePayload.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _InvoicePayload(invoiceId: json['invoiceId'] as String); + } + + final String invoiceId; +} + +class _TenantPayload { + const _TenantPayload({required this.tenant}); + + factory _TenantPayload.fromJson(Map json) { + return _TenantPayload(tenant: json['tenant'] as String); + } + + factory _TenantPayload.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _TenantPayload.fromJson(json); + } + + final String tenant; +} + +class _WatcherMetadata { + const _WatcherMetadata({required this.topic, required this.invoiceId}); + + factory _WatcherMetadata.fromJson(Map json) { + final payload = json['payload'] as Map; + return _WatcherMetadata( + topic: json['topic'] as String, + invoiceId: payload['invoiceId'] as String, + ); + } + + factory _WatcherMetadata.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _WatcherMetadata.fromJson(json); + } + + final String topic; + final String invoiceId; +} + +class _RuntimePayload { + const _RuntimePayload({ + required this.orchestrationQueue, + required this.streamId, + }); + + factory _RuntimePayload.fromJson(Map json) { + return _RuntimePayload( + orchestrationQueue: json['orchestrationQueue'] as String, + streamId: json['streamId'] as String, + ); + } + + factory _RuntimePayload.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _RuntimePayload.fromJson(json); + } + + final String orchestrationQueue; + final String streamId; +} + +class _WorkflowErrorPayload { + const _WorkflowErrorPayload({required this.message}); + + factory 
_WorkflowErrorPayload.fromJson(Map json) { + return _WorkflowErrorPayload(message: json['message'] as String); + } + + factory _WorkflowErrorPayload.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _WorkflowErrorPayload.fromJson(json); + } + + final String message; +} + +class _CancellationPayload { + const _CancellationPayload({required this.reason}); + + factory _CancellationPayload.fromJson(Map json) { + return _CancellationPayload(reason: json['reason'] as String); + } + + factory _CancellationPayload.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _CancellationPayload.fromJson(json); + } + + final String reason; } diff --git a/packages/stem/test/unit/workflow/workflow_result_test.dart b/packages/stem/test/unit/workflow/workflow_result_test.dart index ac12143b..24ada74c 100644 --- a/packages/stem/test/unit/workflow/workflow_result_test.dart +++ b/packages/stem/test/unit/workflow/workflow_result_test.dart @@ -1,6 +1,4 @@ -import 'package:stem/src/workflow/core/run_state.dart'; -import 'package:stem/src/workflow/core/workflow_result.dart'; -import 'package:stem/src/workflow/core/workflow_status.dart'; +import 'package:stem/stem.dart'; import 'package:test/test.dart'; void main() { @@ -25,4 +23,116 @@ void main() { expect(result.isCompleted, isTrue); expect(result.isFailed, isFalse); }); + + test('WorkflowResult exposes typed value helpers', () { + final state = RunState( + id: 'run-1', + workflow: 'demo', + status: WorkflowStatus.completed, + cursor: 0, + params: const {}, + createdAt: DateTime.utc(2025), + updatedAt: DateTime.utc(2025), + ); + final result = WorkflowResult( + runId: 'run-1', + status: WorkflowStatus.completed, + state: state, + value: 42, + rawResult: 42, + ); + + expect(result.valueOr(7), 42); + expect(result.requiredValue(), 42); + }); + + test('WorkflowResult exposes raw payload decode helpers', () { + final state = RunState( + id: 'run-1', + workflow: 'demo', + status: 
WorkflowStatus.completed, + cursor: 0, + params: const {}, + createdAt: DateTime.utc(2025), + updatedAt: DateTime.utc(2025), + ); + final codec = PayloadCodec<Map<String, Object?>>.map( + encode: (value) => value, + decode: (json) => json, + typeName: 'ReceiptMap', + ); + final result = WorkflowResult( + runId: 'run-1', + status: WorkflowStatus.completed, + state: state, + rawResult: const {'id': 'receipt-1'}, + ); + + expect( + result.payloadAs<Map<String, Object?>>(codec: codec), + equals(const {'id': 'receipt-1'}), + ); + expect( + result.payloadJson<_WorkflowReceipt>( + decode: _WorkflowReceipt.fromJson, + ), + isA<_WorkflowReceipt>() + .having((value) => value.id, 'id', 'receipt-1'), + ); + expect( + result.payloadVersionedJson<_WorkflowReceipt>( + version: 2, + decode: _WorkflowReceipt.fromVersionedJson, + ), + isA<_WorkflowReceipt>() + .having((value) => value.id, 'id', 'receipt-1'), + ); + }); + + test('WorkflowResult.requiredValue throws when value is absent', () { + final state = RunState( + id: 'run-1', + workflow: 'demo', + status: WorkflowStatus.failed, + cursor: 0, + params: const {}, + createdAt: DateTime.utc(2025), + updatedAt: DateTime.utc(2025), + ); + final result = WorkflowResult( + runId: 'run-1', + status: WorkflowStatus.failed, + state: state, + ); + + expect( + result.requiredValue, + throwsA( + isA().having( + (error) => error.message, + 'message', + contains('run-1'), + ), + ), + ); + expect(result.valueOr(7), 7); + }); +} + +class _WorkflowReceipt { + const _WorkflowReceipt({required this.id}); + + factory _WorkflowReceipt.fromJson(Map json) { + return _WorkflowReceipt(id: json['id'] as String); + } + + factory _WorkflowReceipt.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _WorkflowReceipt(id: json['id'] as String); + } + + final String id; } diff --git a/packages/stem/test/unit/workflow/workflow_resume_test.dart b/packages/stem/test/unit/workflow/workflow_resume_test.dart index 6c8c3dcd..bab01090 100644 --- 
a/packages/stem/test/unit/workflow/workflow_resume_test.dart +++ b/packages/stem/test/unit/workflow/workflow_resume_test.dart @@ -1,7 +1,16 @@ +import 'dart:async'; + import 'package:stem/src/core/contracts.dart'; import 'package:stem/src/core/payload_codec.dart'; import 'package:stem/src/workflow/core/flow_context.dart'; +import 'package:stem/src/workflow/core/flow_step.dart'; +import 'package:stem/src/workflow/core/workflow_cancellation_policy.dart'; +import 'package:stem/src/workflow/core/workflow_event_ref.dart'; +import 'package:stem/src/workflow/core/workflow_execution_context.dart'; +import 'package:stem/src/workflow/core/workflow_ref.dart'; +import 'package:stem/src/workflow/core/workflow_result.dart'; import 'package:stem/src/workflow/core/workflow_resume.dart'; +import 'package:stem/src/workflow/core/workflow_resume_context.dart'; import 'package:stem/src/workflow/core/workflow_script_context.dart'; import 'package:test/test.dart'; @@ -40,13 +49,1068 @@ void main() { expect(value!['approvedBy'], 'gateway'); expect(context.takeResumeValue<Map<String, Object?>>(), isNull); }); + + test('FlowContext.takeResumeJson decodes DTO payloads', () { + final context = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'wait', + params: const {}, + previousResult: null, + stepIndex: 0, + resumeData: const {'message': 'approved'}, + ); + + final value = context.takeResumeJson<_ResumePayload>( + decode: _ResumePayload.fromJson, + ); + + expect(value, isNotNull); + expect(value!.message, 'approved'); + expect( + context.takeResumeJson<_ResumePayload>( + decode: _ResumePayload.fromJson, + ), + isNull, + ); + }); + + test( + 'FlowContext.takeResumeVersionedJson decodes versioned DTO payloads', + () { + final context = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'wait', + params: const {}, + previousResult: null, + stepIndex: 0, + resumeData: const { + PayloadCodec.versionKey: 2, + 'message': 'approved', + }, + ); + + final value = 
context.takeResumeVersionedJson<_ResumePayload>( + defaultVersion: 2, + decode: _ResumePayload.fromVersionedJson, + ); + + expect(value, isNotNull); + expect(value!.message, 'approved'); + expect( + context.takeResumeVersionedJson<_ResumePayload>( + defaultVersion: 2, + decode: _ResumePayload.fromVersionedJson, + ), + isNull, + ); + }, + ); + + test( + 'WorkflowExecutionContext.previousValue reads typed previous results', + () { + final flowContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'tail', + params: const {}, + previousResult: 'approved', + stepIndex: 1, + ); + final scriptContext = _FakeWorkflowScriptStepContext( + previousResult: 'emailed', + ); + + expect(flowContext.previousValue(), 'approved'); + expect(scriptContext.previousValue(), 'emailed'); + }, + ); + + test( + 'WorkflowExecutionContext.requiredParam reads typed workflow params', + () { + final flowContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'draft', + params: const {'documentId': 'doc-42'}, + previousResult: null, + stepIndex: 0, + ); + final scriptContext = _FakeWorkflowScriptStepContext( + params: const {'documentId': 'doc-43'}, + ); + + expect(flowContext.requiredParam('documentId'), 'doc-42'); + expect(scriptContext.requiredParam('documentId'), 'doc-43'); + }, + ); + + test( + 'WorkflowScriptContext.requiredParam decodes codec-backed workflow params', + () { + final context = _FakeWorkflowScriptContext( + params: const { + 'payload': {'message': 'approved'}, + }, + ); + + final value = context.requiredParam<_ResumePayload>( + 'payload', + codec: _resumePayloadCodec, + ); + + expect(value.message, 'approved'); + }, + ); + + test( + 'workflow contexts decode whole workflow param DTOs', + () { + final flowContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'draft', + params: const { + PayloadCodec.versionKey: 2, + 'message': 'approved', + 'payload': { + PayloadCodec.versionKey: 2, + 'message': 'approved', + }, + }, + 
previousResult: null, + stepIndex: 0, + ); + final scriptContext = _FakeWorkflowScriptContext( + params: const { + PayloadCodec.versionKey: 2, + 'message': 'queued', + 'payload': { + PayloadCodec.versionKey: 2, + 'message': 'queued', + }, + }, + ); + + expect( + flowContext + .paramsJson<_ResumePayload>( + decode: _ResumePayload.fromJson, + ) + .message, + 'approved', + ); + expect( + flowContext + .paramsVersionedJson<_ResumePayload>( + defaultVersion: 2, + decode: _ResumePayload.fromVersionedJson, + ) + .message, + 'approved', + ); + expect( + flowContext + .paramVersionedJson<_ResumePayload>( + 'payload', + defaultVersion: 2, + decode: _ResumePayload.fromVersionedJson, + ) + ?.message, + 'approved', + ); + expect( + scriptContext + .paramsAs<_ResumePayload>(codec: _resumePayloadCodec) + .message, + 'queued', + ); + expect( + scriptContext + .paramsVersionedJson<_ResumePayload>( + defaultVersion: 2, + decode: _ResumePayload.fromVersionedJson, + ) + .message, + 'queued', + ); + expect( + scriptContext + .paramVersionedJson<_ResumePayload>( + 'payload', + defaultVersion: 2, + decode: _ResumePayload.fromVersionedJson, + ) + ?.message, + 'queued', + ); + }, + ); + + test( + 'WorkflowExecutionContext.requiredPreviousValue ' + 'decodes codec-backed values', + () { + final flowContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'tail', + params: const {}, + previousResult: const {'message': 'approved'}, + stepIndex: 1, + ); + + final value = flowContext.requiredPreviousValue<_ResumePayload>( + codec: _resumePayloadCodec, + ); + + expect(value.message, 'approved'); + }, + ); + + test( + 'WorkflowExecutionContext.requiredPreviousJson decodes DTO values', + () { + final flowContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'tail', + params: const {}, + previousResult: const {'message': 'approved'}, + stepIndex: 1, + ); + + final value = flowContext.requiredPreviousJson<_ResumePayload>( + decode: _ResumePayload.fromJson, + ); + + 
expect(value.message, 'approved'); + }, + ); + + test( + 'WorkflowExecutionContext.requiredPreviousVersionedJson decodes DTO values', + () { + final flowContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'tail', + params: const {}, + previousResult: const { + PayloadCodec.versionKey: 2, + 'message': 'approved', + }, + stepIndex: 1, + ); + + final value = flowContext.requiredPreviousVersionedJson<_ResumePayload>( + defaultVersion: 2, + decode: _ResumePayload.fromVersionedJson, + ); + + expect(value.message, 'approved'); + }, + ); + + test('FlowContext.sleepUntilResumed suspends once then resumes', () { + final firstContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'wait', + params: const {}, + previousResult: null, + stepIndex: 0, + ); + + final firstResult = firstContext.sleepUntilResumed( + const Duration(seconds: 1), + data: const {'phase': 'initial'}, + ); + + expect(firstResult, isFalse); + final control = firstContext.takeControl(); + expect(control, isNotNull); + expect(control!.type, FlowControlType.sleep); + + final resumedContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'wait', + params: const {}, + previousResult: null, + stepIndex: 0, + resumeData: true, + ); + + final resumed = resumedContext.sleepUntilResumed( + const Duration(seconds: 1), + ); + + expect(resumed, isTrue); + expect(resumedContext.takeControl(), isNull); + }); + + test('FlowContext.sleepFor uses named args and throws suspension signal', () { + final firstContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'wait', + params: const {}, + previousResult: null, + stepIndex: 0, + ); + + expect( + () => firstContext.sleepFor(duration: const Duration(seconds: 1)), + throwsA(isA()), + ); + expect(firstContext.takeControl()?.type, FlowControlType.sleep); + + final resumedContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'wait', + params: const {}, + previousResult: null, + stepIndex: 
0, + resumeData: true, + ); + + expect( + resumedContext.sleepFor(duration: const Duration(seconds: 1)), + completes, + ); + expect(resumedContext.takeControl(), isNull); + }); + + test( + 'FlowContext.waitForEventValue registers watcher then decodes payload', + () { + final firstContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'wait', + params: const {}, + previousResult: null, + stepIndex: 0, + ); + + final firstResult = firstContext.waitForEventValue<_ResumePayload>( + 'demo.event', + codec: _resumePayloadCodec, + ); + + expect(firstResult, isNull); + final control = firstContext.takeControl(); + expect(control, isNotNull); + expect(control!.type, FlowControlType.waitForEvent); + expect(control.topic, 'demo.event'); + + final resumedContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'wait', + params: const {}, + previousResult: null, + stepIndex: 0, + resumeData: const {'message': 'approved'}, + ); + + final resumed = resumedContext.waitForEventValue<_ResumePayload>( + 'demo.event', + codec: _resumePayloadCodec, + ); + + expect(resumed, isNotNull); + expect(resumed!.message, 'approved'); + expect(resumedContext.takeControl(), isNull); + }, + ); + + test( + 'FlowContext.waitForEventValueJson registers watcher ' + 'then decodes DTO payload', + () { + final firstContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'wait', + params: const {}, + previousResult: null, + stepIndex: 0, + ); + + final firstResult = firstContext.waitForEventValueJson<_ResumePayload>( + 'demo.event', + decode: _ResumePayload.fromJson, + ); + + expect(firstResult, isNull); + final control = firstContext.takeControl(); + expect(control, isNotNull); + expect(control!.type, FlowControlType.waitForEvent); + expect(control.topic, 'demo.event'); + + final resumedContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'wait', + params: const {}, + previousResult: null, + stepIndex: 0, + resumeData: const {'message': 
'approved'}, + ); + + final resumed = resumedContext.waitForEventValueJson<_ResumePayload>( + 'demo.event', + decode: _ResumePayload.fromJson, + ); + + expect(resumed, isNotNull); + expect(resumed!.message, 'approved'); + expect(resumedContext.takeControl(), isNull); + }, + ); + + test( + 'FlowContext.waitForEventValueVersionedJson registers watcher ' + 'then decodes DTO payload', + () { + final firstContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'wait', + params: const {}, + previousResult: null, + stepIndex: 0, + ); + + final firstResult = + firstContext.waitForEventValueVersionedJson<_ResumePayload>( + 'demo.event', + defaultVersion: 2, + decode: _ResumePayload.fromVersionedJson, + ); + + expect(firstResult, isNull); + final control = firstContext.takeControl(); + expect(control, isNotNull); + expect(control!.type, FlowControlType.waitForEvent); + expect(control.topic, 'demo.event'); + + final resumedContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'wait', + params: const {}, + previousResult: null, + stepIndex: 0, + resumeData: const { + PayloadCodec.versionKey: 2, + 'message': 'approved', + }, + ); + + final resumed = + resumedContext.waitForEventValueVersionedJson<_ResumePayload>( + 'demo.event', + defaultVersion: 2, + decode: _ResumePayload.fromVersionedJson, + ); + + expect(resumed, isNotNull); + expect(resumed!.message, 'approved'); + expect(resumedContext.takeControl(), isNull); + }, + ); + + test('WorkflowEventRef.waitValue reuses topic and codec for flows', () { + const event = WorkflowEventRef<_ResumePayload>( + topic: 'demo.event', + codec: _resumePayloadCodec, + ); + final firstContext = FlowContext( + workflow: 'demo', + runId: 'run-1', + stepName: 'wait', + params: const {}, + previousResult: null, + stepIndex: 0, + ); + + final firstResult = event.waitValue(firstContext); + + expect(firstResult, isNull); + final control = firstContext.takeControl(); + expect(control, isNotNull); + 
+      expect(control!.topic, 'demo.event');
+
+      final resumedContext = FlowContext(
+        workflow: 'demo',
+        runId: 'run-1',
+        stepName: 'wait',
+        params: const {},
+        previousResult: null,
+        stepIndex: 0,
+        resumeData: const {'message': 'approved'},
+      );
+
+      final resumed = event.waitValue(resumedContext);
+      expect(resumed?.message, 'approved');
+  });
+
+  test(
+    'WorkflowEventRef.wait uses named args and resumes with payload in flows',
+    () {
+      const event = WorkflowEventRef<_ResumePayload>(
+        topic: 'demo.event',
+        codec: _resumePayloadCodec,
+      );
+      final waiting = FlowContext(
+        workflow: 'demo',
+        runId: 'run-1',
+        stepName: 'wait',
+        params: const {},
+        previousResult: null,
+        stepIndex: 0,
+      );
+
+      expect(
+        () => event.wait(waiting),
+        throwsA(isA()),
+      );
+      expect(waiting.takeControl()?.topic, 'demo.event');
+
+      final resumed = FlowContext(
+        workflow: 'demo',
+        runId: 'run-1',
+        stepName: 'wait',
+        params: const {},
+        previousResult: null,
+        stepIndex: 0,
+        resumeData: const {'message': 'approved'},
+      );
+
+      expect(
+        event.wait(resumed),
+        completion(
+          isA<_ResumePayload>().having(
+            (value) => value.message,
+            'message',
+            'approved',
+          ),
+        ),
+      );
+    },
+  );
+
+  test('FlowContext.waitForEvent uses named args and resumes with payload', () {
+    final waiting = FlowContext(
+      workflow: 'demo',
+      runId: 'run-1',
+      stepName: 'wait',
+      params: const {},
+      previousResult: null,
+      stepIndex: 0,
+    );
+
+    expect(
+      () => waiting.waitForEvent<_ResumePayload>(
+        topic: 'demo.event',
+        codec: _resumePayloadCodec,
+      ),
+      throwsA(isA()),
+    );
+    expect(waiting.takeControl()?.topic, 'demo.event');
+
+    final resumed = FlowContext(
+      workflow: 'demo',
+      runId: 'run-1',
+      stepName: 'wait',
+      params: const {},
+      previousResult: null,
+      stepIndex: 0,
+      resumeData: const {'message': 'approved'},
+    );
+
+    expect(
+      resumed.waitForEvent<_ResumePayload>(
+        topic: 'demo.event',
+        codec: _resumePayloadCodec,
+      ),
+      completion(
+        isA<_ResumePayload>().having(
+          (value) => value.message,
+          'message',
+          'approved',
+        ),
+      ),
+    );
+  });
+
+  test(
+    'FlowContext.waitForEventJson uses named args and resumes with DTO payload',
+    () {
+      final waiting = FlowContext(
+        workflow: 'demo',
+        runId: 'run-1',
+        stepName: 'wait',
+        params: const {},
+        previousResult: null,
+        stepIndex: 0,
+      );
+
+      expect(
+        () => waiting.waitForEventJson<_ResumePayload>(
+          topic: 'demo.event',
+          decode: _ResumePayload.fromJson,
+        ),
+        throwsA(isA()),
+      );
+      expect(waiting.takeControl()?.topic, 'demo.event');
+
+      final resumed = FlowContext(
+        workflow: 'demo',
+        runId: 'run-1',
+        stepName: 'wait',
+        params: const {},
+        previousResult: null,
+        stepIndex: 0,
+        resumeData: const {'message': 'approved'},
+      );
+
+      expect(
+        resumed.waitForEventJson<_ResumePayload>(
+          topic: 'demo.event',
+          decode: _ResumePayload.fromJson,
+        ),
+        completion(
+          isA<_ResumePayload>().having(
+            (value) => value.message,
+            'message',
+            'approved',
+          ),
+        ),
+      );
+    },
+  );
+
+  test(
+    'FlowContext.waitForEventVersionedJson uses named args and resumes '
+    'with DTO payload',
+    () {
+      final waiting = FlowContext(
+        workflow: 'demo',
+        runId: 'run-1',
+        stepName: 'wait',
+        params: const {},
+        previousResult: null,
+        stepIndex: 0,
+      );
+
+      expect(
+        () => waiting.waitForEventVersionedJson<_ResumePayload>(
+          topic: 'demo.event',
+          defaultVersion: 2,
+          decode: _ResumePayload.fromVersionedJson,
+        ),
+        throwsA(isA()),
+      );
+      expect(waiting.takeControl()?.topic, 'demo.event');
+
+      final resumed = FlowContext(
+        workflow: 'demo',
+        runId: 'run-1',
+        stepName: 'wait',
+        params: const {},
+        previousResult: null,
+        stepIndex: 0,
+        resumeData: const {
+          PayloadCodec.versionKey: 2,
+          'message': 'approved',
+        },
+      );
+
+      expect(
+        resumed.waitForEventVersionedJson<_ResumePayload>(
+          topic: 'demo.event',
+          defaultVersion: 2,
+          decode: _ResumePayload.fromVersionedJson,
+        ),
+        completion(
+          isA<_ResumePayload>().having(
+            (value) => value.message,
+            'message',
+            'approved',
+          ),
+        ),
+      );
+    },
+  );
+
+  test(
+    'WorkflowScriptStepContext JSON suspension helpers encode DTO payloads',
+    () async {
+      final context = _FakeWorkflowScriptStepContext();
+
+      await context.sleepJson(
+        const Duration(seconds: 2),
+        const _SuspensionPayload(stage: 'sleeping'),
+      );
+      await context.awaitEventJson(
+        'topic',
+        const _SuspensionPayload(stage: 'waiting'),
+        deadline: DateTime.parse('2025-01-01T00:00:00Z'),
+      );
+      await context.sleepVersionedJson(
+        const Duration(seconds: 3),
+        const _SuspensionPayload(stage: 'versioned-sleep'),
+        version: 2,
+      );
+      await context.awaitEventVersionedJson(
+        'topic.versioned',
+        const _SuspensionPayload(stage: 'versioned-wait'),
+        version: 2,
+        deadline: DateTime.parse('2025-01-01T00:00:01Z'),
+      );
+
+      expect(
+        context.sleepCalls,
+        equals([const Duration(seconds: 2), const Duration(seconds: 3)]),
+      );
+      expect(context.awaitedTopics, equals(['topic', 'topic.versioned']));
+      expect(context.awaitedData, {
+        PayloadCodec.versionKey: 2,
+        'stage': 'versioned-wait',
+      });
+      expect(context.awaitedDeadline, DateTime.parse('2025-01-01T00:00:01Z'));
+    },
+  );
+
+  test('WorkflowEventRef.awaitOn reuses the event topic for flows', () {
+    const event = WorkflowEventRef<_ResumePayload>(
+      topic: 'demo.event',
+      codec: _resumePayloadCodec,
+    );
+    final deadline = DateTime.parse('2026-01-01T00:00:00Z');
+    final context = FlowContext(
+      workflow: 'demo',
+      runId: 'run-1',
+      stepName: 'wait',
+      params: const {},
+      previousResult: null,
+      stepIndex: 0,
+    );
+
+    final control = event.awaitOn(
+      context,
+      deadline: deadline,
+      data: const {'source': 'flow'},
+    );
+
+    expect(control.type, FlowControlType.waitForEvent);
+    expect(control.topic, 'demo.event');
+    expect(control.deadline, deadline);
+    expect(control.data, containsPair('source', 'flow'));
+  });
+
+  test(
+    'WorkflowScriptStepContext helpers suspend once and decode resumed values',
+    () {
+      final sleeping = _FakeWorkflowScriptStepContext();
+
+      final firstSleep = sleeping.sleepUntilResumed(
+        const Duration(milliseconds: 10),
+      );
+
+      expect(firstSleep, isFalse);
+      expect(sleeping.sleepCalls, hasLength(1));
+
+      final resumedSleep = _FakeWorkflowScriptStepContext(resumeData: true);
+      expect(
+        resumedSleep.sleepUntilResumed(const Duration(milliseconds: 10)),
+        isTrue,
+      );
+      expect(resumedSleep.sleepCalls, isEmpty);
+
+      final waiting = _FakeWorkflowScriptStepContext();
+      final firstEvent = waiting.waitForEventValue<_ResumePayload>(
+        'demo.event',
+        codec: _resumePayloadCodec,
+      );
+      expect(firstEvent, isNull);
+      expect(waiting.awaitedTopics, ['demo.event']);
+
+      final resumedEvent = _FakeWorkflowScriptStepContext(
+        resumeData: const {'message': 'approved'},
+      );
+      final resumedValue = resumedEvent.waitForEventValue<_ResumePayload>(
+        'demo.event',
+        codec: _resumePayloadCodec,
+      );
+      expect(resumedValue, isNotNull);
+      expect(resumedValue!.message, 'approved');
+      expect(resumedEvent.awaitedTopics, isEmpty);
+    },
+  );
+
+  test(
+    'WorkflowScriptStepContext expression helpers use named args and '
+    'throw suspension signal',
+    () {
+      final sleeping = _FakeWorkflowScriptStepContext();
+      expect(
+        sleeping.sleepFor(duration: const Duration(milliseconds: 10)),
+        throwsA(isA()),
+      );
+      expect(sleeping.sleepCalls, [const Duration(milliseconds: 10)]);
+
+      final resumedSleep = _FakeWorkflowScriptStepContext(resumeData: true);
+      expect(
+        resumedSleep.sleepFor(duration: const Duration(milliseconds: 10)),
+        completes,
+      );
+      expect(resumedSleep.sleepCalls, isEmpty);
+
+      final waiting = _FakeWorkflowScriptStepContext();
+      expect(
+        waiting.waitForEvent<_ResumePayload>(
+          topic: 'demo.event',
+          codec: _resumePayloadCodec,
+        ),
+        throwsA(isA()),
+      );
+      expect(waiting.awaitedTopics, ['demo.event']);
+
+      final resumedEvent = _FakeWorkflowScriptStepContext(
+        resumeData: const {'message': 'approved'},
+      );
+      expect(
+        resumedEvent.waitForEvent<_ResumePayload>(
+          topic: 'demo.event',
+          codec: _resumePayloadCodec,
+        ),
+        completion(
+          isA<_ResumePayload>().having(
+            (value) => value.message,
+            'message',
+            'approved',
+          ),
+        ),
+      );
+    },
+  );
+
+  test('WorkflowEventRef.waitValue reuses topic and codec in scripts', () {
+    const event = WorkflowEventRef<_ResumePayload>(
+      topic: 'demo.event',
+      codec: _resumePayloadCodec,
+    );
+    final waiting = _FakeWorkflowScriptStepContext();
+    final firstEvent = event.waitValue(waiting);
+    expect(firstEvent, isNull);
+    expect(waiting.awaitedTopics, ['demo.event']);
+
+    final resumed = _FakeWorkflowScriptStepContext(
+      resumeData: const {'message': 'approved'},
+    );
+    final resumedValue = event.waitValue(resumed);
+    expect(resumedValue?.message, 'approved');
+  });
+
+  test(
+    'WorkflowEventRef.wait uses named args and resumes with payload in scripts',
+    () {
+      const event = WorkflowEventRef<_ResumePayload>(
+        topic: 'demo.event',
+        codec: _resumePayloadCodec,
+      );
+      final waiting = _FakeWorkflowScriptStepContext();
+
+      expect(
+        event.wait(waiting),
+        throwsA(isA()),
+      );
+      expect(waiting.awaitedTopics, ['demo.event']);
+
+      final resumed = _FakeWorkflowScriptStepContext(
+        resumeData: const {'message': 'approved'},
+      );
+      expect(
+        event.wait(resumed),
+        completion(
+          isA<_ResumePayload>().having(
+            (value) => value.message,
+            'message',
+            'approved',
+          ),
+        ),
+      );
+    },
+  );
+
+  test(
+    'WorkflowEventRef.waitValue delegates to both flow and script '
+    'contexts',
+    () {
+      const event = WorkflowEventRef<_ResumePayload>(
+        topic: 'demo.event',
+        codec: _resumePayloadCodec,
+      );
+
+      final flowWaiting = FlowContext(
+        workflow: 'demo',
+        runId: 'run-1',
+        stepName: 'wait',
+        params: const {},
+        previousResult: null,
+        stepIndex: 0,
+      );
+      expect(event.waitValue(flowWaiting), isNull);
+      expect(flowWaiting.takeControl()?.topic, 'demo.event');
+
+      final flowResumed = FlowContext(
+        workflow: 'demo',
+        runId: 'run-1',
+        stepName: 'wait',
+        params: const {},
+        previousResult: null,
+        stepIndex: 0,
+        resumeData: const {'message': 'approved'},
+      );
+      expect(event.waitValue(flowResumed)?.message, 'approved');
+
+      final scriptWaiting = _FakeWorkflowScriptStepContext();
+      expect(event.waitValue(scriptWaiting), isNull);
+      expect(scriptWaiting.awaitedTopics, ['demo.event']);
+    },
+  );
+
+  test(
+    'WorkflowEventRef.wait delegates to both flow and script contexts',
+    () {
+      const event = WorkflowEventRef<_ResumePayload>(
+        topic: 'demo.event',
+        codec: _resumePayloadCodec,
+      );
+
+      final flowWaiting = FlowContext(
+        workflow: 'demo',
+        runId: 'run-1',
+        stepName: 'wait',
+        params: const {},
+        previousResult: null,
+        stepIndex: 0,
+      );
+      expect(
+        () => event.wait(flowWaiting),
+        throwsA(isA()),
+      );
+      expect(flowWaiting.takeControl()?.topic, 'demo.event');
+
+      final scriptResumed = _FakeWorkflowScriptStepContext(
+        resumeData: const {'message': 'approved'},
+      );
+      expect(
+        event.wait(scriptResumed),
+        completion(
+          isA<_ResumePayload>().having(
+            (value) => value.message,
+            'message',
+            'approved',
+          ),
+        ),
+      );
+    },
+  );
+
+  test('flow and script step contexts share the execution-context surface', () {
+    final flowContext = FlowContext(
+      workflow: 'demo',
+      runId: 'run-1',
+      stepName: 'wait',
+      params: const {},
+      previousResult: null,
+      stepIndex: 0,
+    );
+    final scriptContext = _FakeWorkflowScriptStepContext();
+
+    expect(flowContext, isA());
+    expect(scriptContext, isA());
+    expect(flowContext, isA());
+    expect(scriptContext, isA());
+  });
+
+  test(
+    'WorkflowEventRef.wait registers script-step waits before suspension',
+    () async {
+      const event = WorkflowEventRef<_ResumePayload>(
+        topic: 'demo.event',
+        codec: _resumePayloadCodec,
+      );
+      final deadline = DateTime.parse('2026-01-01T00:00:00Z');
+      final context = _FakeWorkflowScriptStepContext();
+
+      await expectLater(
+        () => event.wait(
+          context,
+          deadline: deadline,
+          data: const {'source': 'script'},
+        ),
+        throwsA(isA()),
+      );
+
+      expect(context.awaitedTopics, ['demo.event']);
+      expect(context.awaitedDeadline, deadline);
+      expect(context.awaitedData, containsPair('source', 'script'));
+    },
+  );
+
+  test(
+    'WorkflowScriptStepContext.enqueue delegates to the configured enqueuer',
+    () async {
+      final enqueuer = _RecordingTaskEnqueuer();
+      final context = _FakeWorkflowScriptStepContext(enqueuer: enqueuer);
+
+      final taskId = await context.enqueue(
+        'tasks.child',
+        args: const {'value': 42},
+        meta: const {'source': 'script'},
+      );
+
+      expect(taskId, equals('recorded-1'));
+      expect(enqueuer.lastName, equals('tasks.child'));
+      expect(enqueuer.lastArgs, equals({'value': 42}));
+      expect(enqueuer.lastMeta, containsPair('source', 'script'));
+    },
+  );
+
+  test(
+    'WorkflowScriptStepContext.enqueue throws when no enqueuer is configured',
+    () {
+      final context = _FakeWorkflowScriptStepContext();
+
+      expect(() => context.enqueue('tasks.child'), throwsStateError);
+    },
+  );
 }
 
 class _ResumePayload {
   const _ResumePayload({required this.message});
 
   factory _ResumePayload.fromJson(Map json) {
-    return _ResumePayload(message: json['message'] as String);
+    return _ResumePayload(message: json['message']! as String);
+  }
+
+  factory _ResumePayload.fromVersionedJson(
+    Map json,
+    int version,
+  ) {
+    expect(version, 2);
+    return _ResumePayload(message: json['message']! as String);
   }
 
   final String message;
@@ -62,28 +1126,49 @@ const _resumePayloadCodec = PayloadCodec<_ResumePayload>(
 
 Object? _encodeResumePayload(_ResumePayload value) => value.toJson();
 
 _ResumePayload _decodeResumePayload(Object? payload) {
+  final map = payload! as Map;
   return _ResumePayload.fromJson(
-    Map.from(payload! as Map),
+    Map.from(map),
   );
 }
 
 class _FakeWorkflowScriptStepContext implements WorkflowScriptStepContext {
-  _FakeWorkflowScriptStepContext({Object? resumeData})
-    : _resumeData = resumeData;
+  _FakeWorkflowScriptStepContext({
+    Object? resumeData,
+    Object? previousResult,
+    Map params = const {},
+    TaskEnqueuer? enqueuer,
+    WorkflowCaller? workflows,
+  }) : _resumeData = resumeData,
+       _previousResult = previousResult,
+       _params = params,
+       _enqueuer = enqueuer,
+       _workflows = workflows;
 
   Object? _resumeData;
+  final Object? _previousResult;
+  final Map _params;
+  final TaskEnqueuer? _enqueuer;
+  final WorkflowCaller? _workflows;
+  final List awaitedTopics = [];
+  DateTime? awaitedDeadline;
+  Map? awaitedData;
+  final List sleepCalls = [];
+
+  @override
+  TaskEnqueuer? get enqueuer => _enqueuer;
 
   @override
-  TaskEnqueuer? get enqueuer => null;
+  WorkflowCaller? get workflows => _workflows;
 
   @override
   int get iteration => 0;
 
   @override
-  Map get params => const {};
+  Map get params => _params;
 
   @override
-  Object? get previousResult => null;
+  Object? get previousResult => _previousResult;
 
   @override
   String get runId => 'run-1';
@@ -102,14 +1187,37 @@ class _FakeWorkflowScriptStepContext implements WorkflowScriptStepContext {
     String topic, {
     DateTime? deadline,
     Map? data,
-  }) async {}
+  }) async {
+    awaitedTopics.add(topic);
+    awaitedDeadline = deadline;
+    awaitedData = data == null ? null : Map.from(data);
+  }
+
+  @override
+  Future suspendFor(
+    Duration duration, {
+    Map? data,
+  }) {
+    return sleep(duration, data: data);
+  }
+
+  @override
+  Future waitForTopic(
+    String topic, {
+    DateTime? deadline,
+    Map? data,
+  }) {
+    return awaitEvent(topic, deadline: deadline, data: data);
+  }
 
   @override
   String idempotencyKey([String? scope]) =>
       'demo.workflow/run-1/${scope ?? stepName}';
 
   @override
-  Future sleep(Duration duration, {Map? data}) async {}
+  Future sleep(Duration duration, {Map? data}) async {
+    sleepCalls.add(duration);
+  }
 
   @override
   Object? takeResumeData() {
@@ -117,4 +1225,234 @@
     _resumeData = null;
     return value;
   }
+
+  @override
+  Future enqueue(
+    String name, {
+    Map args = const {},
+    Map headers = const {},
+    Map meta = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    TaskEnqueueOptions? enqueueOptions,
+  }) async {
+    final delegate = _enqueuer;
+    if (delegate == null) {
+      throw StateError('WorkflowScriptStepContext has no enqueuer configured');
+    }
+    return delegate.enqueue(
+      name,
+      args: args,
+      headers: headers,
+      meta: meta,
+      options: options,
+      notBefore: notBefore,
+      enqueueOptions: enqueueOptions,
+    );
+  }
+
+  @override
+  Future enqueueCall(
+    TaskCall call, {
+    TaskEnqueueOptions? enqueueOptions,
+  }) async {
+    final delegate = _enqueuer;
+    if (delegate == null) {
+      throw StateError('WorkflowScriptStepContext has no enqueuer configured');
+    }
+    return delegate.enqueueCall(call, enqueueOptions: enqueueOptions);
+  }
+
+  @override
+  Future enqueueValue(
+    String name,
+    T value, {
+    PayloadCodec? codec,
+    Map headers = const {},
+    Map meta = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    TaskEnqueueOptions? enqueueOptions,
+  }) async {
+    final delegate = _enqueuer;
+    if (delegate == null) {
+      throw StateError('WorkflowScriptStepContext has no enqueuer configured');
+    }
+    return delegate.enqueueValue(
+      name,
+      value,
+      codec: codec,
+      headers: headers,
+      meta: meta,
+      options: options,
+      notBefore: notBefore,
+      enqueueOptions: enqueueOptions,
+    );
+  }
+
+  @override
+  Future startWorkflowRef(
+    WorkflowRef definition,
+    TParams params, {
+    String? parentRunId,
+    Duration? ttl,
+    WorkflowCancellationPolicy? cancellationPolicy,
+  }) async {
+    final caller = _workflows;
+    if (caller == null) {
+      throw StateError(
+        'WorkflowScriptStepContext has no workflow caller configured',
+      );
+    }
+    return caller.startWorkflowRef(
+      definition,
+      params,
+      parentRunId: parentRunId,
+      ttl: ttl,
+      cancellationPolicy: cancellationPolicy,
+    );
+  }
+
+  @override
+  Future startWorkflowCall(
+    WorkflowStartCall call,
+  ) async {
+    final caller = _workflows;
+    if (caller == null) {
+      throw StateError(
+        'WorkflowScriptStepContext has no workflow caller configured',
+      );
+    }
+    return caller.startWorkflowCall(call);
+  }
+
+  @override
+  Future?>
+  waitForWorkflowRef(
+    String runId,
+    WorkflowRef definition, {
+    Duration pollInterval = const Duration(milliseconds: 100),
+    Duration? timeout,
+  }) async {
+    final caller = _workflows;
+    if (caller == null) {
+      throw StateError(
+        'WorkflowScriptStepContext has no workflow caller configured',
+      );
+    }
+    return caller.waitForWorkflowRef(
+      runId,
+      definition,
+      pollInterval: pollInterval,
+      timeout: timeout,
+    );
+  }
+}
+
+class _FakeWorkflowScriptContext implements WorkflowScriptContext {
+  _FakeWorkflowScriptContext({required this.params});
+
+  @override
+  final Map params;
+
+  @override
+  String get runId => 'run-1';
+
+  @override
+  String get workflow => 'demo.workflow';
+
+  @override
+  Future step(
+    String name,
+    FutureOr Function(WorkflowScriptStepContext context) handler, {
+    bool autoVersion = false,
+  }) {
+    return Future.error(UnimplementedError());
+  }
+}
+
+class _SuspensionPayload {
+  const _SuspensionPayload({required this.stage});
+
+  final String stage;
+
+  Map toJson() => {'stage': stage};
+}
+
+class _RecordingTaskEnqueuer implements TaskEnqueuer {
+  String? lastName;
+  Map? lastArgs;
+  Map? lastMeta;
+
+  @override
+  Future enqueue(
+    String name, {
+    Map args = const {},
+    Map headers = const {},
+    Map meta = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    TaskEnqueueOptions? enqueueOptions,
+  }) async {
+    lastName = name;
+    lastArgs = Map.from(args);
+    lastMeta = Map.from(meta);
+    return 'recorded-1';
+  }
+
+  @override
+  Future enqueueCall(
+    TaskCall call, {
+    TaskEnqueueOptions? enqueueOptions,
+  }) {
+    return enqueue(
+      call.name,
+      args: call.encodeArgs(),
+      headers: call.headers,
+      meta: call.meta,
+      options: call.resolveOptions(),
+      notBefore: call.notBefore,
+      enqueueOptions: enqueueOptions ?? call.enqueueOptions,
+    );
+  }
+
+  @override
+  Future enqueueValue(
+    String name,
+    T value, {
+    PayloadCodec? codec,
+    Map headers = const {},
+    Map meta = const {},
+    TaskOptions options = const TaskOptions(),
+    DateTime? notBefore,
+    TaskEnqueueOptions? enqueueOptions,
+  }) {
+    return enqueue(
+      name,
+      args: _encodeWorkflowTaskArgs(name, value, codec: codec),
+      headers: headers,
+      meta: meta,
+      options: options,
+      notBefore: notBefore,
+      enqueueOptions: enqueueOptions,
+    );
+  }
+}
+
+Map _encodeWorkflowTaskArgs(
+  String name,
+  T value, {
+  PayloadCodec? codec,
+}) {
+  final payload = codec == null ? value : codec.encode(value);
+  if (payload is Map) {
+    return Map.from(payload);
+  }
+  if (payload is Map) {
+    return payload.map((key, value) => MapEntry(key.toString(), value));
+  }
+  throw StateError(
+    'Task payload for $name must encode to Map, got '
+    '${payload.runtimeType}.',
+  );
 }
diff --git a/packages/stem/test/workflow/workflow_runtime_call_extensions_test.dart b/packages/stem/test/workflow/workflow_runtime_call_extensions_test.dart
new file mode 100644
index 00000000..c818a9bc
--- /dev/null
+++ b/packages/stem/test/workflow/workflow_runtime_call_extensions_test.dart
@@ -0,0 +1,149 @@
+import 'package:stem/stem.dart';
+import 'package:test/test.dart';
+
+void main() {
+  group('runtime workflow start call dispatch', () {
+    test(
+      'buildStart() can be dispatched through WorkflowCaller',
+      () async {
+        final flow = Flow(
+          name: 'runtime.extension.flow',
+          build: (builder) {
+            builder.step('hello', (ctx) async {
+              final name = ctx.params['name'] as String? ?? 'world';
+              return 'hello $name';
+            });
+          },
+        );
+        final workflowRef = WorkflowRef, String>(
+          name: 'runtime.extension.flow',
+          encodeParams: (params) => params,
+        );
+
+        final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+        try {
+          await workflowApp.start();
+
+          final runId = await workflowApp.runtime.startWorkflowCall(
+            workflowRef.buildStart(params: const {'name': 'runtime'}),
+          );
+          final waited = await workflowRef.waitFor(
+            workflowApp.runtime,
+            runId,
+            timeout: const Duration(seconds: 2),
+          );
+
+          expect(waited?.value, 'hello runtime');
+
+          final inlineCall = workflowRef.buildStart(
+            params: const {'name': 'inline'},
+          );
+          final inlineRunId = await workflowApp.runtime.startWorkflowCall(
+            inlineCall,
+          );
+          final oneShot = await workflowRef.waitFor(
+            workflowApp.runtime,
+            inlineRunId,
+            timeout: const Duration(seconds: 2),
+          );
+
+          expect(oneShot?.value, 'hello inline');
+        } finally {
+          await workflowApp.shutdown();
+        }
+      },
+    );
+
+    test(
+      'WorkflowRef direct helpers mirror WorkflowCaller startWorkflowCall',
+      () async {
+        final flow = Flow(
+          name: 'runtime.extension.direct.flow',
+          build: (builder) {
+            builder.step('hello', (ctx) async {
+              final name = ctx.params['name'] as String? ?? 'world';
+              return 'hello $name';
+            });
+          },
+        );
+        final workflowRef = WorkflowRef, String>(
+          name: 'runtime.extension.direct.flow',
+          encodeParams: (params) => params,
+        );
+
+        final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+        try {
+          await workflowApp.start();
+
+          final runId = await workflowRef.start(
+            workflowApp.runtime,
+            params: const {'name': 'runtime'},
+          );
+          final waited = await workflowRef.waitFor(
+            workflowApp.runtime,
+            runId,
+            timeout: const Duration(seconds: 2),
+          );
+
+          expect(waited?.value, 'hello runtime');
+
+          final oneShot = await workflowRef.startAndWait(
+            workflowApp.runtime,
+            params: const {'name': 'inline'},
+            timeout: const Duration(seconds: 2),
+          );
+
+          expect(oneShot?.value, 'hello inline');
+        } finally {
+          await workflowApp.shutdown();
+        }
+      },
+    );
+
+    test(
+      'named workflow start aliases mirror the direct workflow helpers',
+      () async {
+        final flow = Flow(
+          name: 'runtime.extension.named.flow',
+          build: (builder) {
+            builder.step('hello', (ctx) async {
+              final name = ctx.params['name'] as String? ?? 'world';
+              return 'hello $name';
+            });
+          },
+        );
+        final workflowRef = WorkflowRef, String>(
+          name: 'runtime.extension.named.flow',
+          encodeParams: (params) => params,
+        );
+
+        final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+        try {
+          await workflowApp.start();
+
+          final runId = await workflowRef.start(
+            workflowApp.runtime,
+            params: const {'name': 'runtime'},
+          );
+          final waited = await workflowRef.waitFor(
+            workflowApp.runtime,
+            runId,
+            timeout: const Duration(seconds: 2),
+          );
+
+          expect(waited?.value, 'hello runtime');
+
+          final oneShot = await workflowRef.startAndWait(
+            workflowApp.runtime,
+            params: const {'name': 'inline'},
+            timeout: const Duration(seconds: 2),
+          );
+
+          expect(oneShot?.value, 'hello inline');
+        } finally {
+          await workflowApp.shutdown();
+        }
+      },
+    );
+  });
+}
diff --git a/packages/stem/test/workflow/workflow_runtime_ref_test.dart b/packages/stem/test/workflow/workflow_runtime_ref_test.dart
new file mode 100644
index 00000000..e6ecbe7f
--- /dev/null
+++ b/packages/stem/test/workflow/workflow_runtime_ref_test.dart
@@ -0,0 +1,995 @@
+import 'package:stem/stem.dart';
+import 'package:test/test.dart';
+
+class _GreetingParams {
+  const _GreetingParams({required this.name});
+
+  factory _GreetingParams.fromJson(Map json) {
+    return _GreetingParams(name: json['name']! as String);
+  }
+
+  final String name;
+
+  Map toJson() => {'name': name};
+}
+
+class _GreetingResult {
+  const _GreetingResult({required this.message});
+
+  factory _GreetingResult.fromJson(Map json) {
+    return _GreetingResult(message: json['message']! as String);
+  }
+
+  factory _GreetingResult.fromVersionedJson(
+    Map json,
+    int version,
+  ) {
+    return _GreetingResult(
+      message: '${json['message']! as String} v$version',
+    );
+  }
+
+  factory _GreetingResult.fromV2Json(Map json) {
+    return _GreetingResult(
+      message: '${json['message']! as String} v2',
+    );
+  }
+
+  factory _GreetingResult.fromVersionedMap(
+    Map json,
+    int version,
+  ) {
+    return _GreetingResult(
+      message: '${json['legacy_message']! as String} v$version',
+    );
+  }
+
+  final String message;
+
+  Map toJson() => {'message': message};
+}
+
+const _greetingParamsCodec = PayloadCodec<_GreetingParams>.json(
+  decode: _GreetingParams.fromJson,
+  typeName: '_GreetingParams',
+);
+
+const _greetingResultCodec = PayloadCodec<_GreetingResult>.json(
+  decode: _GreetingResult.fromJson,
+  typeName: '_GreetingResult',
+);
+
+const _greetingResultRegistry = PayloadVersionRegistry<_GreetingResult>(
+  decoders: )>{
+    1: _GreetingResult.fromJson,
+    2: _GreetingResult.fromV2Json,
+  },
+  defaultVersion: 1,
+);
+
+class _LegacyGreetingParams {
+  const _LegacyGreetingParams({required this.name});
+
+  factory _LegacyGreetingParams.fromVersionedMap(
+    Map json,
+    int version,
+  ) {
+    return _LegacyGreetingParams(
+      name: '${json['display_name']! as String} v$version',
+    );
+  }
+
+  final String name;
+}
+
+final _userUpdatedEvent = WorkflowEventRef<_GreetingParams>.json(
+  topic: 'runtime.ref.event',
+  decode: _GreetingParams.fromJson,
+  typeName: '_GreetingParams',
+);
+
+void main() {
+  group('runtime workflow refs', () {
+    test('start and wait helpers work directly with WorkflowRuntime', () async {
+      final flow = Flow(
+        name: 'runtime.ref.flow',
+        build: (builder) {
+          builder.step('hello', (ctx) async {
+            final name = ctx.params['name'] as String? ?? 'world';
+            return 'hello $name';
+          });
+        },
+      );
+      final workflowRef = flow.ref>(
+        encodeParams: (params) => params,
+      );
+
+      final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+      try {
+        await workflowApp.start();
+
+        final runId = await workflowRef.start(
+          workflowApp.runtime,
+          params: const {'name': 'runtime'},
+        );
+        final waited = await workflowApp.runtime.waitForWorkflowRef(
+          runId,
+          workflowRef,
+          timeout: const Duration(seconds: 2),
+        );
+
+        expect(waited?.value, 'hello runtime');
+
+        final inlineRunId = await workflowApp.runtime.startWorkflowRef(
+          workflowRef,
+          const {'name': 'inline'},
+        );
+        final oneShot = await workflowApp.runtime.waitForCompletion(
+          inlineRunId,
+          timeout: const Duration(seconds: 2),
+          decode: workflowRef.decode,
+        );
+
+        expect(oneShot?.value, 'hello inline');
+      } finally {
+        await workflowApp.shutdown();
+      }
+    });
+
+    test('manual workflow scripts can derive typed refs', () async {
+      final script = WorkflowScript(
+        name: 'runtime.ref.script',
+        run: (context) async {
+          final name = context.params['name'] as String? ?? 'world';
+          return 'script $name';
+        },
+      );
+      final workflowRef = script.ref>(
+        encodeParams: (params) => params,
+      );
+
+      final workflowApp = await StemWorkflowApp.inMemory(scripts: [script]);
+      try {
+        await workflowApp.start();
+
+        final runId = await workflowRef.start(
+          workflowApp.runtime,
+          params: const {'name': 'runtime'},
+        );
+        final waited = await workflowRef.waitFor(
+          workflowApp.runtime,
+          runId,
+          timeout: const Duration(seconds: 2),
+        );
+
+        expect(waited?.value, 'script runtime');
+      } finally {
+        await workflowApp.shutdown();
+      }
+    });
+
+    test('manual workflows can derive codec-backed refs', () async {
+      final flow = Flow(
+        name: 'runtime.ref.codec.flow',
+        build: (builder) {
+          builder.step('hello', (ctx) async {
+            final name = ctx.params['name'] as String? ?? 'world';
+            return 'hello $name';
+          });
+        },
+      );
+      final workflowRef = flow.refCodec<_GreetingParams>(
+        paramsCodec: _greetingParamsCodec,
+      );
+
+      final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+      try {
+        await workflowApp.start();
+
+        final result = await workflowRef.startAndWait(
+          workflowApp.runtime,
+          params: const _GreetingParams(name: 'codec'),
+          timeout: const Duration(seconds: 2),
+        );
+
+        expect(result?.value, 'hello codec');
+      } finally {
+        await workflowApp.shutdown();
+      }
+    });
+
+    test('manual workflows can derive json-backed refs', () async {
+      final flow = Flow(
+        name: 'runtime.ref.json.flow',
+        build: (builder) {
+          builder.step('hello', (ctx) async {
+            final name = ctx.params['name'] as String? ?? 'world';
+            return 'hello $name';
+          });
+        },
+      );
+      final workflowRef = flow.refJson<_GreetingParams>();
+
+      final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+      try {
+        await workflowApp.start();
+
+        final result = await workflowRef.startAndWait(
+          workflowApp.runtime,
+          params: const _GreetingParams(name: 'json'),
+          timeout: const Duration(seconds: 2),
+        );
+
+        expect(result?.value, 'hello json');
+      } finally {
+        await workflowApp.shutdown();
+      }
+    });
+
+    test('manual workflows can derive versioned-json refs', () async {
+      final flow = Flow(
+        name: 'runtime.ref.versioned-json.flow',
+        build: (builder) {
+          builder.step('hello', (ctx) async {
+            final name = ctx.requiredParam('name');
+            final version = ctx.requiredParam(PayloadCodec.versionKey);
+            return 'hello $name v$version';
+          });
+        },
+      );
+      final workflowRef = flow.refVersionedJson<_GreetingParams>(version: 2);
+
+      final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+      try {
+        await workflowApp.start();
+
+        final result = await workflowRef.startAndWait(
+          workflowApp.runtime,
+          params: const _GreetingParams(name: 'json'),
+          timeout: const Duration(seconds: 2),
+        );
+
+        expect(result?.value, 'hello json v2');
+      } finally {
+        await workflowApp.shutdown();
+      }
+    });
+
+    test('manual workflows can derive versioned-map refs', () async {
+      final flow = Flow(
+        name: 'runtime.ref.versioned-map.flow',
+        build: (builder) {
+          builder.step('hello', (ctx) async {
+            final params = ctx.paramsVersionedJson<_LegacyGreetingParams>(
+              decode: _LegacyGreetingParams.fromVersionedMap,
+            );
+            return 'hello ${params.name}';
+          });
+        },
+      );
+      final workflowRef = flow.refVersionedMap<_LegacyGreetingParams>(
+        version: 3,
+        encodeParams: (params) => {'display_name': params.name},
+      );
+
+      final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+      try {
+        await workflowApp.start();
+
+        final result = await workflowRef.startAndWait(
+          workflowApp.runtime,
+          params: const _LegacyGreetingParams(name: 'map'),
+          timeout: const Duration(seconds: 2),
+        );
+
+        expect(result?.value, 'hello map v3');
+      } finally {
+        await workflowApp.shutdown();
+      }
+    });
+
+    test(
+      'manual workflows can derive json-backed refs with result decoding',
+      () async {
+        final flow = Flow(
+          name: 'runtime.ref.json.ref-result.flow',
+          build: (builder) {
+            builder.step(
+              'hello',
+              (ctx) async => const {'message': 'hello ref json'},
+            );
+          },
+        );
+        final workflowRef = flow.refJson<_GreetingParams>(
+          decodeResultJson: _GreetingResult.fromJson,
+        );
+
+        final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+        try {
+          await workflowApp.start();
+
+          final result = await workflowRef.startAndWait(
+            workflowApp.runtime,
+            params: const _GreetingParams(name: 'ignored'),
+            timeout: const Duration(seconds: 2),
+          );
+
+          expect(
+            (result?.value as _GreetingResult?)?.message,
+            'hello ref json',
+          );
+        } finally {
+          await workflowApp.shutdown();
+        }
+      },
+    );
+
+    test(
+      'manual workflows can derive json-backed refs with versioned result'
+      ' decoding',
+      () async {
+        final flow = Flow(
+          name: 'runtime.ref.json.versioned-result.flow',
+          build: (builder) {
+            builder.step(
+              'hello',
+              (ctx) async => const {
+                'message': 'hello ref json versioned',
+                PayloadCodec.versionKey: 2,
+              },
+            );
+          },
+        );
+        final workflowRef = flow.refJson<_GreetingParams>(
+          decodeResultVersionedJson: _GreetingResult.fromVersionedJson,
+          defaultDecodeVersion: 2,
+        );
+
+        final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+        try {
+          await workflowApp.start();
+
+          final result = await workflowRef.startAndWait(
+            workflowApp.runtime,
+            params: const _GreetingParams(name: 'ignored'),
+            timeout: const Duration(seconds: 2),
+          );
+
+          expect(
+            (result?.value as _GreetingResult?)?.message,
+            'hello ref json versioned v2',
+          );
+        } finally {
+          await workflowApp.shutdown();
+        }
+      },
+    );
+
+    test('codec-backed refs preserve workflow result decoding', () async {
+      final flow = Flow<_GreetingResult>.codec(
+        name: 'runtime.ref.codec.result.flow',
+        resultCodec: _greetingResultCodec,
+        build: (builder) {
+          builder.step('hello', (ctx) async {
+            final name = ctx.params['name'] as String? ?? 'world';
+            return _GreetingResult(message: 'hello $name');
+          });
+        },
+      );
+      final workflowRef = flow.refCodec<_GreetingParams>(
+        paramsCodec: _greetingParamsCodec,
+      );
+
+      final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+      try {
+        await workflowApp.start();
+
+        final result = await workflowRef.startAndWait(
+          workflowApp.runtime,
+          params: const _GreetingParams(name: 'codec'),
+          timeout: const Duration(seconds: 2),
+        );
+
+        expect(result?.value?.message, 'hello codec');
+      } finally {
+        await workflowApp.shutdown();
+      }
+    });
+
+    test(
+      'raw workflow definitions expose direct codec result helpers',
+      () async {
+        final flow = WorkflowDefinition<_GreetingResult>.flowCodec(
+          name: 'runtime.ref.definition.codec.result.flow',
+          resultCodec: _greetingResultCodec,
+          build: (builder) {
+            builder.step(
+              'hello',
+              (ctx) async => const _GreetingResult(
+                message: 'hello definition flow codec',
+              ),
+            );
+          },
+        );
+        final script = WorkflowDefinition<_GreetingResult>.scriptCodec(
+          name: 'runtime.ref.definition.codec.result.script',
+          resultCodec: _greetingResultCodec,
+          run: (context) async =>
+              const _GreetingResult(message: 'hello definition script codec'),
+        );
+
+        final workflowApp = await StemWorkflowApp.inMemory();
+        try {
+          workflowApp.registerWorkflows([flow, script]);
+          await workflowApp.start();
+
+          final flowResult = await flow.ref0().startAndWait(
+            workflowApp.runtime,
+            timeout: const Duration(seconds: 2),
+          );
+          final scriptResult = await script.ref0().startAndWait(
+            workflowApp.runtime,
+            timeout: const Duration(seconds: 2),
+          );
+
+          expect(flowResult?.value?.message, 'hello definition flow codec');
+          expect(
+            scriptResult?.value?.message,
+            'hello definition script codec',
+          );
+        } finally {
+          await workflowApp.shutdown();
+        }
+      },
+    );
+
+    test('manual workflows can derive json-backed result decoding', () async {
+      final flow = Flow<_GreetingResult>.json(
+        name: 'runtime.ref.json.result.flow',
+        decodeResult: _GreetingResult.fromJson,
+        build: (builder) {
+          builder.step(
+            'hello',
+            (ctx) async => const _GreetingResult(message: 'hello flow json'),
+          );
+        },
+      );
+      final script = WorkflowScript<_GreetingResult>.json(
+        name: 'runtime.ref.json.result.script',
+        decodeResult: _GreetingResult.fromJson,
+        run: (context) async =>
+            const _GreetingResult(message: 'hello script json'),
+      );
+
+      final workflowApp = await StemWorkflowApp.inMemory(
+        flows: [flow],
+        scripts: [script],
+      );
+      try {
+        await workflowApp.start();
+
+        final flowResult = await flow.startAndWait(
+          workflowApp.runtime,
+          timeout: const Duration(seconds: 2),
+        );
+        final scriptResult = await script.startAndWait(
+          workflowApp.runtime,
+          timeout: const Duration(seconds: 2),
+        );
+
+        expect(flowResult?.value?.message, 'hello flow json');
+        expect(scriptResult?.value?.message, 'hello script json');
+      } finally {
+        await workflowApp.shutdown();
+      }
+    });
+
+    test(
+      'raw workflow definitions expose direct json result helpers',
+      () async {
+        final flow = WorkflowDefinition<_GreetingResult>.flowJson(
+          name: 'runtime.ref.definition.json.result.flow',
+          decodeResult: _GreetingResult.fromJson,
+          build: (builder) {
+            builder.step(
+              'hello',
+              (ctx) async =>
+                  const _GreetingResult(message: 'hello definition flow json'),
+            );
+          },
+        );
+        final script = WorkflowDefinition<_GreetingResult>.scriptJson(
+          name: 'runtime.ref.definition.json.result.script',
+          decodeResult: _GreetingResult.fromJson,
+          run: (context) async =>
+              const _GreetingResult(message: 'hello definition script json'),
+        );
+
+        final workflowApp = await StemWorkflowApp.inMemory();
+        try {
+          workflowApp.registerWorkflows([flow, script]);
+          await workflowApp.start();
+
+          final flowResult = await flow.ref0().startAndWait(
+            workflowApp.runtime,
+            timeout: const Duration(seconds: 2),
+          );
+          final scriptResult = await script.ref0().startAndWait(
+            workflowApp.runtime,
+            timeout: const Duration(seconds: 2),
+          );
+
+          expect(flowResult?.value?.message, 'hello definition flow json');
+          expect(scriptResult?.value?.message, 'hello definition script json');
+        } finally {
+          await workflowApp.shutdown();
+        }
+      },
+    );
+
+    test(
+      'raw workflow definitions expose direct versioned json result helpers',
+      () async {
+        final flow = WorkflowDefinition<_GreetingResult>.flowVersionedJson(
+          name: 'runtime.ref.definition.versioned.result.flow',
+          version: 2,
+          decodeResult: _GreetingResult.fromVersionedJson,
+          build: (builder) {
+            builder.step(
+              'hello',
+              (ctx) async => const _GreetingResult(message: 'hello flow'),
+            );
+          },
+        );
+        final script = WorkflowDefinition<_GreetingResult>.scriptVersionedJson(
+          name: 'runtime.ref.definition.versioned.result.script',
+          version: 2,
+          decodeResult: _GreetingResult.fromVersionedJson,
+          run: (context) async =>
+              const _GreetingResult(message: 'hello script'),
+        );
+
+        final workflowApp = await StemWorkflowApp.inMemory();
+        try {
+          workflowApp.registerWorkflows([flow, script]);
+          await workflowApp.start();
+
+          final flowResult = await flow.ref0().startAndWait(
+            workflowApp.runtime,
+            timeout: const Duration(seconds: 2),
+          );
+          final scriptResult = await script.ref0().startAndWait(
+            workflowApp.runtime,
+            timeout: const Duration(seconds: 2),
+          );
+
+          expect(flowResult?.value?.message, 'hello flow v2');
+          expect(scriptResult?.value?.message, 'hello script v2');
+        } finally {
+          await workflowApp.shutdown();
+        }
+      },
+    );
+
+    test(
+      'raw workflow definitions expose direct versioned map result helpers',
+      () async {
+        final flow = WorkflowDefinition<_GreetingResult>.flowVersionedMap(
+          name: 'runtime.ref.definition.versioned.map.result.flow',
+          version: 3,
+          encodeResult: (value) => {'legacy_message': value.message},
+          decodeResult: _GreetingResult.fromVersionedMap,
+          build: (builder) {
+            builder.step(
+              'hello',
+              (ctx) async => const _GreetingResult(message: 'hello flow'),
+            );
+          },
+        );
+        final script = WorkflowDefinition<_GreetingResult>.scriptVersionedMap(
+          name: 'runtime.ref.definition.versioned.map.result.script',
+          version: 3,
+          encodeResult: (value) => {'legacy_message': value.message},
+          decodeResult: _GreetingResult.fromVersionedMap,
+          run: (context) async =>
+              const _GreetingResult(message: 'hello script'),
+        );
+
+        final workflowApp = await StemWorkflowApp.inMemory();
+        try {
+          workflowApp.registerWorkflows([flow, script]);
+          await workflowApp.start();
+
+          final flowResult = await flow.ref0().startAndWait(
+            workflowApp.runtime,
+            timeout: const Duration(seconds: 2),
+          );
+          final scriptResult = await script.ref0().startAndWait(
+            workflowApp.runtime,
+            timeout: const Duration(seconds: 2),
+          );
+
+          expect(flowResult?.value?.message, 'hello flow v3');
+          expect(scriptResult?.value?.message, 'hello script v3');
+        } finally {
+          await workflowApp.shutdown();
+        }
+      },
+    );
+
+    test(
+      'manual workflows can derive versioned-json refs with result decoding',
+      () async {
+        final flow = Flow(
+          name: 'runtime.ref.versioned-json.ref-result.flow',
+          build: (builder) {
+            builder.step(
+              'hello',
(ctx) async => const { + 'message': 'hello ref result', + PayloadCodec.versionKey: 2, + }, + ); + }, + ); + final workflowRef = flow.refVersionedJson<_GreetingParams>( + version: 2, + decodeResultVersionedJson: _GreetingResult.fromVersionedJson, + ); + + final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]); + try { + await workflowApp.start(); + + final result = await workflowRef.startAndWait( + workflowApp.runtime, + params: const _GreetingParams(name: 'ignored'), + timeout: const Duration(seconds: 2), + ); + + expect( + (result?.value as _GreetingResult?)?.message, + 'hello ref result v2', + ); + } finally { + await workflowApp.shutdown(); + } + }, + ); + + test( + 'manual workflows can derive registry-backed versioned-json refs', + () async { + final flow = Flow( + name: 'runtime.ref.registry.ref-result.flow', + build: (builder) { + builder.step( + 'hello', + (ctx) async => const { + 'message': 'hello ref registry', + PayloadCodec.versionKey: 2, + }, + ); + }, + ); + final workflowRef = flow.refVersionedJsonRegistry<_GreetingParams>( + version: 2, + resultRegistry: _greetingResultRegistry, + ); + + final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]); + try { + await workflowApp.start(); + + final result = await workflowRef.startAndWait( + workflowApp.runtime, + params: const _GreetingParams(name: 'ignored'), + timeout: const Duration(seconds: 2), + ); + + expect( + (result?.value as _GreetingResult?)?.message, + 'hello ref registry v2', + ); + } finally { + await workflowApp.shutdown(); + } + }, + ); + + test('manual workflows expose direct no-args helpers', () async { + final flow = Flow( + name: 'runtime.ref.no-args.flow', + build: (builder) { + builder.step('hello', (ctx) async => 'hello flow'); + }, + ); + final script = WorkflowScript( + name: 'runtime.ref.no-args.script', + run: (context) async => 'hello script', + ); + + final workflowApp = await StemWorkflowApp.inMemory( + flows: [flow], + scripts: [script], + ); + try { + 
+        await workflowApp.start();
+
+        final flowResult = await flow.startAndWait(
+          workflowApp,
+          timeout: const Duration(seconds: 2),
+        );
+        final scriptRunId = await script.start(workflowApp.runtime);
+        final scriptResult = await script.waitFor(
+          workflowApp.runtime,
+          scriptRunId,
+          timeout: const Duration(seconds: 2),
+        );
+
+        expect(flowResult?.value, 'hello flow');
+        expect(scriptResult?.value, 'hello script');
+      } finally {
+        await workflowApp.shutdown();
+      }
+    });
+
+    test('workflow refs build explicit start calls', () async {
+      final flow = Flow(
+        name: 'runtime.ref.builder.flow',
+        build: (builder) {
+          builder.step('hello', (ctx) async {
+            final name = ctx.params['name'] as String? ?? 'world';
+            return 'hello $name';
+          });
+        },
+      );
+      final script = WorkflowScript(
+        name: 'runtime.ref.builder.script',
+        run: (context) async => 'hello script',
+      );
+
+      final workflowRef = flow.ref<Map<String, Object?>>(
+        encodeParams: (params) => params,
+      );
+      final workflowApp = await StemWorkflowApp.inMemory(
+        flows: [flow],
+        scripts: [script],
+      );
+      try {
+        await workflowApp.start();
+
+        final builtFlowCall = workflowRef.buildStart(
+          params: const {'name': 'builder'},
+          ttl: const Duration(minutes: 5),
+          parentRunId: 'parent-builder',
+        );
+        final runId = await workflowApp.runtime.startWorkflowCall(
+          builtFlowCall,
+        );
+        final result = await workflowRef.waitFor(
+          workflowApp.runtime,
+          runId,
+          timeout: const Duration(seconds: 2),
+        );
+        final state = await workflowApp.getRun(runId);
+
+        expect(builtFlowCall.parentRunId, 'parent-builder');
+        expect(builtFlowCall.ttl, const Duration(minutes: 5));
+        expect(result?.value, 'hello builder');
+        expect(state?.parentRunId, 'parent-builder');
+
+        final builtScriptCall = script.ref0().asRef.buildStart(
+          params: (),
+          cancellationPolicy: const WorkflowCancellationPolicy(
+            maxRunDuration: Duration(seconds: 5),
+          ),
+        );
+        final scriptRunId = await workflowApp.runtime.startWorkflowCall(
+          builtScriptCall,
+        );
+        final oneShot = await builtScriptCall.definition.waitFor(
+          workflowApp.runtime,
+          scriptRunId,
+          timeout: const Duration(seconds: 2),
+        );
+
+        expect(
+          builtScriptCall.cancellationPolicy?.maxRunDuration,
+          const Duration(seconds: 5),
+        );
+        expect(oneShot?.value, 'hello script');
+      } finally {
+        await workflowApp.shutdown();
+      }
+    });
+
+    test('workflow refs build explicit workflow start calls', () async {
+      final flow = Flow(
+        name: 'runtime.ref.bound.builder.flow',
+        build: (builder) {
+          builder.step('hello', (ctx) async {
+            final name = ctx.params['name'] as String? ?? 'world';
+            return 'hello $name';
+          });
+        },
+      );
+      final script = WorkflowScript(
+        name: 'runtime.ref.bound.builder.script',
+        run: (context) async => 'hello script',
+      );
+
+      final workflowRef = flow.ref<Map<String, Object?>>(
+        encodeParams: (params) => params,
+      );
+      final scriptRef = script.ref0();
+
+      final workflowApp = await StemWorkflowApp.inMemory(
+        flows: [flow],
+        scripts: [script],
+      );
+      try {
+        await workflowApp.start();
+
+        final builtFlowCall = workflowRef.buildStart(
+          params: const {'name': 'builder'},
+          ttl: const Duration(minutes: 5),
+          parentRunId: 'parent-bound',
+        );
+        final runId = await workflowApp.runtime.startWorkflowCall(
+          builtFlowCall,
+        );
+        final result = await workflowRef.waitFor(
+          workflowApp.runtime,
+          runId,
+          timeout: const Duration(seconds: 2),
+        );
+        final state = await workflowApp.getRun(runId);
+
+        expect(builtFlowCall.parentRunId, 'parent-bound');
+        expect(builtFlowCall.ttl, const Duration(minutes: 5));
+        expect(result?.value, 'hello builder');
+        expect(state?.parentRunId, 'parent-bound');
+
+        final builtScriptCall = scriptRef.asRef.buildStart(
+          params: (),
+          cancellationPolicy: const WorkflowCancellationPolicy(
+            maxRunDuration: Duration(seconds: 5),
+          ),
+        );
+        final scriptRunId = await workflowApp.runtime.startWorkflowCall(
+          builtScriptCall,
+        );
+        final oneShot = await builtScriptCall.definition.waitFor(
+          workflowApp.runtime,
+          scriptRunId,
+          timeout: const Duration(seconds: 2),
+        );
+
+        expect(
+          builtScriptCall.cancellationPolicy?.maxRunDuration,
+          const Duration(seconds: 5),
+        );
+        expect(oneShot?.value, 'hello script');
+      } finally {
+        await workflowApp.shutdown();
+      }
+    });
+
+    test('typed workflow events emit directly from the event ref', () async {
+      final flow = Flow(
+        name: 'runtime.ref.event.flow',
+        build: (builder) {
+          builder.step('wait', (ctx) async {
+            final payload = _userUpdatedEvent.waitValue(ctx);
+            if (payload == null) {
+              return null;
+            }
+            return 'hello ${payload.name}';
+          });
+        },
+      );
+
+      final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+      try {
+        await workflowApp.start();
+
+        final runId = await flow.ref0().start(workflowApp);
+        await workflowApp.runtime.executeRun(runId);
+
+        await _userUpdatedEvent.emit(
+          workflowApp,
+          const _GreetingParams(name: 'event'),
+        );
+        await workflowApp.runtime.executeRun(runId);
+
+        final result = await workflowApp.waitForCompletion(
+          runId,
+          timeout: const Duration(seconds: 2),
+        );
+
+        expect(result?.value, 'hello event');
+      } finally {
+        await workflowApp.shutdown();
+      }
+    });
+
+    test(
+      'typed workflow event calls emit from the prebuilt call surface',
+      () async {
+        final flow = Flow(
+          name: 'runtime.ref.event.call.flow',
+          build: (builder) {
+            builder.step('wait', (ctx) async {
+              final payload = await _userUpdatedEvent.wait(ctx);
+              return 'hello ${payload.name}';
+            });
+          },
+        );
+
+        final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+        try {
+          await workflowApp.start();
+
+          final runId = await flow.ref0().start(workflowApp);
+          await workflowApp.runtime.executeRun(runId);
+
+          await _userUpdatedEvent.emit(
+            workflowApp,
+            const _GreetingParams(name: 'call'),
+          );
+          await workflowApp.runtime.executeRun(runId);
+
+          final result = await workflowApp.waitForCompletion(
+            runId,
+            timeout: const Duration(seconds: 2),
+          );
+
+          expect(result?.value, 'hello call');
+        } finally {
+          await workflowApp.shutdown();
+        }
+      },
+    );
+
+    test('workflow event emitters expose bound event calls', () async {
+      final flow = Flow(
+        name: 'runtime.ref.event.bound.flow',
+        build: (builder) {
+          builder.step('wait', (ctx) async {
+            final payload = _userUpdatedEvent.waitValue(ctx);
+            if (payload == null) {
+              return null;
+            }
+            return 'hello ${payload.name}';
+          });
+        },
+      );
+
+      final workflowApp = await StemWorkflowApp.inMemory(flows: [flow]);
+      try {
+        await workflowApp.start();
+
+        final runId = await flow.ref0().start(workflowApp);
+        await workflowApp.runtime.executeRun(runId);
+
+        expect(_userUpdatedEvent.topic, 'runtime.ref.event');
+
+        await _userUpdatedEvent.emit(
+          workflowApp,
+          const _GreetingParams(name: 'bound'),
+        );
+        await workflowApp.runtime.executeRun(runId);
+
+        final result = await workflowApp.waitForCompletion(
+          runId,
+          timeout: const Duration(seconds: 2),
+        );
+
+        expect(result?.value, 'hello bound');
+      } finally {
+        await workflowApp.shutdown();
+      }
+    });
+  });
+}
diff --git a/packages/stem/test/workflow/workflow_runtime_test.dart b/packages/stem/test/workflow/workflow_runtime_test.dart
index c04b1486..9add7da0 100644
--- a/packages/stem/test/workflow/workflow_runtime_test.dart
+++ b/packages/stem/test/workflow/workflow_runtime_test.dart
@@ -92,6 +92,10 @@ void main() {
         state.workflowParams.containsKey(workflowRuntimeMetadataParamKey),
         isFalse,
       );
+      expect(
+        state.workflowParams.containsKey(workflowParentRunIdParamKey),
+        isFalse,
+      );
       expect(introspection.runtimeEvents, isNotEmpty);
       expect(
         introspection.runtimeEvents.last.type,
@@ -100,7 +104,234 @@ void main() {
     },
   );
 
-  test('viewRunDetail exposes uniform run and step views', () async {
+  test(
+    'startWorkflow persists parent run id without exposing it to handlers',
+    () async {
+      runtime.registerWorkflow(
+        Flow(
+          name: 'parent.runtime.workflow',
+          build: (flow) {
+            flow.step('inspect', (context) async => context.params);
+          },
+        ).definition,
+      );
+
+      final runId = await runtime.startWorkflow(
+        'parent.runtime.workflow',
+        parentRunId:
+            'wf-parent',
+        params: const {'tenant': 'acme'},
+      );
+
+      final state = await store.get(runId);
+      expect(state, isNotNull);
+      expect(state!.parentRunId, 'wf-parent');
+      expect(state.workflowParams, equals(const {'tenant': 'acme'}));
+      expect(
+        state.params[workflowParentRunIdParamKey],
+        equals('wf-parent'),
+      );
+    },
+  );
+
+  test('flow context workflows starts typed child workflows', () async {
+    final childRef = WorkflowRef<Map<String, Object?>, String>(
+      name: 'child.runtime.flow',
+      encodeParams: (params) => params,
+    );
+
+    runtime
+      ..registerWorkflow(
+        Flow(
+          name: 'child.runtime.flow',
+          build: (flow) {
+            flow.step('hello', (context) async {
+              final value = context.params['value'] as String? ?? 'child';
+              return 'ok:$value';
+            });
+          },
+        ).definition,
+      )
+      ..registerWorkflow(
+        Flow(
+          name: 'parent.runtime.flow',
+          build: (flow) {
+            flow.step('spawn', (context) async {
+              return childRef.start(
+                context,
+                params: const {'value': 'spawned'},
+              );
+            });
+          },
+        ).definition,
+      );
+
+    final parentRunId = await runtime.startWorkflow('parent.runtime.flow');
+    await runtime.executeRun(parentRunId);
+
+    final parentState = await store.get(parentRunId);
+    final childRunId = parentState!.result! as String;
+    final childState = await store.get(childRunId);
+
+    expect(childState, isNotNull);
+    expect(childState!.workflow, 'child.runtime.flow');
+    expect(childState.parentRunId, parentRunId);
+    expect(childState.workflowParams, equals(const {'value': 'spawned'}));
+  });
+
+  test('script checkpoint workflows starts typed child workflows', () async {
+    final childRef = WorkflowRef<Map<String, Object?>, String>(
+      name: 'child.runtime.script',
+      encodeParams: (params) => params,
+    );
+
+    runtime
+      ..registerWorkflow(
+        Flow(
+          name: 'child.runtime.script',
+          build: (flow) {
+            flow.step('hello', (context) async {
+              final value = context.params['value'] as String? ?? 'child';
+              return 'ok:$value';
+            });
+          },
+        ).definition,
+      )
+      ..registerWorkflow(
+        WorkflowScript(
+          name: 'parent.runtime.script',
+          checkpoints: [
+            WorkflowCheckpoint(name: 'spawn'),
+          ],
+          run: (script) async {
+            return script.step('spawn', (context) async {
+              return childRef.start(
+                context,
+                params: const {'value': 'script-child'},
+              );
+            });
+          },
+        ).definition,
+      );
+
+    final parentRunId = await runtime.startWorkflow('parent.runtime.script');
+    await runtime.executeRun(parentRunId);
+
+    final parentState = await store.get(parentRunId);
+    final childRunId = parentState!.result! as String;
+    final childState = await store.get(childRunId);
+
+    expect(childState, isNotNull);
+    expect(childState!.workflow, 'child.runtime.script');
+    expect(childState.parentRunId, parentRunId);
+    expect(childState.workflowParams, equals(const {'value': 'script-child'}));
+  });
+
+  test(
+    'flow contexts can startAndWait for child workflows directly',
+    () async {
+      final childRef = WorkflowRef<Map<String, Object?>, String>(
+        name: 'child.runtime.wait.flow',
+        encodeParams: (params) => params,
+      );
+
+      runtime
+        ..registerWorkflow(
+          Flow(
+            name: 'child.runtime.wait.flow',
+            build: (flow) {
+              flow.step('hello', (context) async {
+                final value = context.params['value'] as String? ?? 'child';
+                return 'ok:$value';
+              });
+            },
+          ).definition,
+        )
+        ..registerWorkflow(
+          Flow(
+            name: 'parent.runtime.wait.flow',
+            build: (flow) {
+              flow.step('spawn', (context) async {
+                final childResult = await childRef.startAndWait(
+                  context,
+                  params: const {'value': 'spawned'},
+                  timeout: const Duration(seconds: 2),
+                );
+                return {
+                  'childRunId': childResult?.runId,
+                  'childValue': childResult?.value,
+                };
+              });
+            },
+          ).definition,
+        );
+
+      final parentRunId = await runtime.startWorkflow(
+        'parent.runtime.wait.flow',
+      );
+      await runtime.executeRun(parentRunId);
+
+      final parentState = await store.get(parentRunId);
+      final result = Map<String, Object?>.from(parentState!.result! as Map);
+      expect(result['childRunId'], isA<String>());
+      expect(result['childValue'], 'ok:spawned');
+    },
+  );
+
+  test(
+    'script checkpoints can startAndWait for child workflows directly',
+    () async {
+      final childRef = WorkflowRef<Map<String, Object?>, String>(
+        name: 'child.runtime.wait.script',
+        encodeParams: (params) => params,
+      );
+
+      runtime
+        ..registerWorkflow(
+          Flow(
+            name: 'child.runtime.wait.script',
+            build: (flow) {
+              flow.step('hello', (context) async {
+                final value = context.params['value'] as String? ?? 'child';
+                return 'ok:$value';
+              });
+            },
+          ).definition,
+        )
+        ..registerWorkflow(
+          WorkflowScript<Map<String, Object?>>(
+            name: 'parent.runtime.wait.script',
+            checkpoints: [WorkflowCheckpoint(name: 'spawn')],
+            run: (script) async {
+              return script.step<Map<String, Object?>>('spawn', (
+                context,
+              ) async {
+                final childResult = await childRef.startAndWait(
+                  context,
+                  params: const {'value': 'script-child'},
+                  timeout: const Duration(seconds: 2),
+                );
+                return {
+                  'childRunId': childResult?.runId,
+                  'childValue': childResult?.value,
+                };
+              });
+            },
+          ).definition,
+        );
+
+      final parentRunId = await runtime.startWorkflow(
+        'parent.runtime.wait.script',
+      );
+      await runtime.executeRun(parentRunId);
+
+      final parentState = await store.get(parentRunId);
+      final result = Map<String, Object?>.from(parentState!.result! as Map);
+      expect(result['childRunId'], isA<String>());
+      expect(result['childValue'], 'ok:script-child');
+    },
+  );
+
+  test('viewRunDetail exposes uniform run and checkpoint views', () async {
     runtime.registerWorkflow(
       Flow(
         name: 'views.workflow',
@@ -117,9 +348,9 @@ void main() {
     expect(detail, isNotNull);
     expect(detail!.run.runId, equals(runId));
     expect(detail.run.workflow, equals('views.workflow'));
-    expect(detail.steps, hasLength(1));
-    expect(detail.steps.first.baseStepName, equals('only'));
-    expect(detail.steps.first.stepName, equals('only'));
+    expect(detail.checkpoints, hasLength(1));
+    expect(detail.checkpoints.first.baseCheckpointName, equals('only'));
+    expect(detail.checkpoints.first.checkpointName, equals('only'));
   });
 
   test('workflowManifest exposes typed manifest entries', () {
@@ -287,6 +518,46 @@ void main() {
     expect(completed?.result, 'resumed');
   });
 
+  test('sleepFor suspends and resumes without manual guards', () async {
+    runtime.registerWorkflow(
+      Flow(
+        name: 'sleep.expression.workflow',
+        build: (flow) {
+          flow
+            ..step('wait', (context) async {
+              await context.sleepFor(
+                duration: const Duration(milliseconds: 20),
+              );
+              return 'slept';
+            })
+            ..step(
+              'complete',
+              (context) async => '${context.previousResult}-done',
+            );
+        },
+      ).definition,
+    );
+
+    final runId = await runtime.startWorkflow('sleep.expression.workflow');
+    await runtime.executeRun(runId);
+
+    final suspended = await store.get(runId);
+    expect(suspended?.status, WorkflowStatus.suspended);
+    expect(suspended?.resumeAt, isNotNull);
+
+    clock.advance(const Duration(milliseconds: 30));
+    final due = await store.dueRuns(clock.now());
+    for (final id in due) {
+      final state = await store.get(id);
+      await store.markResumed(id, data: state?.suspensionData);
+      await runtime.executeRun(id);
+    }
+
+    final completed = await store.get(runId);
+    expect(completed?.status, WorkflowStatus.completed);
+    expect(completed?.result, 'slept-done');
+  });
+
   test('awaitEvent suspends and resumes with payload', () async {
     String? observedPayload;
 
@@ -323,6 +594,40 @@ void main() {
     expect(observedPayload, 'user-123');
   });
 
+  test('waitForEvent suspends and resumes with payload', () async {
+    String? observedPayload;
+
+    runtime.registerWorkflow(
+      Flow(
+        name: 'event.expression.workflow',
+        build: (flow) {
+          flow.step('wait', (context) async {
+            final payload = await context.waitForEvent<Map<String, Object?>>(
+              topic: 'user.updated.expression',
+            );
+            observedPayload = payload['id'] as String?;
+            return payload['id'];
+          });
+        },
+      ).definition,
+    );
+
+    final runId = await runtime.startWorkflow('event.expression.workflow');
+    await runtime.executeRun(runId);
+
+    final suspended = await store.get(runId);
+    expect(suspended?.status, WorkflowStatus.suspended);
+    expect(suspended?.waitTopic, 'user.updated.expression');
+
+    await runtime.emit('user.updated.expression', const {'id': 'user-789'});
+    await runtime.executeRun(runId);
+
+    final completed = await store.get(runId);
+    expect(completed?.status, WorkflowStatus.completed);
+    expect(observedPayload, 'user-789');
+    expect(completed?.result, 'user-789');
+  });
+
   test('emitValue resumes flows with codec-backed DTO payloads', () async {
     _UserUpdatedEvent? observedPayload;
 
@@ -368,6 +673,301 @@ void main() {
     expect(completed?.result, 'user-typed-1');
   });
 
+  test(
+    'emitJson resumes flows with DTO payloads without a manual map',
+    () async {
+      _UserUpdatedEvent?
+          observedPayload;
+
+      runtime.registerWorkflow(
+        Flow(
+          name: 'event.json.workflow',
+          build: (flow) {
+            flow.step(
+              'wait',
+              (context) async {
+                final resume = context.takeResumeValue<_UserUpdatedEvent>(
+                  codec: _userUpdatedEventCodec,
+                );
+                if (resume == null) {
+                  context.awaitEvent('user.updated.json');
+                  return null;
+                }
+                observedPayload = resume;
+                return resume.id;
+              },
+            );
+          },
+        ).definition,
+      );
+
+      final runId = await runtime.startWorkflow('event.json.workflow');
+      await runtime.executeRun(runId);
+
+      final suspended = await store.get(runId);
+      expect(suspended?.status, WorkflowStatus.suspended);
+      expect(suspended?.waitTopic, 'user.updated.json');
+
+      await runtime.emitJson(
+        'user.updated.json',
+        const _UserUpdatedEvent(id: 'user-json-1'),
+      );
+      await runtime.executeRun(runId);
+
+      final completed = await store.get(runId);
+      expect(completed?.status, WorkflowStatus.completed);
+      expect(observedPayload?.id, 'user-json-1');
+      expect(completed?.result, 'user-json-1');
+    },
+  );
+
+  test(
+    'emitVersionedJson resumes flows with versioned DTO payloads',
+    () async {
+      _UserUpdatedEvent? observedPayload;
+
+      runtime.registerWorkflow(
+        Flow(
+          name: 'event.versioned.json.workflow',
+          build: (flow) {
+            flow.step(
+              'wait',
+              (context) async {
+                final resume = context.takeResumeValue<_UserUpdatedEvent>(
+                  codec: _userUpdatedEventCodec,
+                );
+                if (resume == null) {
+                  context.awaitEvent('user.updated.versioned.json');
+                  return null;
+                }
+                observedPayload = resume;
+                return resume.id;
+              },
+            );
+          },
+        ).definition,
+      );
+
+      final runId = await runtime.startWorkflow(
+        'event.versioned.json.workflow',
+      );
+      await runtime.executeRun(runId);
+
+      final suspended = await store.get(runId);
+      expect(suspended?.status, WorkflowStatus.suspended);
+      expect(suspended?.waitTopic, 'user.updated.versioned.json');
+
+      await runtime.emitVersionedJson(
+        'user.updated.versioned.json',
+        const _UserUpdatedEvent(id: 'user-json-2'),
+        version: 2,
+      );
+      await runtime.executeRun(runId);
+
+      final completed = await store.get(runId);
+      expect(completed?.status, WorkflowStatus.completed);
+      expect(observedPayload?.id, 'user-json-2');
+      expect(completed?.result, 'user-json-2');
+    },
+  );
+
+  test('emitEvent resumes flows with typed workflow event refs', () async {
+    final event = WorkflowEventRef<_UserUpdatedEvent>.codec(
+      topic: 'user.updated.ref',
+      codec: _userUpdatedEventCodec,
+    );
+    _UserUpdatedEvent? observedPayload;
+
+    runtime.registerWorkflow(
+      Flow(
+        name: 'event.ref.workflow',
+        build: (flow) {
+          flow.step(
+            'wait',
+            (context) async {
+              final resume = event.waitValue(context);
+              if (resume == null) {
+                return null;
+              }
+              observedPayload = resume;
+              return resume.id;
+            },
+          );
+        },
+      ).definition,
+    );
+
+    final runId = await runtime.startWorkflow('event.ref.workflow');
+    await runtime.executeRun(runId);
+
+    final suspended = await store.get(runId);
+    expect(suspended?.status, WorkflowStatus.suspended);
+    expect(suspended?.waitTopic, event.topic);
+
+    await event.emit(
+      runtime,
+      const _UserUpdatedEvent(id: 'user-typed-2'),
+    );
+    await runtime.executeRun(runId);
+
+    final completed = await store.get(runId);
+    expect(completed?.status, WorkflowStatus.completed);
+    expect(observedPayload?.id, 'user-typed-2');
+    expect(completed?.result, 'user-typed-2');
+  });
+
+  test(
+    'emitEvent resumes flows with versioned-json workflow event refs',
+    () async {
+      final event = WorkflowEventRef<_UserUpdatedEvent>.versionedJson(
+        topic: 'user.updated.versioned.ref',
+        version: 2,
+        decode: _UserUpdatedEvent.fromVersionedJson,
+        typeName: '_UserUpdatedEvent',
+      );
+      _UserUpdatedEvent? observedPayload;
+
+      runtime.registerWorkflow(
+        Flow(
+          name: 'event.versioned.ref.workflow',
+          build: (flow) {
+            flow.step(
+              'wait',
+              (context) async {
+                final resume = event.waitValue(context);
+                if (resume == null) {
+                  return null;
+                }
+                observedPayload = resume;
+                return resume.id;
+              },
+            );
+          },
+        ).definition,
+      );
+
+      final runId = await runtime.startWorkflow('event.versioned.ref.workflow');
+      await runtime.executeRun(runId);
+
+      final suspended = await store.get(runId);
+      expect(suspended?.status, WorkflowStatus.suspended);
+      expect(suspended?.waitTopic, event.topic);
+
+      await event.emit(
+        runtime,
+        const _UserUpdatedEvent(id: 'user-versioned-ref-2'),
+      );
+      await runtime.executeRun(runId);
+
+      final completed = await store.get(runId);
+      expect(completed?.status, WorkflowStatus.completed);
+      expect(observedPayload?.id, 'user-versioned-ref-2');
+      expect(completed?.result, 'user-versioned-ref-2');
+    },
+  );
+
+  test(
+    'emitEvent resumes flows with registry-backed workflow event refs',
+    () async {
+      final event = WorkflowEventRef<_UserUpdatedEvent>.versionedJsonRegistry(
+        topic: 'user.updated.registry.ref',
+        version: 2,
+        registry: _userUpdatedEventRegistry,
+        typeName: '_UserUpdatedEvent',
+      );
+      _UserUpdatedEvent?
+          observedPayload;
+
+      runtime.registerWorkflow(
+        Flow(
+          name: 'event.registry.ref.workflow',
+          build: (flow) {
+            flow.step(
+              'wait',
+              (context) async {
+                final resume = event.waitValue(context);
+                if (resume == null) {
+                  return null;
+                }
+                observedPayload = resume;
+                return resume.id;
+              },
+            );
+          },
+        ).definition,
+      );
+
+      final runId = await runtime.startWorkflow('event.registry.ref.workflow');
+      await runtime.executeRun(runId);
+
+      final suspended = await store.get(runId);
+      expect(suspended?.status, WorkflowStatus.suspended);
+      expect(suspended?.waitTopic, event.topic);
+
+      await event.emit(
+        runtime,
+        const _UserUpdatedEvent(id: 'user-registry-ref-2'),
+      );
+      await runtime.executeRun(runId);
+
+      final completed = await store.get(runId);
+      expect(completed?.status, WorkflowStatus.completed);
+      expect(observedPayload?.id, 'user-registry-ref-2');
+      expect(completed?.result, 'user-registry-ref-2');
+    },
+  );
+
+  test(
+    'emitEvent resumes flows with versioned-map workflow event refs',
+    () async {
+      final event = WorkflowEventRef<_UserUpdatedEvent>.versionedMap(
+        topic: 'user.updated.versioned.map.ref',
+        encode: (value) => {'user_id': value.id},
+        version: 3,
+        decode: _UserUpdatedEvent.fromVersionedMap,
+        typeName: '_UserUpdatedEvent',
+      );
+      _UserUpdatedEvent? observedPayload;
+
+      runtime.registerWorkflow(
+        Flow(
+          name: 'event.versioned.map.ref.workflow',
+          build: (flow) {
+            flow.step(
+              'wait',
+              (context) async {
+                final resume = event.waitValue(context);
+                if (resume == null) {
+                  return null;
+                }
+                observedPayload = resume;
+                return resume.id;
+              },
+            );
+          },
+        ).definition,
+      );
+
+      final runId = await runtime.startWorkflow(
+        'event.versioned.map.ref.workflow',
+      );
+      await runtime.executeRun(runId);
+
+      final suspended = await store.get(runId);
+      expect(suspended?.status, WorkflowStatus.suspended);
+      expect(suspended?.waitTopic, event.topic);
+
+      await event.emit(
+        runtime,
+        const _UserUpdatedEvent(id: 'user-versioned-map-ref'),
+      );
+      await runtime.executeRun(runId);
+
+      final completed = await store.get(runId);
+      expect(completed?.status, WorkflowStatus.completed);
+      expect(observedPayload?.id, 'user-versioned-map-ref-v3');
+      expect(completed?.result, 'user-versioned-map-ref-v3');
+    },
+  );
+
   test('emit persists payload before worker resumes execution', () async {
     runtime.registerWorkflow(
       Flow(
@@ -742,6 +1342,47 @@ void main() {
     expect(completed?.result, 'user-42');
   });
 
+  test(
+    'script waitForEvent uses named args and resumes with payload',
+    () async {
+      Map<String, Object?>? resumePayload;
+
+      runtime.registerWorkflow(
+        WorkflowScript(
+          name: 'script.event.expression',
+          run: (script) async {
+            final result = await script.step('wait', (step) async {
+              final payload = await step.waitForEvent<Map<String, Object?>>(
+                topic: 'user.updated.expression.script',
+              );
+              resumePayload = payload;
+              return payload['id'];
+            });
+            return result;
+          },
+        ).definition,
+      );
+
+      final runId = await runtime.startWorkflow('script.event.expression');
+      await runtime.executeRun(runId);
+
+      final suspended = await store.get(runId);
+      expect(suspended?.status, WorkflowStatus.suspended);
+      expect(suspended?.waitTopic, 'user.updated.expression.script');
+
+      await runtime.emit(
+        'user.updated.expression.script',
+        const {'id': 'user-43'},
+      );
+      await runtime.executeRun(runId);
+
+      final completed = await store.get(runId);
+      expect(completed?.status, WorkflowStatus.completed);
+      expect(resumePayload?['id'], 'user-43');
+      expect(completed?.result, 'user-43');
+    },
+  );
+
   test('script autoVersion step persists sequential checkpoints', () async {
     final iterations = [];
 
@@ -912,9 +1553,7 @@ void main() {
       name: 'meta.workflow',
       build: (flow) {
         flow.step('dispatch', (context) async {
-          final enqueuer = context.enqueuer;
-          expect(enqueuer, isNotNull);
-          await enqueuer!.enqueue(
+          await context.enqueue(
             taskName,
             meta: const {'custom': 'value'},
           );
@@ -1012,6 +1651,8 @@ void main() {
       },
      ).definition,
    );
+    // Keep this direct call form; cascading a single registration is noisier.
+ // ignore: cascade_invocations runtime.registerWorkflow( Flow( name: 'logging.complete.workflow', @@ -1090,10 +1731,11 @@ void main() { name: 'meta.builder.workflow', build: (flow) { flow.step('dispatch', (context) async { - await TaskEnqueueBuilder( - definition: definition, - args: const {}, - ).meta('origin', 'builder').enqueueWith(stem); + final call = definition.buildCall( + const {}, + meta: const {'origin': 'builder'}, + ); + await stem.enqueueCall(call); return 'done'; }); }, @@ -1148,20 +1790,47 @@ class _RecordingLogDriver extends LogDriver { } } -final _userUpdatedEventCodec = PayloadCodec<_UserUpdatedEvent>( - encode: (value) => value.toJson(), +const _userUpdatedEventCodec = PayloadCodec<_UserUpdatedEvent>.json( decode: _UserUpdatedEvent.fromJson, + typeName: '_UserUpdatedEvent', ); class _UserUpdatedEvent { const _UserUpdatedEvent({required this.id}); - final String id; + factory _UserUpdatedEvent.fromJson(Map json) { + return _UserUpdatedEvent(id: json['id']! as String); + } - Map toJson() => {'id': id}; + factory _UserUpdatedEvent.fromVersionedJson( + Map json, + int version, + ) { + expect(version, 2); + return _UserUpdatedEvent(id: json['id'] as String); + } - static _UserUpdatedEvent fromJson(Object? payload) { - final json = payload! 
as Map; + factory _UserUpdatedEvent.fromV2Json(Map json) { return _UserUpdatedEvent(id: json['id'] as String); } + + factory _UserUpdatedEvent.fromVersionedMap( + Map json, + int version, + ) { + expect(version, 3); + return _UserUpdatedEvent(id: '${json['user_id'] as String}-v$version'); + } + + final String id; + + Map toJson() => {'id': id}; } + +const _userUpdatedEventRegistry = PayloadVersionRegistry<_UserUpdatedEvent>( + decoders: <int, _UserUpdatedEvent Function(Map<String, Object?>)>{ + 1: _UserUpdatedEvent.fromJson, + 2: _UserUpdatedEvent.fromV2Json, + }, + defaultVersion: 1, +); diff --git a/packages/stem/tool/proxy_runtime_check.dart b/packages/stem/tool/proxy_runtime_check.dart index c33db2b8..9c0ef764 100644 --- a/packages/stem/tool/proxy_runtime_check.dart +++ b/packages/stem/tool/proxy_runtime_check.dart @@ -3,7 +3,11 @@ import 'dart:io'; import 'package:stem/stem.dart'; class ScriptDef { - Future run(WorkflowScriptContext script) async { + Future run({WorkflowScriptContext? context}) async { + assert( + context == null || context.runId.isNotEmpty, + 'workflow context should carry a runId', + ); return sendEmail('user@example.com'); } @@ -42,7 +46,7 @@ Future main() async { runtime.registerWorkflow( WorkflowScript( name: 'proxy.script', - run: (script) => ScriptProxy(script).run(script), + run: (script) => ScriptProxy(script).run(context: script), ).definition, ); @@ -50,10 +54,12 @@ await runtime.executeRun(runId); final detail = await runtime.viewRunDetail(runId); stdout.writeln( - 'result=${detail?.run.result} checkpoints=${detail?.steps.length}', + 'result=${detail?.run.result} checkpoints=${detail?.checkpoints.length}', ); - if ((detail?.steps.length ?? 0) > 0) { - stdout.writeln('checkpointName=${detail!.steps.first.stepName}'); + if ((detail?.checkpoints.length ??
0) > 0) { + stdout.writeln( + 'checkpointName=${detail!.checkpoints.first.checkpointName}', + ); } await runtime.dispose(); diff --git a/packages/stem_adapter_tests/pubspec.yaml b/packages/stem_adapter_tests/pubspec.yaml index 0819484c..01f091c6 100644 --- a/packages/stem_adapter_tests/pubspec.yaml +++ b/packages/stem_adapter_tests/pubspec.yaml @@ -7,7 +7,7 @@ environment: sdk: ">=3.9.2 <4.0.0" dependencies: - stem: ^0.1.1 + stem: ^0.2.0 test: ^1.29.0 dev_dependencies: diff --git a/packages/stem_builder/CHANGELOG.md b/packages/stem_builder/CHANGELOG.md index aab805a1..001006de 100644 --- a/packages/stem_builder/CHANGELOG.md +++ b/packages/stem_builder/CHANGELOG.md @@ -1,20 +1,24 @@ # Changelog +## 0.2.0 + +- Generated output now centers on `stemModule`, `StemWorkflowDefinitions`, and + `StemTaskDefinitions` with the same narrowed happy-path APIs as `stem` + itself. +- Generated child-workflow examples and docs now prefer direct + `start(...)` / `startAndWait(...)` helpers in durable boundaries, leaving + explicit transport objects as the advanced path. +- Generated DTO payload codecs now use the shorter `json(...)`, `map(...)`, + and registry-backed versioned factories where appropriate. +- Builder diagnostics now catch duplicate/conflicting workflow checkpoint + names and redundant manual `script.step(...)` wrappers around annotated + checkpoints, including context-aware cases. +- Annotated workflow/task generation now supports shared execution contexts, + direct typed starter output, and bundle-first bootstrap guidance that aligns + with the `StemClient`-first runtime model. + ## 0.1.0 -- Switched generated output to a bundle-first surface with `stemModule`, `StemWorkflowDefinitions`, `StemTaskDefinitions`, generated typed wait helpers, and payload codec generation for DTO-backed workflow/task APIs. -- Added builder diagnostics for duplicate or conflicting annotated workflow checkpoint names and refreshed generated examples around typed workflow refs. 
-- Added typed workflow starter generation and app helper output for annotated - workflow/task definitions. -- Switched generated output to per-file `part` generation using `.stem.g.dart` - files instead of a shared standalone registry file. -- Added support for plain `@WorkflowRun` entrypoints and configurable starter - naming in generated APIs. -- Refreshed the builder README, example package, and annotated workflow demos - to match the generated `tasks:`-first runtime wiring. -- Switched generated script metadata from `steps:` to `checkpoints:` and - expanded docs/examples around direct step calls, context injection, and - serializable parameter rules. - Initial builder for annotated Stem workflow/task registries. - Expanded the registry builder implementation and hardened generation output. - Added build configuration, analysis options, and tests for registry builds. diff --git a/packages/stem_builder/README.md b/packages/stem_builder/README.md index c4a1797a..6f99afa2 100644 --- a/packages/stem_builder/README.md +++ b/packages/stem_builder/README.md @@ -54,19 +54,20 @@ class HelloScript { @TaskDefn(name: 'hello.task') Future helloTask( - TaskInvocationContext context, String email, + {TaskExecutionContext? context} ) async { // ... } ``` -Script workflows can use a plain `run(...)` method (no extra annotation -required). `@WorkflowRun` is still supported for backward compatibility. -`run(...)` may optionally take `WorkflowScriptContext` as its first parameter, -followed by required positional serializable parameters. +Script workflows can use a plain `run(...)` method with no extra annotation. +When you need runtime metadata, add an optional named +`WorkflowScriptContext? context` parameter. Direct annotated checkpoint +calls remain the default path.
-The intended usage is to call annotated step methods directly from `run(...)`: +The intended usage is to call annotated checkpoint methods directly from +`run(...)`: ```dart Future<Map<String, Object?>> run(String email) async { @@ -87,23 +88,39 @@ Conceptually: - script workflows: `run(...)` is the execution plan, and declared checkpoints are metadata for manifests/tooling -Choose the entry shape based on whether you need step context: +Script workflows use one entry model: -- plain direct-call style - - `Future run(String email, ...)` - - use when annotated step methods only need serializable parameters -- context-aware style - - `@WorkflowRun()` - - `Future run(WorkflowScriptContext script, String email, ...)` - - use when you need to enter through `script.step(...)` so the step body can - receive `WorkflowScriptStepContext` +- start with a plain direct-call `run(String email, ...)` +- add an optional named injected context when you need runtime metadata + - `Future run(String email, {WorkflowScriptContext? context})` + - `Future checkpoint(String email, {WorkflowExecutionContext?
context})` +- direct annotated checkpoint calls stay the default path Supported context injection points: -- flow steps: `FlowContext` +- flow steps: `FlowContext` or `WorkflowExecutionContext` - script runs: `WorkflowScriptContext` -- script steps: `WorkflowScriptStepContext` -- tasks: `TaskInvocationContext` +- script checkpoints: `WorkflowScriptStepContext` or + `WorkflowExecutionContext` +- tasks: `TaskExecutionContext` + +Durable workflow execution contexts enqueue tasks directly: + +- `WorkflowExecutionContext.enqueue(...)` +- typed task definitions can target those contexts via `enqueue(...)` + +Child workflows should be started from durable boundaries: + +- `ref.start(context, params: value)` inside flow steps +- `ref.startAndWait(context, params: value)` inside script checkpoints +- pass `ttl:`, `parentRunId:`, or `cancellationPolicy:` directly to + `ref.start(...)` / `ref.startAndWait(...)` for the normal override cases +- build an explicit transport request with `ref.buildStart(...)` only for the + rarer low-level cases where you need to pass a `WorkflowStartCall` around + +Avoid starting child workflows directly from the raw +`WorkflowScriptContext` body unless you are explicitly handling replay +semantics yourself. Serializable parameter rules are enforced by the generator: @@ -115,7 +132,7 @@ Serializable parameter rules are enforced by the generator: - Dart classes with `toJson()` plus a named `fromJson(...)` constructor taking `Map` - unsupported directly: - - optional/named parameters on generated workflow/task entrypoints + - optional/named business parameters on generated workflow/task entrypoints Typed task results can use the same DTO convention. 
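To make the DTO convention above concrete, here is a minimal sketch (class and field names are hypothetical, not part of any generated API) of a payload type that satisfies the generator's rules: a `toJson()` method for encoding plus a named `fromJson(...)` constructor that takes the decoded JSON map:

```dart
// Hypothetical DTO the generator would accept as a workflow/task parameter
// or typed task result: toJson() plus a named fromJson(...) factory.
class SignupRequest {
  const SignupRequest({required this.email});

  factory SignupRequest.fromJson(Map<String, Object?> json) {
    return SignupRequest(email: json['email']! as String);
  }

  final String email;

  Map<String, Object?> toJson() => {'email': email};
}
```

Any required positional parameter of such a type round-trips through the generated codecs without extra wiring.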
@@ -131,8 +148,8 @@ The intended DX is: - pass generated `stemModule` into `StemWorkflowApp` or `StemClient` - start workflows through generated workflow refs instead of raw workflow-name strings -- enqueue annotated tasks through generated `enqueueXxx(...)` helpers instead - of raw task-name strings +- enqueue annotated tasks through generated task definitions instead of raw + task-name strings You can customize generated workflow ref names via `@WorkflowDefn`: @@ -154,18 +171,25 @@ Run build_runner to generate `*.stem.g.dart` part files: dart run build_runner build ``` -The generated part exports a bundle plus typed helpers so you can avoid raw -workflow-name and task-name strings (for example -`StemWorkflowDefinitions.userSignup.call((email: 'user@example.com'))` or -`stem.enqueueBuilderExampleTask(args: {...})`). +The generated part exports a bundle plus typed refs/definitions so you can +avoid raw workflow-name and task-name strings (for example +`StemWorkflowDefinitions.userSignup.start( +workflowApp, +params: 'user@example.com', +)` +or `StemTaskDefinitions.builderExamplePing.enqueue(stem)`). Generated output includes: - `stemModule` - `StemWorkflowDefinitions` - `StemTaskDefinitions` -- typed enqueue helpers on `TaskEnqueuer` -- typed result wait helpers on `Stem` +- typed `TaskDefinition` objects whose advanced explicit transport path uses + `TaskCall`, alongside direct `enqueue(...)` / `enqueueAndWait(...)` + +Generated task definitions are producer-safe. `Stem.enqueueCall(...)` can use +the definition metadata directly, so a producer can publish typed task calls +without registering the worker handler locally first. 
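As a sketch of that producer-safe path (assuming a wired-up `stem` client; the args and meta values here are illustrative only), the explicit-transport flow mirrors the usage shown in the test suite:

```dart
// Producer-side dispatch: build an explicit TaskCall from the generated
// definition, then hand it to the client. The worker handler does not
// need to be registered locally for the enqueue to succeed.
final call = StemTaskDefinitions.builderExampleTask.buildCall(
  const {'kind': 'welcome'},
  meta: const {'origin': 'producer'},
);
final taskId = await stem.enqueueCall(call);
```

The direct `enqueue(...)` helpers remain the happy path; `buildCall(...)` is only for cases where the call object itself needs to be passed around.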
## Wiring Into StemWorkflowApp @@ -177,26 +201,54 @@ final workflowApp = await StemWorkflowApp.fromUrl( module: stemModule, ); -final result = await StemWorkflowDefinitions.userSignup - .call((email: 'user@example.com')) - .startAndWaitWithApp(workflowApp); +final result = await StemWorkflowDefinitions.userSignup.startAndWait( + workflowApp, + params: 'user@example.com', +); ``` +When you use `module: stemModule`, the workflow app infers the worker +subscription from the workflow queue plus the default queues declared on the +bundled task handlers. Override `workerConfig.subscription` only when your +routing sends work to additional queues. + If your application already owns a `StemApp`, reuse it: ```dart final stemApp = await StemApp.fromUrl( 'redis://localhost:6379', adapters: const [StemRedisAdapter()], - tasks: stemModule.tasks, + module: stemModule, + workerConfig: StemWorkerConfig( + queue: 'workflow', + subscription: RoutingSubscription( + queues: ['workflow', 'default'], + ), + ), ); -final workflowApp = await StemWorkflowApp.create( - stemApp: stemApp, +final workflowApp = await stemApp.createWorkflowApp(); +``` + +That shared-app path only works when the existing `StemApp` worker already +subscribes to the workflow queue plus any task queues the workflows need. +If you want subscription inference, prefer `StemClient.createWorkflowApp()`. + +For task-only services, use the same bundle directly with `StemApp`: + +```dart +final taskApp = await StemApp.fromUrl( + 'redis://localhost:6379', + adapters: const [StemRedisAdapter()], module: stemModule, ); ``` +Plain `StemApp` bootstrap also infers task queue subscriptions from the +bundled or explicitly supplied task handlers when +`workerConfig.subscription` is omitted, and it lazy-starts on the first +enqueue or wait call. 
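For example, the task-only bootstrap from the example package relies on that lazy start, with no explicit start call before the first enqueue (in-memory setup shown; a URL-based bootstrap behaves the same way):

```dart
final taskApp = await StemApp.inMemory(module: stemModule);
try {
  // The first enqueueAndWait lazy-starts the managed worker.
  final result = await StemTaskDefinitions.builderExamplePing.enqueueAndWait(
    taskApp,
    timeout: const Duration(seconds: 2),
  );
  print('ping result: ${result?.value}');
} finally {
  await taskApp.shutdown();
}
```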
+ If you already centralize wiring in a `StemClient`, prefer the shared-client path: @@ -204,26 +256,34 @@ path: final client = await StemClient.fromUrl( 'redis://localhost:6379', adapters: const [StemRedisAdapter()], + module: stemModule, ); -final workflowApp = await client.createWorkflowApp(module: stemModule); +final workflowApp = await client.createWorkflowApp(); ``` -The generated workflow refs work on `WorkflowRuntime` too: +If you reuse an existing `StemApp`, its worker subscription stays in charge. +Workflow-side queue inference only applies when `StemWorkflowApp` is creating +the worker for you. + +When you are intentionally using the low-level `WorkflowRuntime`, the +generated workflow refs work there too: ```dart final runtime = workflowApp.runtime; -final runId = await StemWorkflowDefinitions.userSignup - .call((email: 'user@example.com')) - .startWithRuntime(runtime); -await runtime.executeRun(runId); +final runId = await StemWorkflowDefinitions.userSignup.start( + runtime, + params: 'user@example.com', +); +await workflowApp.executeRun(runId); ``` -Annotated tasks also get generated definitions and enqueue helpers: +Annotated tasks also get generated definitions: ```dart -final taskId = await workflowApp.app.stem.enqueueBuilderExampleTask( - args: const {'kind': 'welcome'}, +final taskId = await StemTaskDefinitions.builderExampleTask.enqueue( + workflowApp, + const {'kind': 'welcome'}, ); ``` @@ -233,5 +293,5 @@ See [`example/README.md`](example/README.md) for runnable examples, including: - Generated registration + execution with `StemWorkflowApp` - Runtime manifest + run detail views with `WorkflowRuntime` -- Plain direct-call script steps and context-aware script steps -- Typed `@TaskDefn` parameters with `TaskInvocationContext` +- Plain direct-call script checkpoints and context-aware script checkpoints +- Typed `@TaskDefn` parameters with `TaskExecutionContext` diff --git a/packages/stem_builder/example/README.md 
b/packages/stem_builder/example/README.md index afca9303..78e7f039 100644 --- a/packages/stem_builder/example/README.md +++ b/packages/stem_builder/example/README.md @@ -5,9 +5,12 @@ This example demonstrates: - Annotated workflow/task definitions - Generated `stemModule` - Generated typed workflow refs (no manual workflow-name strings): - - `StemWorkflowDefinitions.flow.call(...).startWithRuntime(runtime)` - - `StemWorkflowDefinitions.userSignup.call(...).startWithRuntime(runtime)` -- Generated typed task definitions, enqueue helpers, and typed result wait helpers + - `StemWorkflowDefinitions.flow.start(runtime, params: (...))` + - `StemWorkflowDefinitions.userSignup.start(runtime, params: (...))` +- Generated typed task definitions that use the shared `TaskCall` / + `TaskDefinition.waitFor(...)` APIs +- Generated zero-arg task definitions with direct helpers from core: + - `StemTaskDefinitions.builderExamplePing.enqueueAndWait(stem)` - Generated workflow manifest via `stemModule.workflowManifest` - Running generated definitions through `StemWorkflowApp` - Runtime manifest + run/step metadata views via `WorkflowRuntime` @@ -34,5 +37,6 @@ The generated bundle is the default integration surface: - `StemWorkflowApp.inMemory(module: stemModule)` - `StemWorkflowApp.fromUrl(..., module: stemModule)` -- `StemWorkflowApp.create(stemApp: ..., module: stemModule)` -- `StemClient.createWorkflowApp(module: stemModule)` +- `stemApp.createWorkflowApp()` when the shared app already covers the workflow + queue +- `StemClient.fromUrl(..., module: stemModule)` + `createWorkflowApp()` diff --git a/packages/stem_builder/example/bin/main.dart b/packages/stem_builder/example/bin/main.dart index 0d14ada8..9d4481af 100644 --- a/packages/stem_builder/example/bin/main.dart +++ b/packages/stem_builder/example/bin/main.dart @@ -18,18 +18,18 @@ Future main() async { final app = await StemWorkflowApp.inMemory(module: stemModule); try { - final runtime = app.runtime; - final runtimeManifest = 
runtime + final runtimeManifest = app .workflowManifest() .map((entry) => entry.toJson()) .toList(growable: false); print('\nRuntime manifest:'); print(const JsonEncoder.withIndent(' ').convert(runtimeManifest)); - final runId = await StemWorkflowDefinitions.flow - .call(const {'name': 'Stem Builder'}) - .startWithRuntime(runtime); - await runtime.executeRun(runId); + final runId = await StemWorkflowDefinitions.flow.start( + app, + params: 'Stem Builder', + ); + await app.executeRun(runId); final result = await StemWorkflowDefinitions.flow.waitFor( app, runId, @@ -39,4 +39,16 @@ Future main() async { } finally { await app.close(); } + + final taskApp = await StemApp.inMemory(module: stemModule); + try { + final taskResult = await StemTaskDefinitions.builderExamplePing + .enqueueAndWait( + taskApp, + timeout: const Duration(seconds: 2), + ); + print('\nNo-arg task result: ${taskResult?.value}'); + } finally { + await taskApp.shutdown(); + } } diff --git a/packages/stem_builder/example/bin/runtime_metadata_views.dart b/packages/stem_builder/example/bin/runtime_metadata_views.dart index 1aacdee4..1c81bc83 100644 --- a/packages/stem_builder/example/bin/runtime_metadata_views.dart +++ b/packages/stem_builder/example/bin/runtime_metadata_views.dart @@ -25,17 +25,20 @@ Future main() async { ), ); - final flowRunId = await StemWorkflowDefinitions.flow - .call(const {'name': 'runtime metadata'}) - .startWithRuntime(runtime); - await runtime.executeRun(flowRunId); + final flowRunId = await StemWorkflowDefinitions.flow.start( + runtime, + params: 'runtime metadata', + ); + await app.executeRun(flowRunId); final scriptRunId = await StemWorkflowDefinitions.userSignup - .call((email: 'dev@stem.dev')) - .startWithRuntime(runtime); - await runtime.executeRun(scriptRunId); + .start( + runtime, + params: 'dev@stem.dev', + ); + await app.executeRun(scriptRunId); - final runViews = await runtime.listRunViews(limit: 10); + final runViews = await app.listRunViews(limit: 10); 
print('\n--- Run views ---'); print( const JsonEncoder.withIndent( @@ -43,8 +46,8 @@ Future main() async { ).convert(runViews.map((view) => view.toJson()).toList()), ); - final flowDetail = await runtime.viewRunDetail(flowRunId); - final scriptDetail = await runtime.viewRunDetail(scriptRunId); + final flowDetail = await app.viewRunDetail(flowRunId); + final scriptDetail = await app.viewRunDetail(scriptRunId); print('\n--- Flow run detail ---'); print(const JsonEncoder.withIndent(' ').convert(flowDetail?.toJson())); diff --git a/packages/stem_builder/example/lib/definitions.dart b/packages/stem_builder/example/lib/definitions.dart index b8980ede..ac0f464e 100644 --- a/packages/stem_builder/example/lib/definitions.dart +++ b/packages/stem_builder/example/lib/definitions.dart @@ -36,3 +36,6 @@ Future builderExampleTask( TaskInvocationContext context, Map args, ) async {} + +@TaskDefn(name: 'builder.example.ping') +Future builderPingTask() async => 'pong'; diff --git a/packages/stem_builder/example/lib/definitions.stem.g.dart b/packages/stem_builder/example/lib/definitions.stem.g.dart index 2f87ebb1..ede891a6 100644 --- a/packages/stem_builder/example/lib/definitions.stem.g.dart +++ b/packages/stem_builder/example/lib/definitions.stem.g.dart @@ -8,7 +8,7 @@ final List _stemFlows = [ name: "builder.example.flow", build: (flow) { final impl = BuilderExampleFlow(); - flow.step( + flow.step( "greet", (ctx) => impl.greet((_stemRequireArg(ctx.params, "name") as String)), kind: WorkflowStepKind.task, @@ -50,21 +50,18 @@ final List _stemScripts = [ WorkflowScript( name: "builder.example.user_signup", checkpoints: [ - FlowStep( + WorkflowCheckpoint( name: "create-user", - handler: _stemScriptManifestStepNoop, kind: WorkflowStepKind.task, taskNames: [], ), - FlowStep( + WorkflowCheckpoint( name: "send-welcome-email", - handler: _stemScriptManifestStepNoop, kind: WorkflowStepKind.task, taskNames: [], ), - FlowStep( + WorkflowCheckpoint( name: "send-one-week-check-in-email", - 
handler: _stemScriptManifestStepNoop, kind: WorkflowStepKind.task, taskNames: [], ), @@ -76,20 +73,17 @@ ]; abstract final class StemWorkflowDefinitions { - static final WorkflowRef<Map<String, Object?>, String> flow = - WorkflowRef<Map<String, Object?>, String>( - name: "builder.example.flow", - encodeParams: (params) => params, - ); - static final WorkflowRef<({String email}), Map<String, Object?>> userSignup = - WorkflowRef<({String email}), Map<String, Object?>>( + static final WorkflowRef<String, String> flow = WorkflowRef<String, String>( + name: "builder.example.flow", + encodeParams: (params) => {"name": params}, + ); + static final WorkflowRef<String, Map<String, Object?>> userSignup = + WorkflowRef<String, Map<String, Object?>>( name: "builder.example.user_signup", - encodeParams: (params) => {"email": params.email}, + encodeParams: (params) => {"email": params}, ); } -Future _stemScriptManifestStepNoop(FlowContext context) async => null; - Object? _stemRequireArg(Map args, String name) { if (!args.containsKey(name)) { throw ArgumentError('Missing required argument "$name".'); @@ -97,6 +91,13 @@ Object? _stemRequireArg(Map args, String name) { return args[name]; } +Future _stemTaskAdapter0( + TaskInvocationContext context, + Map args, +) async { + return await Future.value(builderPingTask()); +} + abstract final class StemTaskDefinitions { static final TaskDefinition<Map<String, Object?>, Object?> builderExampleTask = TaskDefinition<Map<String, Object?>, Object?>( @@ -105,41 +106,12 @@ abstract final class StemTaskDefinitions { defaultOptions: const TaskOptions(), metadata: const TaskMetadata(), ); -} - -extension StemGeneratedTaskEnqueuer on TaskEnqueuer { - Future enqueueBuilderExampleTask({ - required Map args, - Map headers = const {}, - TaskOptions? options, - DateTime? notBefore, - Map? meta, - TaskEnqueueOptions?
enqueueOptions, - }) { - return enqueueCall( - StemTaskDefinitions.builderExampleTask.call( - args, - headers: headers, - options: options, - notBefore: notBefore, - meta: meta, - enqueueOptions: enqueueOptions, - ), - ); - } -} - -extension StemGeneratedTaskResults on Stem { - Future<Map<String, Object?>?> waitForBuilderExampleTask( - String taskId, { - Duration? timeout, - }) { - return waitForTaskDefinition( - taskId, - StemTaskDefinitions.builderExampleTask, - timeout: timeout, - ); - } + static final NoArgsTaskDefinition builderExamplePing = + NoArgsTaskDefinition( + name: "builder.example.ping", + defaultOptions: const TaskOptions(), + metadata: const TaskMetadata(), + ); } final List<TaskHandler<Object?>> _stemTasks = <TaskHandler<Object?>>[ @@ -149,6 +121,12 @@ final List<TaskHandler<Object?>> _stemTasks = <TaskHandler<Object?>>[ + FunctionTaskHandler( + name: "builder.example.ping", + entrypoint: _stemTaskAdapter0, + options: const TaskOptions(), + metadata: const TaskMetadata(), + ), ]; final List _stemWorkflowManifest = diff --git a/packages/stem_builder/lib/src/stem_registry_builder.dart b/packages/stem_builder/lib/src/stem_registry_builder.dart index 1b715cbd..c0e10e3e 100644 --- a/packages/stem_builder/lib/src/stem_registry_builder.dart +++ b/packages/stem_builder/lib/src/stem_registry_builder.dart @@ -44,6 +44,10 @@ class StemRegistryBuilder implements Builder { FlowContext, inPackage: 'stem', ); + const workflowExecutionContextChecker = TypeChecker.typeNamed( + WorkflowExecutionContext, + inPackage: 'stem', + ); const scriptContextChecker = TypeChecker.typeNamed( WorkflowScriptContext, inPackage: 'stem', @@ -53,7 +57,7 @@ inPackage: 'stem', ); const taskContextChecker = TypeChecker.typeNamed( - TaskInvocationContext, + TaskExecutionContext, inPackage: 'stem', ); const mapChecker = TypeChecker.typeNamed(Map, inSdk: true); @@ -177,6 +181,7 @@ final stepBinding = _validateScriptStepMethod( method,
scriptStepContextChecker, + workflowExecutionContextChecker, ); final stepAnnotation = workflowStepChecker.firstAnnotationOfExact( method, @@ -200,8 +205,12 @@ class StemRegistryBuilder implements Builder { _WorkflowStepInfo( name: stepName, method: method.displayName, - acceptsFlowContext: false, - acceptsScriptStepContext: stepBinding.acceptsContext, + flowContextParameterName: null, + flowContextIsNamed: false, + flowContextTypeCode: null, + scriptStepContextParameterName: stepBinding.contextParameterName, + scriptStepContextIsNamed: stepBinding.contextIsNamed, + scriptStepContextTypeCode: stepBinding.contextTypeCode, valueParameters: stepBinding.valueParameters, returnTypeCode: stepBinding.returnTypeCode, stepValueTypeCode: stepBinding.stepValueTypeCode, @@ -225,7 +234,7 @@ class StemRegistryBuilder implements Builder { classElement, runMethod, scriptSteps, - runAcceptsScriptContext: runBinding.acceptsContext, + runAcceptsScriptContext: runBinding.contextParameterName != null, ); workflows.add( _WorkflowInfo.script( @@ -234,7 +243,8 @@ class StemRegistryBuilder implements Builder { className: classElement.displayName, steps: scriptSteps, runMethod: runMethod.displayName, - runAcceptsScriptContext: runBinding.acceptsContext, + runContextParameterName: runBinding.contextParameterName, + runContextIsNamed: runBinding.contextIsNamed, runValueParameters: runBinding.valueParameters, resultTypeCode: runBinding.resultTypeCode, resultPayloadCodecTypeCode: runBinding.resultPayloadCodecTypeCode, @@ -265,6 +275,7 @@ class StemRegistryBuilder implements Builder { final stepBinding = _validateFlowStepMethod( method, flowContextChecker, + workflowExecutionContextChecker, ); final stepAnnotation = workflowStepChecker.firstAnnotationOfExact( method, @@ -288,8 +299,12 @@ class StemRegistryBuilder implements Builder { _WorkflowStepInfo( name: stepName, method: method.displayName, - acceptsFlowContext: stepBinding.acceptsContext, - acceptsScriptStepContext: false, + 
flowContextParameterName: stepBinding.contextParameterName, + flowContextIsNamed: stepBinding.contextIsNamed, + flowContextTypeCode: stepBinding.contextTypeCode, + scriptStepContextParameterName: null, + scriptStepContextIsNamed: false, + scriptStepContextTypeCode: null, valueParameters: stepBinding.valueParameters, returnTypeCode: null, stepValueTypeCode: stepBinding.stepValueTypeCode, @@ -310,8 +325,9 @@ class StemRegistryBuilder implements Builder { importAlias: '', className: classElement.displayName, steps: steps, - resultTypeCode: - steps.isEmpty ? 'Object?' : (steps.last.stepValueTypeCode ?? 'Object?'), + resultTypeCode: steps.isEmpty + ? 'Object?' + : (steps.last.stepValueTypeCode ?? 'Object?'), resultPayloadCodecTypeCode: steps.isEmpty ? null : steps.last.stepValuePayloadCodecTypeCode, @@ -368,10 +384,12 @@ class StemRegistryBuilder implements Builder { name: taskName, importAlias: '', function: function.displayName, - adapterName: taskBinding.usesLegacyMapArgs + adapterName: + taskBinding.usesLegacyMapArgs && !taskBinding.contextIsNamed ? 
null : '_stemTaskAdapter${taskAdapterIndex++}', - acceptsTaskContext: taskBinding.acceptsContext, + taskContextParameterName: taskBinding.contextParameterName, + taskContextIsNamed: taskBinding.contextIsNamed, valueParameters: taskBinding.valueParameters, usesLegacyMapArgs: taskBinding.usesLegacyMapArgs, resultTypeCode: taskBinding.resultTypeCode, @@ -432,16 +450,19 @@ class StemRegistryBuilder implements Builder { } final parameters = method.formalParameters; - var acceptsContext = false; - var startIndex = 0; - if (parameters.isNotEmpty && - scriptContextChecker.isAssignableFromType(parameters.first.type)) { - acceptsContext = true; - startIndex = 1; - } + final contextParameter = _extractInjectedContextParameter( + parameters, + [scriptContextChecker], + method, + annotationLabel: '@workflow.run method', + contextTypeLabel: 'WorkflowScriptContext', + ); final valueParameters = <_ValueParameterInfo>[]; - for (final parameter in parameters.skip(startIndex)) { + for (final parameter in parameters) { + if (identical(parameter, contextParameter?.parameter)) { + continue; + } if (!parameter.isRequiredPositional) { throw InvalidGenerationSourceError( '@workflow.run method ${method.displayName} only supports required positional serializable or codec-backed parameters after WorkflowScriptContext.', @@ -459,7 +480,8 @@ class StemRegistryBuilder implements Builder { } return _RunBinding( - acceptsContext: acceptsContext, + contextParameterName: contextParameter?.name, + contextIsNamed: contextParameter?.isNamed ?? 
false, valueParameters: valueParameters, resultTypeCode: _workflowResultTypeCode(method.returnType), resultPayloadCodecTypeCode: _workflowResultPayloadCodecTypeCode( @@ -471,6 +493,7 @@ class StemRegistryBuilder implements Builder { static _FlowStepBinding _validateFlowStepMethod( MethodElement method, TypeChecker flowContextChecker, + TypeChecker workflowExecutionContextChecker, ) { if (method.isPrivate) { throw InvalidGenerationSourceError( @@ -479,19 +502,22 @@ class StemRegistryBuilder implements Builder { ); } final parameters = method.formalParameters; - var acceptsContext = false; - var startIndex = 0; - if (parameters.isNotEmpty && - flowContextChecker.isAssignableFromType(parameters.first.type)) { - acceptsContext = true; - startIndex = 1; - } + final contextParameter = _extractInjectedContextParameter( + parameters, + [flowContextChecker, workflowExecutionContextChecker], + method, + annotationLabel: '@workflow.step method', + contextTypeLabel: 'FlowContext or WorkflowExecutionContext', + ); final valueParameters = <_ValueParameterInfo>[]; - for (final parameter in parameters.skip(startIndex)) { + for (final parameter in parameters) { + if (identical(parameter, contextParameter?.parameter)) { + continue; + } if (!parameter.isRequiredPositional) { throw InvalidGenerationSourceError( - '@workflow.step method ${method.displayName} only supports required positional serializable or codec-backed parameters after FlowContext.', + '@workflow.step method ${method.displayName} only supports required positional serializable or codec-backed parameters after FlowContext or WorkflowExecutionContext.', element: method, ); } @@ -506,7 +532,9 @@ class StemRegistryBuilder implements Builder { } return _FlowStepBinding( - acceptsContext: acceptsContext, + contextParameterName: contextParameter?.name, + contextIsNamed: contextParameter?.isNamed ?? 
false, + contextTypeCode: contextParameter?.typeCode, valueParameters: valueParameters, stepValueTypeCode: _workflowResultTypeCode(method.returnType), stepValuePayloadCodecTypeCode: _workflowResultPayloadCodecTypeCode( @@ -518,6 +546,7 @@ class StemRegistryBuilder implements Builder { static _ScriptStepBinding _validateScriptStepMethod( MethodElement method, TypeChecker scriptStepContextChecker, + TypeChecker workflowExecutionContextChecker, ) { if (method.isPrivate) { throw InvalidGenerationSourceError( @@ -537,19 +566,22 @@ class StemRegistryBuilder implements Builder { final stepValueType = _extractStepValueType(returnType); final parameters = method.formalParameters; - var acceptsContext = false; - var startIndex = 0; - if (parameters.isNotEmpty && - scriptStepContextChecker.isAssignableFromType(parameters.first.type)) { - acceptsContext = true; - startIndex = 1; - } + final contextParameter = _extractInjectedContextParameter( + parameters, + [scriptStepContextChecker, workflowExecutionContextChecker], + method, + annotationLabel: '@workflow.step method', + contextTypeLabel: 'WorkflowScriptStepContext or WorkflowExecutionContext', + ); final valueParameters = <_ValueParameterInfo>[]; - for (final parameter in parameters.skip(startIndex)) { + for (final parameter in parameters) { + if (identical(parameter, contextParameter?.parameter)) { + continue; + } if (!parameter.isRequiredPositional) { throw InvalidGenerationSourceError( - '@workflow.step method ${method.displayName} only supports required positional serializable or codec-backed parameters after WorkflowScriptStepContext.', + '@workflow.step method ${method.displayName} only supports required positional serializable or codec-backed parameters after WorkflowScriptStepContext or WorkflowExecutionContext.', element: method, ); } @@ -564,7 +596,9 @@ class StemRegistryBuilder implements Builder { } return _ScriptStepBinding( - acceptsContext: acceptsContext, + contextParameterName: contextParameter?.name, + 
contextIsNamed: contextParameter?.isNamed ?? false, + contextTypeCode: contextParameter?.typeCode, valueParameters: valueParameters, returnTypeCode: _typeCode(returnType), stepValueTypeCode: _typeCode(stepValueType), @@ -578,24 +612,29 @@ class StemRegistryBuilder implements Builder { TypeChecker mapChecker, ) { final parameters = function.formalParameters; - var acceptsContext = false; - var startIndex = 0; - if (parameters.isNotEmpty && - taskContextChecker.isAssignableFromType(parameters.first.type)) { - acceptsContext = true; - startIndex = 1; - } + final contextParameter = _extractInjectedContextParameter( + parameters, + [taskContextChecker], + function, + annotationLabel: '@TaskDefn function', + contextTypeLabel: 'TaskExecutionContext', + ); - final remaining = parameters.skip(startIndex).toList(growable: false); + final remaining = parameters + .where( + (parameter) => !identical(parameter, contextParameter?.parameter), + ) + .toList(growable: false); final legacyMapSignature = - acceptsContext && + contextParameter != null && remaining.length == 1 && mapChecker.isAssignableFromType(remaining.first.type) && _isStringObjectMap(remaining.first.type) && remaining.first.isRequiredPositional; if (legacyMapSignature) { return _TaskBinding( - acceptsContext: true, + contextParameterName: contextParameter.name, + contextIsNamed: contextParameter.isNamed, valueParameters: [], usesLegacyMapArgs: true, resultTypeCode: _taskResultTypeCode(function.returnType), @@ -609,7 +648,7 @@ class StemRegistryBuilder implements Builder { for (final parameter in remaining) { if (!parameter.isRequiredPositional) { throw InvalidGenerationSourceError( - '@TaskDefn function ${function.displayName} only supports required positional serializable or codec-backed parameters after TaskInvocationContext.', + '@TaskDefn function ${function.displayName} only supports required positional serializable or codec-backed parameters after TaskExecutionContext.', element: function, ); } @@ -624,7 
+663,8 @@ class StemRegistryBuilder implements Builder { } return _TaskBinding( - acceptsContext: acceptsContext, + contextParameterName: contextParameter?.name, + contextIsNamed: contextParameter?.isNamed ?? false, valueParameters: valueParameters, usesLegacyMapArgs: false, resultTypeCode: _taskResultTypeCode(function.returnType), @@ -649,6 +689,69 @@ class StemRegistryBuilder implements Builder { ); } + static _InjectedContextParameter? _extractInjectedContextParameter( + List parameters, + List checkers, + Element element, { + required String annotationLabel, + required String contextTypeLabel, + }) { + _InjectedContextParameter? contextParameter; + if (parameters.isNotEmpty && + parameters.first.isRequiredPositional && + _matchesAnyContextType(checkers, parameters.first.type)) { + contextParameter = _InjectedContextParameter( + parameter: parameters.first, + name: parameters.first.displayName, + isNamed: false, + typeCode: _typeCode(parameters.first.type), + ); + } + + for (final parameter in parameters.skip( + contextParameter == null ? 
0 : 1, + )) { + if (!_matchesAnyContextType(checkers, parameter.type)) { + continue; + } + if (contextParameter != null) { + throw InvalidGenerationSourceError( + '$annotationLabel ${element.displayName} may declare at most one ' + '$contextTypeLabel parameter.', + element: element, + ); + } + if (!parameter.isNamed || parameter.isRequiredNamed) { + throw InvalidGenerationSourceError( + '$annotationLabel ${element.displayName} must declare ' + '$contextTypeLabel as the first positional parameter or an ' + 'optional named parameter.', + element: element, + ); + } + contextParameter = _InjectedContextParameter( + parameter: parameter, + name: parameter.displayName, + isNamed: true, + typeCode: _typeCode(parameter.type), + ); + } + + return contextParameter; + } + + static bool _matchesAnyContextType( + List checkers, + DartType type, + ) { + for (final checker in checkers) { + if (checker.isAssignableFromType(type)) { + return true; + } + } + return false; + } + static String _taskResultTypeCode(DartType returnType) { final valueType = _extractAsyncValueType(returnType); if (valueType is VoidType || valueType is NeverType) { @@ -825,7 +928,8 @@ class _WorkflowInfo { this.metadata, }) : kind = WorkflowKind.flow, runMethod = null, - runAcceptsScriptContext = false, + runContextParameterName = null, + runContextIsNamed = false, runValueParameters = const []; _WorkflowInfo.script({ @@ -834,7 +938,8 @@ class _WorkflowInfo { required this.className, required this.steps, required this.runMethod, - required this.runAcceptsScriptContext, + required this.runContextParameterName, + required this.runContextIsNamed, required this.runValueParameters, required this.resultTypeCode, required this.resultPayloadCodecTypeCode, @@ -853,7 +958,8 @@ class _WorkflowInfo { final String resultTypeCode; final String? resultPayloadCodecTypeCode; final String? runMethod; - final bool runAcceptsScriptContext; + final String? 
runContextParameterName; + final bool runContextIsNamed; final List<_ValueParameterInfo> runValueParameters; final String? starterNameOverride; final String? nameFieldOverride; @@ -866,8 +972,12 @@ class _WorkflowStepInfo { const _WorkflowStepInfo({ required this.name, required this.method, - required this.acceptsFlowContext, - required this.acceptsScriptStepContext, + required this.flowContextParameterName, + required this.flowContextIsNamed, + required this.flowContextTypeCode, + required this.scriptStepContextParameterName, + required this.scriptStepContextIsNamed, + required this.scriptStepContextTypeCode, required this.valueParameters, required this.returnTypeCode, required this.stepValueTypeCode, @@ -881,8 +991,12 @@ class _WorkflowStepInfo { final String name; final String method; - final bool acceptsFlowContext; - final bool acceptsScriptStepContext; + final String? flowContextParameterName; + final bool flowContextIsNamed; + final String? flowContextTypeCode; + final String? scriptStepContextParameterName; + final bool scriptStepContextIsNamed; + final String? scriptStepContextTypeCode; final List<_ValueParameterInfo> valueParameters; final String? returnTypeCode; final String? stepValueTypeCode; @@ -892,6 +1006,10 @@ class _WorkflowStepInfo { final DartObject? kind; final DartObject? taskNames; final DartObject? metadata; + + bool get acceptsFlowContext => flowContextParameterName != null; + + bool get acceptsScriptStepContext => scriptStepContextParameterName != null; } void _ensureUniqueWorkflowStepNames( @@ -959,7 +1077,7 @@ Future _diagnoseScriptCheckpointPatterns( for (final methodName in invocation.annotatedMethodCalls) { final step = stepsByMethod[methodName]; - if (step == null || step.acceptsScriptStepContext) { + if (step == null) { continue; } final wrapperName = invocation.stepName ?? 
''; @@ -993,7 +1111,8 @@ class _ManualScriptStepVisitor extends RecursiveAstVisitor { @override void visitMethodInvocation(MethodInvocation node) { - if (node.methodName.name == 'step' && node.argumentList.arguments.length >= 2) { + if (node.methodName.name == 'step' && + node.argumentList.arguments.length >= 2) { final nameArg = node.argumentList.arguments.first; final callbackArg = node.argumentList.arguments[1]; final callback = callbackArg is FunctionExpression ? callbackArg : null; @@ -1047,7 +1166,8 @@ class _TaskInfo { required this.importAlias, required this.function, required this.adapterName, - required this.acceptsTaskContext, + required this.taskContextParameterName, + required this.taskContextIsNamed, required this.valueParameters, required this.usesLegacyMapArgs, required this.resultTypeCode, @@ -1061,7 +1181,8 @@ class _TaskInfo { final String importAlias; final String function; final String? adapterName; - final bool acceptsTaskContext; + final String? taskContextParameterName; + final bool taskContextIsNamed; final List<_ValueParameterInfo> valueParameters; final bool usesLegacyMapArgs; final String resultTypeCode; @@ -1069,17 +1190,23 @@ class _TaskInfo { final DartObject? options; final DartObject? metadata; final bool runInIsolate; + + bool get acceptsTaskContext => taskContextParameterName != null; } class _FlowStepBinding { const _FlowStepBinding({ - required this.acceptsContext, + required this.contextParameterName, + required this.contextIsNamed, + required this.contextTypeCode, required this.valueParameters, required this.stepValueTypeCode, required this.stepValuePayloadCodecTypeCode, }); - final bool acceptsContext; + final String? contextParameterName; + final bool contextIsNamed; + final String? contextTypeCode; final List<_ValueParameterInfo> valueParameters; final String stepValueTypeCode; final String? 
stepValuePayloadCodecTypeCode; @@ -1087,13 +1214,15 @@ class _FlowStepBinding { class _RunBinding { const _RunBinding({ - required this.acceptsContext, + required this.contextParameterName, + required this.contextIsNamed, required this.valueParameters, required this.resultTypeCode, required this.resultPayloadCodecTypeCode, }); - final bool acceptsContext; + final String? contextParameterName; + final bool contextIsNamed; final List<_ValueParameterInfo> valueParameters; final String resultTypeCode; final String? resultPayloadCodecTypeCode; @@ -1101,14 +1230,18 @@ class _RunBinding { class _ScriptStepBinding { const _ScriptStepBinding({ - required this.acceptsContext, + required this.contextParameterName, + required this.contextIsNamed, + required this.contextTypeCode, required this.valueParameters, required this.returnTypeCode, required this.stepValueTypeCode, required this.stepValuePayloadCodecTypeCode, }); - final bool acceptsContext; + final String? contextParameterName; + final bool contextIsNamed; + final String? contextTypeCode; final List<_ValueParameterInfo> valueParameters; final String returnTypeCode; final String stepValueTypeCode; @@ -1117,14 +1250,16 @@ class _ScriptStepBinding { class _TaskBinding { const _TaskBinding({ - required this.acceptsContext, + required this.contextParameterName, + required this.contextIsNamed, required this.valueParameters, required this.usesLegacyMapArgs, required this.resultTypeCode, required this.resultPayloadCodecTypeCode, }); - final bool acceptsContext; + final String? contextParameterName; + final bool contextIsNamed; final List<_ValueParameterInfo> valueParameters; final bool usesLegacyMapArgs; final String resultTypeCode; @@ -1143,6 +1278,20 @@ class _ValueParameterInfo { final String? 
payloadCodecTypeCode; } +class _InjectedContextParameter { + const _InjectedContextParameter({ + required this.parameter, + required this.name, + required this.isNamed, + required this.typeCode, + }); + + final FormalParameterElement parameter; + final String name; + final bool isNamed; + final String typeCode; +} + class _RegistryEmitter { _RegistryEmitter({ required this.workflows, @@ -1240,42 +1389,17 @@ class _RegistryEmitter { if (payloadCodecSymbols.isEmpty) { return; } - buffer.writeln('Map _stemPayloadMap('); - buffer.writeln(' Object? value,'); - buffer.writeln(' String typeName,'); - buffer.writeln(') {'); - buffer.writeln(' if (value is Map) {'); - buffer.writeln(' return Map.from(value);'); - buffer.writeln(' }'); - buffer.writeln(' if (value is Map) {'); - buffer.writeln(' final result = {};'); - buffer.writeln(' value.forEach((key, entry) {'); - buffer.writeln(' if (key is! String) {'); - buffer.writeln( - r" throw StateError('$typeName payload must use string keys.');", - ); - buffer.writeln(' }'); - buffer.writeln(' result[key] = entry;'); - buffer.writeln(' });'); - buffer.writeln(' return result;'); - buffer.writeln(' }'); - buffer.writeln( - r" throw StateError('$typeName payload must decode to Map, got ${value.runtimeType}.');", - ); - buffer.writeln('}'); - buffer.writeln(); - buffer.writeln('abstract final class StemPayloadCodecs {'); for (final entry in payloadCodecSymbols.entries) { final typeCode = entry.key; final symbol = entry.value; buffer.writeln(' static final PayloadCodec<$typeCode> $symbol ='); - buffer.writeln(' PayloadCodec<$typeCode>('); - buffer.writeln(' encode: (value) => value.toJson(),'); + buffer.writeln(' PayloadCodec<$typeCode>.json('); buffer.writeln( - ' decode: (payload) => $typeCode.fromJson(' - ' _stemPayloadMap(payload, ${_string(typeCode)}),' - ' ),', + ' decode: $typeCode.fromJson,', + ); + buffer.writeln( + ' typeName: ${_string(typeCode)},', ); buffer.writeln(' );'); } @@ -1313,11 +1437,17 @@ class 
_RegistryEmitter { for (final step in workflow.steps) { final stepArgs = step.valueParameters .map((param) => _decodeArg('ctx.params', param)) - .join(', '); - final invocationArgs = [ - if (step.acceptsFlowContext) 'ctx', - if (stepArgs.isNotEmpty) stepArgs, - ].join(', '); + .toList(growable: false); + final invocationArgs = _invocationArgs( + positional: [ + if (step.acceptsFlowContext && !step.flowContextIsNamed) 'ctx', + ...stepArgs, + ], + named: { + if (step.acceptsFlowContext && step.flowContextIsNamed) + step.flowContextParameterName!: 'ctx', + }, + ); buffer.writeln(' flow.step<${step.stepValueTypeCode}>('); buffer.writeln(' ${_string(step.name)},'); buffer.writeln( @@ -1372,25 +1502,38 @@ class _RegistryEmitter { buffer.writeln(' $proxyClassName(this._script);'); buffer.writeln(' final WorkflowScriptContext _script;'); for (final step in workflow.steps) { - final signatureParts = [ - if (step.acceptsScriptStepContext) - 'WorkflowScriptStepContext context', - ...step.valueParameters.map( - (parameter) => '${parameter.typeCode} ${parameter.name}', - ), - ]; - final invocationArgs = [ - if (step.acceptsScriptStepContext) 'context', - ...step.valueParameters.map((parameter) => parameter.name), - ]; + final signature = _methodSignature( + positional: [ + if (step.acceptsScriptStepContext && !step.scriptStepContextIsNamed) + '${step.scriptStepContextTypeCode!} ${step.scriptStepContextParameterName!}', + ...step.valueParameters.map( + (parameter) => '${parameter.typeCode} ${parameter.name}', + ), + ], + named: [ + if (step.acceptsScriptStepContext && step.scriptStepContextIsNamed) + '${step.scriptStepContextTypeCode!} ${step.scriptStepContextParameterName!}', + ], + ); + final invocationArgs = _invocationArgs( + positional: [ + if (step.acceptsScriptStepContext && !step.scriptStepContextIsNamed) + 'context', + ...step.valueParameters.map((parameter) => parameter.name), + ], + named: { + if (step.acceptsScriptStepContext && step.scriptStepContextIsNamed) + 
step.scriptStepContextParameterName!: 'context', + }, + ); buffer.writeln(' @override'); buffer.writeln( - ' ${step.returnTypeCode} ${step.method}(${signatureParts.join(', ')}) {', + ' ${step.returnTypeCode} ${step.method}($signature) {', ); buffer.writeln(' return _script.step<${step.stepValueTypeCode}>('); buffer.writeln(' ${_string(step.name)},'); buffer.writeln( - ' (context) => super.${step.method}(${invocationArgs.join(', ')}),', + ' (context) => super.${step.method}($invocationArgs),', ); if (step.autoVersion) { buffer.writeln(' autoVersion: true,'); @@ -1413,14 +1556,13 @@ class _RegistryEmitter { buffer.writeln(' checkpoints: ['); for (final step in workflow.steps) { if (step.stepValuePayloadCodecTypeCode != null) { - buffer.writeln(' FlowStep.typed<${step.stepValueTypeCode}>('); + buffer.writeln( + ' WorkflowCheckpoint.typed<${step.stepValueTypeCode}>(', + ); } else { - buffer.writeln(' FlowStep('); + buffer.writeln(' WorkflowCheckpoint('); } buffer.writeln(' name: ${_string(step.name)},'); - buffer.writeln( - ' handler: _stemScriptManifestStepNoop,', - ); if (step.autoVersion) { buffer.writeln(' autoVersion: true,'); } @@ -1468,22 +1610,40 @@ class _RegistryEmitter { buffer.writeln(' resultCodec: StemPayloadCodecs.$codecField,'); } if (proxyClass != null) { - final runArgs = [ - if (workflow.runAcceptsScriptContext) 'script', - ...workflow.runValueParameters.map( - (parameter) => _decodeArg('script.params', parameter), - ), - ].join(', '); + final runArgs = _invocationArgs( + positional: [ + if (workflow.runContextParameterName != null && + !workflow.runContextIsNamed) + 'script', + ...workflow.runValueParameters.map( + (parameter) => _decodeArg('script.params', parameter), + ), + ], + named: { + if (workflow.runContextParameterName != null && + workflow.runContextIsNamed) + workflow.runContextParameterName!: 'script', + }, + ); buffer.writeln( ' run: (script) => $proxyClass(script).${workflow.runMethod}($runArgs),', ); } else { - final runArgs = [ - if 
(workflow.runAcceptsScriptContext) 'script', - ...workflow.runValueParameters.map( - (parameter) => _decodeArg('script.params', parameter), - ), - ].join(', '); + final runArgs = _invocationArgs( + positional: [ + if (workflow.runContextParameterName != null && + !workflow.runContextIsNamed) + 'script', + ...workflow.runValueParameters.map( + (parameter) => _decodeArg('script.params', parameter), + ), + ], + named: { + if (workflow.runContextParameterName != null && + workflow.runContextIsNamed) + workflow.runContextParameterName!: 'script', + }, + ); buffer.writeln( ' run: (script) => ${_qualify(workflow.importAlias, workflow.className)}().${workflow.runMethod}($runArgs),', ); @@ -1498,37 +1658,45 @@ class _RegistryEmitter { if (workflows.isEmpty) { return; } + final symbolNames = _symbolNamesForWorkflows(workflows); final fieldNames = _fieldNamesForWorkflows( workflows, - _symbolNamesForWorkflows(workflows), + symbolNames, ); buffer.writeln('abstract final class StemWorkflowDefinitions {'); for (final workflow in workflows) { final fieldName = fieldNames[workflow]!; final argsTypeCode = _workflowArgsTypeCode(workflow); - buffer.writeln( - ' static final WorkflowRef<$argsTypeCode, ${workflow.resultTypeCode}> ' - '$fieldName = WorkflowRef<$argsTypeCode, ${workflow.resultTypeCode}>(', - ); + final valueParameters = workflow.kind == WorkflowKind.script + ? workflow.runValueParameters + : workflow.steps.first.valueParameters; + final usesNoArgsDefinition = valueParameters.isEmpty; + final singleParameter = _singleValueParameter(valueParameters); + final refType = usesNoArgsDefinition + ? 'NoArgsWorkflowRef<${workflow.resultTypeCode}>' + : 'WorkflowRef<$argsTypeCode, ${workflow.resultTypeCode}>'; + final constructorType = usesNoArgsDefinition + ? 
'NoArgsWorkflowRef<${workflow.resultTypeCode}>' + : 'WorkflowRef<$argsTypeCode, ${workflow.resultTypeCode}>'; + buffer.writeln(' static final $refType $fieldName = $constructorType('); buffer.writeln(' name: ${_string(workflow.name)},'); - if (workflow.kind == WorkflowKind.script) { - if (workflow.runValueParameters.isEmpty) { + if (!usesNoArgsDefinition) { + buffer.writeln(' encodeParams: (params) => {'); + if (singleParameter != null) { buffer.writeln( - ' encodeParams: (_) => const {},', + ' ${_string(singleParameter.name)}: ' + '${_encodeValueExpression('params', singleParameter)},', ); } else { - buffer.writeln(' encodeParams: (params) => {'); - for (final parameter in workflow.runValueParameters) { + for (final parameter in valueParameters) { buffer.writeln( ' ${_string(parameter.name)}: ' '${_encodeValueExpression('params.${parameter.name}', parameter)},', ); } - buffer.writeln(' },'); } - } else { - buffer.writeln(' encodeParams: (params) => params,'); + buffer.writeln(' },'); } if (workflow.resultPayloadCodecTypeCode != null) { final codecField = @@ -1714,14 +1882,39 @@ class _RegistryEmitter { return _lowerCamel(pascal); } + String _invocationArgs({ + List positional = const [], + Map named = const {}, + }) { + final parts = [ + ...positional.where((part) => part.isNotEmpty), + ...named.entries + .where((entry) => entry.value.isNotEmpty) + .map((entry) => '${entry.key}: ${entry.value}'), + ]; + return parts.join(', '); + } + + String _methodSignature({ + List positional = const [], + List named = const [], + }) { + final parts = [ + ...positional.where((part) => part.isNotEmpty), + ]; + if (named.isNotEmpty) { + parts.add('{${named.join(', ')}}'); + } + return parts.join(', '); + } + void _emitTasks(StringBuffer buffer) { buffer.writeln( 'final List> _stemTasks = >[', ); for (final task in tasks) { - final entrypoint = task.usesLegacyMapArgs - ? _qualify(task.importAlias, task.function) - : task.adapterName!; + final entrypoint = + task.adapterName ?? 
_qualify(task.importAlias, task.function); final metadataCode = _taskMetadataCode(task); buffer.writeln(' FunctionTaskHandler('); buffer.writeln(' name: ${_string(task.name)},'); @@ -1750,26 +1943,42 @@ class _RegistryEmitter { for (final task in tasks) { final symbol = _lowerCamel(symbolNames[task]!); final argsTypeCode = _taskArgsTypeCode(task); - buffer.writeln( - ' static final TaskDefinition<$argsTypeCode, ${task.resultTypeCode}> $symbol = TaskDefinition<$argsTypeCode, ${task.resultTypeCode}>(', - ); + final usesNoArgsDefinition = + !task.usesLegacyMapArgs && task.valueParameters.isEmpty; + final singleParameter = _singleValueParameter(task.valueParameters); + if (usesNoArgsDefinition) { + buffer.writeln( + ' static final NoArgsTaskDefinition<${task.resultTypeCode}> $symbol = NoArgsTaskDefinition<${task.resultTypeCode}>(', + ); + } else { + buffer.writeln( + ' static final TaskDefinition<$argsTypeCode, ${task.resultTypeCode}> $symbol = TaskDefinition<$argsTypeCode, ${task.resultTypeCode}>(', + ); + } buffer.writeln(' name: ${_string(task.name)},'); if (task.usesLegacyMapArgs) { buffer.writeln(' encodeArgs: (args) => args,'); - } else if (task.valueParameters.isEmpty) { - buffer.writeln(' encodeArgs: (args) => const {},'); - } else { + } else if (task.valueParameters.isNotEmpty) { buffer.writeln(' encodeArgs: (args) => {'); - for (final parameter in task.valueParameters) { + if (singleParameter != null) { buffer.writeln( - ' ${_string(parameter.name)}: ' - '${_encodeValueExpression('args.${parameter.name}', parameter)},', + ' ${_string(singleParameter.name)}: ' + '${_encodeValueExpression('args', singleParameter)},', ); + } else { + for (final parameter in task.valueParameters) { + buffer.writeln( + ' ${_string(parameter.name)}: ' + '${_encodeValueExpression('args.${parameter.name}', parameter)},', + ); + } } buffer.writeln(' },'); } if (task.options != null) { - buffer.writeln(' defaultOptions: ${_dartObjectToCode(task.options!)},'); + buffer.writeln( + ' 
defaultOptions: ${_dartObjectToCode(task.options!)},', + ); } if (task.metadata != null) { buffer.writeln(' metadata: ${_dartObjectToCode(task.metadata!)},'); @@ -1785,81 +1994,30 @@ class _RegistryEmitter { } buffer.writeln('}'); buffer.writeln(); - - buffer.writeln('extension StemGeneratedTaskEnqueuer on TaskEnqueuer {'); - for (final task in tasks) { - final symbol = symbolNames[task]!; - final fieldName = _lowerCamel(symbol); - buffer.writeln(' Future enqueue$symbol({'); - if (task.usesLegacyMapArgs) { - buffer.writeln(' required Map args,'); - } else { - for (final parameter in task.valueParameters) { - buffer.writeln( - ' required ${parameter.typeCode} ${parameter.name},', - ); - } - } - buffer.writeln(' Map headers = const {},'); - buffer.writeln(' TaskOptions? options,'); - buffer.writeln(' DateTime? notBefore,'); - buffer.writeln(' Map? meta,'); - buffer.writeln(' TaskEnqueueOptions? enqueueOptions,'); - buffer.writeln(' }) {'); - final callArgs = task.usesLegacyMapArgs - ? 'args' - : task.valueParameters.isEmpty - ? '()' - : '(${task.valueParameters.map((parameter) => '${parameter.name}: ${parameter.name}').join(', ')})'; - buffer.writeln(' return enqueueCall('); - buffer.writeln(' StemTaskDefinitions.$fieldName.call('); - buffer.writeln(' $callArgs,'); - buffer.writeln(' headers: headers,'); - buffer.writeln(' options: options,'); - buffer.writeln(' notBefore: notBefore,'); - buffer.writeln(' meta: meta,'); - buffer.writeln(' enqueueOptions: enqueueOptions,'); - buffer.writeln(' ),'); - buffer.writeln(' );'); - buffer.writeln(' }'); - buffer.writeln(); - } - buffer.writeln('}'); - buffer.writeln(); - - buffer.writeln('extension StemGeneratedTaskResults on Stem {'); - for (final task in tasks) { - final symbol = symbolNames[task]!; - final fieldName = _lowerCamel(symbol); - buffer.writeln( - ' Future?> waitFor$symbol(', - ); - buffer.writeln(' String taskId, {'); - buffer.writeln(' Duration? 
timeout,'); - buffer.writeln(' }) {'); - buffer.writeln(' return waitForTaskDefinition('); - buffer.writeln(' taskId,'); - buffer.writeln(' StemTaskDefinitions.$fieldName,'); - buffer.writeln(' timeout: timeout,'); - buffer.writeln(' );'); - buffer.writeln(' }'); - buffer.writeln(); - } - buffer.writeln('}'); - buffer.writeln(); } void _emitTaskAdapters(StringBuffer buffer) { - final typedTasks = tasks.where((task) => !task.usesLegacyMapArgs).toList(); - if (typedTasks.isEmpty) { + final adaptedTasks = tasks + .where((task) => task.adapterName != null) + .toList(growable: false); + if (adaptedTasks.isEmpty) { return; } - for (final task in typedTasks) { + for (final task in adaptedTasks) { final adapterName = task.adapterName!; - final callArgs = [ - if (task.acceptsTaskContext) 'context', - ...task.valueParameters.map((param) => _decodeArg('args', param)), - ].join(', '); + final callArgs = _invocationArgs( + positional: [ + if (task.acceptsTaskContext && !task.taskContextIsNamed) 'context', + if (task.usesLegacyMapArgs) + 'args' + else + ...task.valueParameters.map((param) => _decodeArg('args', param)), + ], + named: { + if (task.acceptsTaskContext && task.taskContextIsNamed) + task.taskContextParameterName!: 'context', + }, + ); buffer.writeln( 'Future $adapterName(TaskInvocationContext context, Map args) async {', ); @@ -1872,17 +2030,6 @@ class _RegistryEmitter { } void _emitGeneratedHelpers(StringBuffer buffer) { - final needsScriptStepNoop = workflows.any( - (workflow) => - workflow.kind == WorkflowKind.script && workflow.steps.isNotEmpty, - ); - if (needsScriptStepNoop) { - buffer.writeln( - 'Future _stemScriptManifestStepNoop(FlowContext context) async => null;', - ); - buffer.writeln(); - } - final needsArgHelper = tasks.any((task) => !task.usesLegacyMapArgs) || workflows.any( @@ -2004,7 +2151,10 @@ class _RegistryEmitter { 'as ${parameter.typeCode})'; } - String _encodeValueExpression(String expression, _ValueParameterInfo parameter) { + String 
_encodeValueExpression( + String expression, + _ValueParameterInfo parameter, + ) { final codecTypeCode = parameter.payloadCodecTypeCode; if (codecTypeCode == null) { return expression; @@ -2022,6 +2172,10 @@ class _RegistryEmitter { if (task.valueParameters.isEmpty) { return '()'; } + final singleParameter = _singleValueParameter(task.valueParameters); + if (singleParameter != null) { + return singleParameter.typeCode; + } final fields = task.valueParameters .map((parameter) => '${parameter.typeCode} ${parameter.name}') .join(', '); @@ -2029,18 +2183,31 @@ class _RegistryEmitter { } String _workflowArgsTypeCode(_WorkflowInfo workflow) { - if (workflow.kind != WorkflowKind.script) { - return 'Map'; - } - if (workflow.runValueParameters.isEmpty) { + final parameters = workflow.kind == WorkflowKind.script + ? workflow.runValueParameters + : workflow.steps.first.valueParameters; + if (parameters.isEmpty) { return '()'; } - final fields = workflow.runValueParameters + final singleParameter = _singleValueParameter(parameters); + if (singleParameter != null) { + return singleParameter.typeCode; + } + final fields = parameters .map((parameter) => '${parameter.typeCode} ${parameter.name}') .join(', '); return '({$fields})'; } + _ValueParameterInfo? _singleValueParameter( + List<_ValueParameterInfo> parameters, + ) { + if (parameters.length != 1) { + return null; + } + return parameters.single; + } + String _qualify(String alias, String symbol) { if (alias.isEmpty) return symbol; return '$alias.$symbol'; diff --git a/packages/stem_builder/pubspec.yaml b/packages/stem_builder/pubspec.yaml index 0868c198..43b5b059 100644 --- a/packages/stem_builder/pubspec.yaml +++ b/packages/stem_builder/pubspec.yaml @@ -1,6 +1,6 @@ name: stem_builder description: Build-time registry generator for annotated Stem workflows and tasks. 
-version: 0.1.0 +version: 0.2.0 repository: https://github.com/kingwill101/stem resolution: workspace @@ -13,7 +13,7 @@ dependencies: dart_style: ^3.1.4 glob: ^2.1.3 source_gen: ^4.1.2 - stem: ^0.1.1 + stem: ^0.2.0 dev_dependencies: build_runner: ^2.10.5 diff --git a/packages/stem_builder/test/stem_registry_builder_test.dart b/packages/stem_builder/test/stem_registry_builder_test.dart index 4ec8967f..4f2cb112 100644 --- a/packages/stem_builder/test/stem_registry_builder_test.dart +++ b/packages/stem_builder/test/stem_registry_builder_test.dart @@ -6,15 +6,31 @@ import 'package:test/test.dart'; const stubStem = ''' library stem; -class FlowContext {} +class WorkflowExecutionContext {} +class FlowContext implements WorkflowExecutionContext {} typedef _FlowStepHandler = Future Function(FlowContext context); enum WorkflowStepKind { task, choice, parallel, wait, custom } class PayloadCodec { - const PayloadCodec({required this.encode, required this.decode}); + const PayloadCodec({required this.encode, required this.decode}) + : typeName = null; + const PayloadCodec.map({ + required this.encode, + required T Function(Map payload) decode, + this.typeName, + }) : decode = _unsupportedDecode; + const PayloadCodec.json({ + required T Function(Map payload) decode, + this.typeName, + }) : encode = _unsupportedEncode, + decode = _unsupportedDecode; final Object? Function(T value) encode; final T Function(Object? payload) decode; + final String? typeName; + + static Object? _unsupportedEncode(T value) => throw UnimplementedError(); + static T _unsupportedDecode(Object? payload) => throw UnimplementedError(); } class FlowStep { @@ -37,6 +53,24 @@ class FlowStep { final List taskNames; final Map? 
metadata; } +class WorkflowCheckpoint { + WorkflowCheckpoint({ + required this.name, + this.autoVersion = false, + this.valueCodec, + this.title, + this.kind = WorkflowStepKind.task, + this.taskNames = const [], + this.metadata, + }); + final String name; + final bool autoVersion; + final PayloadCodec? valueCodec; + final String? title; + final WorkflowStepKind kind; + final List taskNames; + final Map? metadata; +} class WorkflowScriptContext { Future step( String name, @@ -44,8 +78,23 @@ class WorkflowScriptContext { bool autoVersion = false, }) async => throw UnimplementedError(); } -class WorkflowScriptStepContext {} -class TaskInvocationContext {} +class WorkflowScriptStepContext implements WorkflowExecutionContext {} +abstract class TaskExecutionContext {} +class TaskInvocationContext implements TaskExecutionContext {} + +class NoArgsTaskDefinition { + const NoArgsTaskDefinition({ + required this.name, + this.defaultOptions = const TaskOptions(), + this.metadata = const TaskMetadata(), + this.decodeResult, + }); + + final String name; + final TaskOptions defaultOptions; + final TaskMetadata metadata; + final T Function(Object? payload)? decodeResult; +} class TaskOptions { const TaskOptions({this.maxRetries = 0}); @@ -107,8 +156,7 @@ class WorkflowScript { WorkflowScript({ required String name, required dynamic run, - List steps = const [], - List checkpoints = const [], + List checkpoints = const [], PayloadCodec? 
resultCodec, }); } @@ -175,23 +223,23 @@ Future sendEmail( AssetId('stem', 'lib/stem.dart'), stubStem, ), - outputs: { - 'stem_builder|lib/workflows.stem.g.dart': decodedMatches( - allOf([ - contains('StemWorkflowDefinitions'), - contains('StemTaskDefinitions'), - contains('StemGeneratedTaskEnqueuer'), - contains('StemGeneratedTaskResults'), - contains('waitForSendEmail('), - contains('WorkflowRef, String>'), - contains('Flow('), - contains('WorkflowScript('), - contains('stemModule = StemModule('), - contains('FunctionTaskHandler'), - contains("part of 'workflows.dart';"), - ]), - ), - }, + outputs: { + 'stem_builder|lib/workflows.stem.g.dart': decodedMatches( + allOf([ + contains('StemWorkflowDefinitions'), + contains('StemTaskDefinitions'), + contains('NoArgsWorkflowRef'), + contains('Flow('), + contains('WorkflowScript('), + contains('stemModule = StemModule('), + contains('FunctionTaskHandler'), + contains("part of 'workflows.dart';"), + isNot(contains('StemGeneratedTaskEnqueuer')), + isNot(contains('StemGeneratedTaskResults')), + isNot(contains('waitForSendEmail(')), + ]), + ), + }, ); }); @@ -265,11 +313,11 @@ class DailyBillingWorkflow { 'stem_builder|lib/workflows.stem.g.dart': decodedMatches( allOf([ contains( - 'static final WorkflowRef, Object?> ' + 'static final NoArgsWorkflowRef ' 'helloFlow =', ), contains( - 'static final WorkflowRef<({String tenant}), Object?> ' + 'static final WorkflowRef ' 'dailyBilling =', ), ]), @@ -279,6 +327,192 @@ class DailyBillingWorkflow { }, ); + test('uses NoArgsWorkflowRef for zero-argument script workflows', () async { + const input = ''' +import 'package:stem/stem.dart'; + +part 'workflows.stem.g.dart'; + +@WorkflowDefn(kind: WorkflowKind.script) +class HelloScriptWorkflow { + @WorkflowRun() + Future run() async => 'done'; +} +'''; + + await testBuilder( + stemRegistryBuilder(BuilderOptions.empty), + {'stem_builder|lib/workflows.dart': input}, + rootPackage: 'stem_builder', + readerWriter: 
TestReaderWriter(rootPackage: 'stem_builder') + ..testing.writeString( + AssetId('stem', 'lib/stem.dart'), + stubStem, + ), + outputs: { + 'stem_builder|lib/workflows.stem.g.dart': decodedMatches( + allOf([ + contains( + 'static final NoArgsWorkflowRef ' + 'helloScriptWorkflow =', + ), + contains('NoArgsWorkflowRef('), + isNot(contains('startHelloScriptWorkflow(')), + isNot(contains('startAndWaitHelloScriptWorkflow(')), + isNot(contains('waitForHelloScriptWorkflow(')), + ]), + ), + }, + ); + }); + + test('uses NoArgsTaskDefinition for zero-argument tasks', () async { + const input = ''' +import 'package:stem/stem.dart'; + +part 'workflows.stem.g.dart'; + +@TaskDefn(name: 'ping.task') +Future pingTask() async => 'pong'; +'''; + + await testBuilder( + stemRegistryBuilder(BuilderOptions.empty), + {'stem_builder|lib/workflows.dart': input}, + rootPackage: 'stem_builder', + readerWriter: TestReaderWriter(rootPackage: 'stem_builder') + ..testing.writeString( + AssetId('stem', 'lib/stem.dart'), + stubStem, + ), + outputs: { + 'stem_builder|lib/workflows.stem.g.dart': decodedMatches( + allOf([ + contains('static final NoArgsTaskDefinition pingTask ='), + contains('NoArgsTaskDefinition('), + isNot(contains('enqueuePingTask(')), + isNot(contains('enqueueAndWaitPingTask(')), + isNot(contains('encodeArgs: (args) => const {}')), + ]), + ), + }, + ); + }); + + test('generates direct helpers for typed annotated tasks', () async { + const input = ''' +import 'package:stem/stem.dart'; + +part 'workflows.stem.g.dart'; + +class EmailRequest { + const EmailRequest({required this.email}); + final String email; + Map toJson() => {'email': email}; + factory EmailRequest.fromJson(Map json) => + EmailRequest(email: json['email'] as String); +} + +@TaskDefn(name: 'email.send') +Future sendEmail(EmailRequest request) async => request.email; +'''; + + await testBuilder( + stemRegistryBuilder(BuilderOptions.empty), + {'stem_builder|lib/workflows.dart': input}, + rootPackage: 'stem_builder', + 
+      readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
+        ..testing.writeString(
+          AssetId('stem', 'lib/stem.dart'),
+          stubStem,
+        ),
+      outputs: {
+        'stem_builder|lib/workflows.stem.g.dart': decodedMatches(
+          allOf([
+            contains(
+              'static final TaskDefinition emailSend =',
+            ),
+            isNot(contains('enqueueEmailSend(')),
+            isNot(contains('enqueueAndWaitEmailSend(')),
+          ]),
+        ),
+      },
+    );
+  });
+
+  test('generates direct helpers for typed workflows', () async {
+    const input = '''
+import 'package:stem/stem.dart';
+
+part 'workflows.stem.g.dart';
+
+@WorkflowDefn(kind: WorkflowKind.script)
+class SignupWorkflow {
+  Future<String> run(String email) async => email;
+}
+''';
+
+    await testBuilder(
+      stemRegistryBuilder(BuilderOptions.empty),
+      {'stem_builder|lib/workflows.dart': input},
+      rootPackage: 'stem_builder',
+      readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
+        ..testing.writeString(
+          AssetId('stem', 'lib/stem.dart'),
+          stubStem,
+        ),
+      outputs: {
+        'stem_builder|lib/workflows.stem.g.dart': decodedMatches(
+          allOf([
+            contains(
+              'static final WorkflowRef signupWorkflow =',
+            ),
+            isNot(contains('startSignupWorkflow(')),
+            isNot(contains('startAndWaitSignupWorkflow(')),
+            isNot(contains('waitForSignupWorkflow(')),
+          ]),
+        ),
+      },
+    );
+  });
+
+  test('generates typed workflow refs for annotated flows', () async {
+    const input = r'''
+import 'package:stem/stem.dart';
+
+part 'workflows.stem.g.dart';
+
+@WorkflowDefn()
+class GreetingFlow {
+  @WorkflowStep()
+  Future<String> greet(String name) async => 'hello \$name';
+}
+''';
+
+    await testBuilder(
+      stemRegistryBuilder(BuilderOptions.empty),
+      {'stem_builder|lib/workflows.dart': input},
+      rootPackage: 'stem_builder',
+      readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
+        ..testing.writeString(
+          AssetId('stem', 'lib/stem.dart'),
+          stubStem,
+        ),
+      outputs: {
+        'stem_builder|lib/workflows.stem.g.dart': decodedMatches(
+          allOf([
+            contains(
+              'static final WorkflowRef greetingFlow =',
+            ),
+            isNot(contains('startGreetingFlow(')),
+            isNot(contains('startAndWaitGreetingFlow(')),
+            isNot(contains('waitForGreetingFlow(')),
+          ]),
+        ),
+      },
+    );
+  });
+
   test(
     'generates script workflow step proxies for direct method calls',
     () async {
@@ -441,7 +675,7 @@ class DuplicateCheckpointWorkflow {
   test(
     'rejects manual checkpoint names that conflict with annotated ones',
     () async {
-    const input = '''
+      const input = '''
 import 'package:stem/stem.dart';
 
 part 'workflows.stem.g.dart';
@@ -458,23 +692,127 @@ class DuplicateManualCheckpointWorkflow {
 }
 ''';
 
-    final result = await testBuilder(
-      stemRegistryBuilder(BuilderOptions.empty),
-      {'stem_builder|lib/workflows.dart': input},
-      rootPackage: 'stem_builder',
-      readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
-        ..testing.writeString(
-          AssetId('stem', 'lib/stem.dart'),
-          stubStem,
+      final result = await testBuilder(
+        stemRegistryBuilder(BuilderOptions.empty),
+        {'stem_builder|lib/workflows.dart': input},
+        rootPackage: 'stem_builder',
+        readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
+          ..testing.writeString(
+            AssetId('stem', 'lib/stem.dart'),
+            stubStem,
+          ),
+      );
+      expect(result.succeeded, isFalse);
+      expect(result.errors.join('\n'), contains('manual checkpoint'));
+      expect(
+        result.errors.join('\n'),
+        contains('conflicts with annotated checkpoint'),
+      );
+    },
+  );
+
+  test(
+    'warns when manual script.step wraps an annotated checkpoint call',
+    () async {
+      const input = '''
+import 'package:stem/stem.dart';
+
+part 'workflows.stem.g.dart';
+
+@WorkflowDefn(kind: WorkflowKind.script)
+class MixedCheckpointWorkflow {
+  @WorkflowRun()
+  Future<void> run(WorkflowScriptContext script) async {
+    await script.step('outer-wrapper', (ctx) => sendEmail('user@example.com'));
+  }
+
+  @WorkflowStep(name: 'send-email')
+  Future<void> sendEmail(String email) async {}
+}
+''';
+
+      final records = <LogRecord>[];
+      await testBuilders(
+        [stemRegistryBuilder(BuilderOptions.empty)],
+        {'stem_builder|lib/workflows.dart': input},
+        rootPackage: 'stem_builder',
+        onLog: records.add,
+        readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
+          ..testing.writeString(
+            AssetId('stem', 'lib/stem.dart'),
+            stubStem,
+          ),
+      );
+
+      expect(
+        records,
+        contains(
+          warningLogOf(
+            allOf([
+              contains('wraps annotated checkpoint "send-email"'),
+              contains('outer-wrapper'),
+              contains('avoid nested checkpoints'),
+            ]),
+          ),
         ),
+      );
+    },
+  );
+
+  test(
+    'warns when manual script.step wraps a context-aware annotated '
+    'checkpoint call',
+    () async {
+      const input = '''
+import 'package:stem/stem.dart';
+
+part 'workflows.stem.g.dart';
+
+@WorkflowDefn(kind: WorkflowKind.script)
+class MixedContextCheckpointWorkflow {
+  @WorkflowRun()
+  Future<void> run(WorkflowScriptContext script) async {
+    await script.step(
+      'outer-wrapper',
+      (ctx) => capture('user@example.com', context: ctx),
     );
-    expect(result.succeeded, isFalse);
-    expect(result.errors.join('\n'), contains('manual checkpoint'));
-    expect(
-      result.errors.join('\n'),
-      contains('conflicts with annotated checkpoint'),
-    );
-  });
+  }
+
+  @WorkflowStep(name: 'capture')
+  Future<void> capture(
+    String email, {
+    WorkflowScriptStepContext?
+        context,
+  }) async {}
+}
+''';
+
+      final records = <LogRecord>[];
+      await testBuilders(
+        [stemRegistryBuilder(BuilderOptions.empty)],
+        {'stem_builder|lib/workflows.dart': input},
+        rootPackage: 'stem_builder',
+        onLog: records.add,
+        readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
+          ..testing.writeString(
+            AssetId('stem', 'lib/stem.dart'),
+            stubStem,
+          ),
+      );
+
+      expect(
+        records,
+        contains(
+          warningLogOf(
+            allOf([
+              contains('wraps annotated checkpoint "capture"'),
+              contains('outer-wrapper'),
+              contains('avoid nested checkpoints'),
+            ]),
+          ),
+        ),
+      );
+    },
+  );
 
   test(
     'decodes serializable @workflow.run parameters from script params',
@@ -518,8 +856,8 @@ class SignupWorkflow {
           contains('_stemRequireArg(script.params, "email") as String'),
           contains('abstract final class StemWorkflowDefinitions'),
           contains(
-            'signupWorkflow = WorkflowRef<({String email}), '
-            'Map>(',
+            'static final WorkflowRef> '
+            'signupWorkflow =',
           ),
           isNot(contains('extraParams')),
         ]),
@@ -573,6 +911,232 @@ class SignupWorkflow {
     },
   );
 
+  test(
+    'supports optional named WorkflowScriptContext injection',
+    () async {
+      const input = '''
+import 'package:stem/stem.dart';
+
+part 'workflows.stem.g.dart';
+
+@WorkflowDefn(kind: WorkflowKind.script)
+class SignupWorkflow {
+  Future<void> run(String email, {WorkflowScriptContext?
+      context}) async {
+    await sendWelcomeEmail(email);
+  }
+
+  @WorkflowStep()
+  Future<void> sendWelcomeEmail(String email) async {}
+}
+''';
+
+      await testBuilder(
+        stemRegistryBuilder(BuilderOptions.empty),
+        {'stem_builder|lib/workflows.dart': input},
+        rootPackage: 'stem_builder',
+        readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
+          ..testing.writeString(
+            AssetId('stem', 'lib/stem.dart'),
+            stubStem,
+          ),
+        outputs: {
+          'stem_builder|lib/workflows.stem.g.dart': decodedMatches(
+            allOf([
+              contains(
+                ').run(('
+                '_stemRequireArg(script.params, "email") as String), '
+                'context: script)',
+              ),
+            ]),
+          ),
+        },
+      );
+    },
+  );
+
+  test(
+    'supports optional named WorkflowScriptStepContext injection',
+    () async {
+      const input = '''
+import 'package:stem/stem.dart';
+
+part 'workflows.stem.g.dart';
+
+@WorkflowDefn(kind: WorkflowKind.script)
+class SignupWorkflow {
+  Future<String> run(String email) async => sendWelcomeEmail(email);
+
+  @WorkflowStep()
+  Future<String> sendWelcomeEmail(
+    String email, {
+    WorkflowScriptStepContext? context,
+  }) async => email;
+}
+''';
+
+      await testBuilder(
+        stemRegistryBuilder(BuilderOptions.empty),
+        {'stem_builder|lib/workflows.dart': input},
+        rootPackage: 'stem_builder',
+        readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
+          ..testing.writeString(
+            AssetId('stem', 'lib/stem.dart'),
+            stubStem,
+          ),
+        outputs: {
+          'stem_builder|lib/workflows.stem.g.dart': decodedMatches(
+            allOf([
+              contains(
+                '(context) => super.sendWelcomeEmail(email, context: context)',
+              ),
+              contains('WorkflowScriptStepContext?
context'),
+            ]),
+          ),
+        },
+      );
+    },
+  );
+
+  test(
+    'supports optional named WorkflowExecutionContext injection '
+    'in script checkpoints',
+    () async {
+      const input = '''
+import 'package:stem/stem.dart';
+
+part 'workflows.stem.g.dart';
+
+@WorkflowDefn(kind: WorkflowKind.script)
+class SignupWorkflow {
+  Future<String> run(String email) async => sendWelcomeEmail(email);
+
+  @WorkflowStep()
+  Future<String> sendWelcomeEmail(
+    String email, {
+    WorkflowExecutionContext? context,
+  }) async => email;
+}
+''';
+
+      await testBuilder(
+        stemRegistryBuilder(BuilderOptions.empty),
+        {'stem_builder|lib/workflows.dart': input},
+        rootPackage: 'stem_builder',
+        readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
+          ..testing.writeString(
+            AssetId('stem', 'lib/stem.dart'),
+            stubStem,
+          ),
+        outputs: {
+          'stem_builder|lib/workflows.stem.g.dart': decodedMatches(
+            allOf([
+              contains(
+                '(context) => super.sendWelcomeEmail(email, context: context)',
+              ),
+              contains('WorkflowExecutionContext? context'),
+            ]),
+          ),
+        },
+      );
+    },
+  );
+
+  test('supports optional named FlowContext injection', () async {
+    const input = '''
+import 'package:stem/stem.dart';
+
+part 'workflows.stem.g.dart';
+
+@WorkflowDefn(name: 'hello.flow')
+class HelloWorkflow {
+  @WorkflowStep(name: 'step-1')
+  Future<String> stepOne({FlowContext?
+      context}) async => 'ok';
+}
+''';
+
+    await testBuilder(
+      stemRegistryBuilder(BuilderOptions.empty),
+      {'stem_builder|lib/workflows.dart': input},
+      rootPackage: 'stem_builder',
+      readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
+        ..testing.writeString(
+          AssetId('stem', 'lib/stem.dart'),
+          stubStem,
+        ),
+      outputs: {
+        'stem_builder|lib/workflows.stem.g.dart': decodedMatches(
+          contains('(ctx) => impl.stepOne(context: ctx)'),
+        ),
+      },
+    );
+  });
+
+  test(
+    'supports optional named WorkflowExecutionContext injection in flow steps',
+    () async {
+      const input = '''
+import 'package:stem/stem.dart';
+
+part 'workflows.stem.g.dart';
+
+@WorkflowDefn(name: 'hello.flow')
+class HelloWorkflow {
+  @WorkflowStep(name: 'step-1')
+  Future<String> stepOne({WorkflowExecutionContext? context}) async => 'ok';
+}
+''';
+
+      await testBuilder(
+        stemRegistryBuilder(BuilderOptions.empty),
+        {'stem_builder|lib/workflows.dart': input},
+        rootPackage: 'stem_builder',
+        readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
+          ..testing.writeString(
+            AssetId('stem', 'lib/stem.dart'),
+            stubStem,
+          ),
+        outputs: {
+          'stem_builder|lib/workflows.stem.g.dart': decodedMatches(
+            contains('(ctx) => impl.stepOne(context: ctx)'),
+          ),
+        },
+      );
+    },
+  );
+
+  test('supports optional named TaskExecutionContext injection', () async {
+    const input = '''
+import 'package:stem/stem.dart';
+
+part 'workflows.stem.g.dart';
+
+@TaskDefn(name: 'typed.task')
+Future<void> typedTask(
+  String email, {
+  TaskExecutionContext?
+      context,
+}) async {}
+''';
+
+    await testBuilder(
+      stemRegistryBuilder(BuilderOptions.empty),
+      {'stem_builder|lib/workflows.dart': input},
+      rootPackage: 'stem_builder',
+      readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
+        ..testing.writeString(
+          AssetId('stem', 'lib/stem.dart'),
+          stubStem,
+        ),
+      outputs: {
+        'stem_builder|lib/workflows.stem.g.dart': decodedMatches(
+          allOf([
+            contains('typedTask((_stemRequireArg(args, "email") as String),'),
+            contains('context: context'),
+          ]),
+        ),
+      },
+    );
+  });
+
   test('rejects non-serializable @workflow.run parameter types', () async {
     const input = '''
 import 'package:stem/stem.dart';
@@ -680,7 +1244,7 @@ Future typedTask(
   test(
     'generates codec-backed DTO helpers for workflow and task types',
     () async {
-    const input = '''
+      const input = '''
 import 'package:stem/stem.dart';
 
 part 'workflows.stem.g.dart';
@@ -696,7 +1260,7 @@ class EmailRequest {
         'retries': retries,
       };
 
-  factory EmailRequest.fromJson(Map json) => EmailRequest(
+  factory EmailRequest.fromJson(Map json) => EmailRequest(
         email: json['email'] as String,
         retries: json['retries'] as int,
       );
@@ -717,27 +1281,28 @@ Future dtoTask(
 ) async => request;
 ''';
 
-    await testBuilder(
-      stemRegistryBuilder(BuilderOptions.empty),
-      {'stem_builder|lib/workflows.dart': input},
-      rootPackage: 'stem_builder',
-      readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
-        ..testing.writeString(
-          AssetId('stem', 'lib/stem.dart'),
-          stubStem,
-        ),
-      outputs: {
-        'stem_builder|lib/workflows.stem.g.dart': decodedMatches(
+      await testBuilder(
+        stemRegistryBuilder(BuilderOptions.empty),
+        {'stem_builder|lib/workflows.dart': input},
+        rootPackage: 'stem_builder',
+        readerWriter: TestReaderWriter(rootPackage: 'stem_builder')
+          ..testing.writeString(
+            AssetId('stem', 'lib/stem.dart'),
+            stubStem,
+          ),
+        outputs: {
+          'stem_builder|lib/workflows.stem.g.dart': decodedMatches(
             allOf([
              contains('abstract final class StemPayloadCodecs'),
             contains('PayloadCodec emailRequest
='),
+            contains('PayloadCodec.json('),
             contains(
-              'WorkflowRef<({EmailRequest request}), EmailRequest> script =',
+              'WorkflowRef script =',
             ),
-            contains('encode: (value) => value.toJson(),'),
-            contains('EmailRequest.fromJson('),
+            contains('decode: EmailRequest.fromJson,'),
+            contains('typeName: "EmailRequest",'),
             contains(
-              'StemPayloadCodecs.emailRequest.encode(params.request)',
+              'StemPayloadCodecs.emailRequest.encode(params)',
             ),
             contains('StemPayloadCodecs.emailRequest.decode('),
             contains(
@@ -752,10 +1317,11 @@ Future dtoTask(
             contains('valueCodec: StemPayloadCodecs.emailRequest,'),
             contains('resultCodec: StemPayloadCodecs.emailRequest,'),
           ]),
-        ),
-      },
-    );
-  });
+          ),
+        },
+      );
+    },
+  );
 
   test('rejects non-serializable workflow step parameter types', () async {
     const input = '''
diff --git a/packages/stem_cli/lib/src/cli/workflow.dart b/packages/stem_cli/lib/src/cli/workflow.dart
index 3c6d08d1..acefcfd1 100644
--- a/packages/stem_cli/lib/src/cli/workflow.dart
+++ b/packages/stem_cli/lib/src/cli/workflow.dart
@@ -510,39 +510,31 @@ class _WorkflowShowCommand extends Command {
         dependencies.err.writeln('Workflow run "$runId" not found.');
         return 64;
       }
-      final steps = await workflowContext.store.listSteps(runId);
+      final detail = await workflowContext.runtime.viewRunDetail(runId);
       if (jsonOutput) {
         dependencies.out.writeln(
-          jsonEncode({
-            'run': {
-              'id': state.id,
-              'workflow': state.workflow,
-              'status': state.status.name,
-              'cursor': state.cursor,
-              'params': state.params,
-              'result': state.result,
-              'waitTopic': state.waitTopic,
-              'resumeAt': state.resumeAt?.toIso8601String(),
-              'lastError': state.lastError,
-              'createdAt': state.createdAt.toIso8601String(),
-              'updatedAt': state.updatedAt?.toIso8601String(),
-              'cancellationPolicy': state.cancellationPolicy?.toJson(),
-              'cancellationData': state.cancellationData,
-            },
-            'steps': steps
-                .map(
-                  (step) => {
-                    'name': step.name,
-                    'value': step.value,
-                    'position': step.position,
-                    'completedAt':
-                        step.completedAt?.toIso8601String(),
+          jsonEncode(
+            detail?.toJson() ??
+                {
+                  'run': {
+                    'runId': state.id,
+                    'workflow': state.workflow,
+                    'status': state.status.name,
+                    'cursor': state.cursor,
+                    'params': state.params,
+                    'result': state.result,
+                    'lastError': state.lastError,
+                    'createdAt': state.createdAt.toIso8601String(),
+                    'updatedAt': state.updatedAt?.toIso8601String(),
+                    'runtime': state.runtimeMetadata.toJson(),
+                    'suspensionData': state.suspensionData,
                   },
-                )
-                .toList(),
-          }),
+                  'checkpoints': const [],
+                },
+          ),
         );
       } else {
-        _renderRunDetails(state, steps);
+        _renderRunDetails(state, detail?.checkpoints ?? const []);
       }
       return 0;
     } catch (error, stackTrace) {
@@ -557,7 +549,10 @@ class _WorkflowShowCommand extends Command {
     }
   }
 
-  void _renderRunDetails(RunState state, List steps) {
+  void _renderRunDetails(
+    RunState state,
+    List checkpoints,
+  ) {
     final out = dependencies.out;
     out
       ..writeln('Run: ${state.id}')
@@ -582,13 +577,14 @@ class _WorkflowShowCommand extends Command {
     if (state.cancellationData != null && state.cancellationData!.isNotEmpty) {
       out.writeln('Cancellation Data: ${jsonEncode(state.cancellationData)}');
     }
-    if (steps.isEmpty) {
+    if (checkpoints.isEmpty) {
       out.writeln('No checkpoints recorded.');
     } else {
       out.writeln('Checkpoints:');
-      for (final step in steps) {
+      for (final checkpoint in checkpoints) {
         out.writeln(
-          ' [${step.position}] ${step.name}: ${jsonEncode(step.value)}',
+          ' [${checkpoint.position}] ${checkpoint.checkpointName}: '
+          '${jsonEncode(checkpoint.value)}',
         );
       }
     }
diff --git a/packages/stem_cli/lib/src/cli/workflow_agent_help.dart b/packages/stem_cli/lib/src/cli/workflow_agent_help.dart
index 6bfc8396..55687704 100644
--- a/packages/stem_cli/lib/src/cli/workflow_agent_help.dart
+++ b/packages/stem_cli/lib/src/cli/workflow_agent_help.dart
@@ -6,8 +6,8 @@ String buildWorkflowAgentHelpMarkdown(Iterable> commands) {
     ..writeln()
     ..writeln('## Summary')
     ..writeln(
-      '- Workflow steps are durable and may replay after
sleeps, awaited '
-      'events, or worker restarts.',
+      '- Flow steps and script checkpoints are durable and may replay after '
+      'sleeps, awaited events, or worker restarts.',
     )
     ..writeln(
       '- Use FlowContext.idempotencyKey and stored checkpoints to guard side '
diff --git a/packages/stem_cli/pubspec.yaml b/packages/stem_cli/pubspec.yaml
index 6740cfe5..ede5f621 100644
--- a/packages/stem_cli/pubspec.yaml
+++ b/packages/stem_cli/pubspec.yaml
@@ -8,7 +8,7 @@ environment:
 dependencies:
   artisanal: ^0.2.0
-  stem: ^0.1.0
+  stem: ^0.2.0
   stem_redis: ^0.1.0
   stem_postgres: ^0.1.0
   stem_sqlite: ^0.1.0
diff --git a/packages/stem_cli/test/unit/cli/cli_workflow_test.dart b/packages/stem_cli/test/unit/cli/cli_workflow_test.dart
index afee1eb0..6e9bc055 100644
--- a/packages/stem_cli/test/unit/cli/cli_workflow_test.dart
+++ b/packages/stem_cli/test/unit/cli/cli_workflow_test.dart
@@ -98,7 +98,7 @@ void main() {
     );
   });
 
-  test('show --json displays run details and steps', () async {
+  test('show --json displays run details and checkpoints', () async {
     await runStemCli(
       ['wf', 'start', 'demo.workflow'],
       contextBuilder: _buildCliContext,
@@ -118,8 +118,8 @@ void main() {
     expect(code, equals(0), reason: err.toString());
     final payload = jsonDecode(out.toString()) as Map<String, dynamic>;
 
-    expect(payload['run']['id'], run.id);
-    expect(payload['steps'], isList);
+    expect(payload['run']['runId'], run.id);
+    expect(payload['checkpoints'], isList);
   });
 
   test('start accepts cancellation policy flags', () async {
diff --git a/packages/stem_memory/pubspec.yaml b/packages/stem_memory/pubspec.yaml
index f8aa99d8..cdf0dfb2 100644
--- a/packages/stem_memory/pubspec.yaml
+++ b/packages/stem_memory/pubspec.yaml
@@ -8,7 +8,7 @@ environment:
 dependencies:
   collection: ^1.19.1
-  stem: ^0.1.1
+  stem: ^0.2.0
   uuid: ^4.5.2
 
 dev_dependencies:
diff --git a/packages/stem_postgres/pubspec.yaml b/packages/stem_postgres/pubspec.yaml
index 742045af..658c02ad 100644
--- a/packages/stem_postgres/pubspec.yaml
+++ b/packages/stem_postgres/pubspec.yaml
@@ -14,7 +14,7 @@ dependencies:
   ormed_postgres: ^0.2.0
   path: ^1.9.1
   postgres: ^3.5.9
-  stem: ^0.1.1
+  stem: ^0.2.0
   uuid: ^4.5.2
 
 dev_dependencies:
diff --git a/packages/stem_postgres/test/support/postgres_test_harness.dart b/packages/stem_postgres/test/support/postgres_test_harness.dart
index a2b4af25..0d485dba 100644
--- a/packages/stem_postgres/test/support/postgres_test_harness.dart
+++ b/packages/stem_postgres/test/support/postgres_test_harness.dart
@@ -45,6 +45,7 @@ Future createStemPostgresTestHarness({
   final config = setUpOrmed(
     dataSource: dataSource,
     runMigrations: _runTestMigrations,
+    migrateBaseDatabase: false,
    adapterFactory: (dbName) {
       final schemaUrl = _withSearchPath(connectionString, dbName);
       return PostgresDriverAdapter.custom(
diff --git a/packages/stem_redis/pubspec.yaml b/packages/stem_redis/pubspec.yaml
index e2847326..fde9cf43 100644
--- a/packages/stem_redis/pubspec.yaml
+++ b/packages/stem_redis/pubspec.yaml
@@ -10,7 +10,7 @@ dependencies:
   async: ^2.13.0
   collection: ^1.19.1
   redis: ^4.0.0
-  stem: ^0.1.1
+  stem: ^0.2.0
   uuid: ^4.5.2
 
 dev_dependencies:
diff --git a/packages/stem_sqlite/pubspec.yaml b/packages/stem_sqlite/pubspec.yaml
index cad34eb3..774daa4a 100644
--- a/packages/stem_sqlite/pubspec.yaml
+++ b/packages/stem_sqlite/pubspec.yaml
@@ -14,7 +14,7 @@ dependencies:
   ormed: ^0.2.0
   ormed_sqlite: ^0.2.0
   path: ^1.9.1
-  stem: ^0.1.1
+  stem: ^0.2.0
   uuid: ^4.5.2
 
 dev_dependencies: