diff --git a/SKILL.md b/SKILL.md index 92b6f9a..43942d3 100644 --- a/SKILL.md +++ b/SKILL.md @@ -1,6 +1,6 @@ --- name: temporal-developer -description: This skill should be used when the user asks to "create a Temporal workflow", "write a Temporal activity", "debug stuck workflow", "fix non-determinism error", "Temporal Python", "workflow replay", "activity timeout", "signal workflow", "query workflow", "worker not starting", "activity keeps retrying", "Temporal heartbeat", "continue-as-new", "child workflow", "saga pattern", "workflow versioning", "durable execution", "reliable distributed systems", or mentions Temporal SDK development. +description: This skill should be used when the user asks to "create a Temporal workflow", "write a Temporal activity", "debug stuck workflow", "fix non-determinism error", "Temporal Python", "Temporal TypeScript", "workflow replay", "activity timeout", "signal workflow", "query workflow", "worker not starting", "activity keeps retrying", "Temporal heartbeat", "continue-as-new", "child workflow", "saga pattern", "workflow versioning", "durable execution", "reliable distributed systems", or mentions Temporal SDK development. version: 1.0.0 --- @@ -8,7 +8,7 @@ version: 1.0.0 ## Overview -Temporal is a durable execution platform that makes workflows survive failures automatically. This skill provides guidance for building Temporal applications in Python. +Temporal is a durable execution platform that makes workflows survive failures automatically. This skill provides guidance for building Temporal applications in Python and TypeScript. ## Core Architecture @@ -91,6 +91,7 @@ Once you've downloaded the file, extract the downloaded archive and add the temp 1. First, read the getting started guide for the language you are working in: - Python -> read `references/python/python.md` + - TypeScript -> read `references/typescript/typescript.md` 2. Second, read appropriate `core` and language-specific references for the task at hand. 
@@ -108,7 +109,7 @@ Once you've downloaded the file, extract the downloaded archive and add the temp - **`references/core/interactive-workflows.md`** - Testing signals, updates, queries - **`references/core/dev-management.md`** - Dev cycle & management of server and workers - **`references/core/ai-patterns.md`** - AI/LLM pattern concepts - + Langauge-specific info at `references/{your_language}/determinism.md` + + Language-specific info at `references/{your_language}/determinism.md`, if available. Currently Python only. ## Additional Topics - **`references/{your_language}/observability.md`** - See for language-specific implementation guidance on observability in Temporal diff --git a/references/core/determinism.md b/references/core/determinism.md index c6e3a04..70e74b8 100644 --- a/references/core/determinism.md +++ b/references/core/determinism.md @@ -79,6 +79,7 @@ For a few simple cases, like timestamps, random values, UUIDs, etc. the Temporal Each Temporal SDK language provides a protection mechanism to make it easier to catch non-determinism errors earlier in development: - Python: The Python SDK runs workflows in a sandbox that intercepts and aborts non-deterministic calls at runtime. +- TypeScript: The TypeScript SDK runs workflows in an isolated V8 sandbox, intercepting many common sources of non-determinism and replacing them automatically with deterministic variants. ## Detecting Non-Determinism diff --git a/references/core/gotchas.md b/references/core/gotchas.md index 1206494..55b6ddb 100644 --- a/references/core/gotchas.md +++ b/references/core/gotchas.md @@ -150,3 +150,47 @@ See language-specific gotchas for details.
**The Fix**: - **Retryable**: Network errors, timeouts, rate limits, temporary unavailability - **Non-retryable**: Invalid input, authentication failures, business rule violations, resource not found + +## Cancellation Handling + +### Not Handling Workflow Cancellation + +**The Problem**: When a workflow is cancelled, cleanup code after the cancellation point doesn't run unless explicitly protected. + +**Symptoms**: +- Resources not released after cancellation +- Incomplete compensation/rollback +- Leaked state + +**The Fix**: Use language-specific cancellation scopes or try/finally blocks to ensure cleanup runs even on cancellation. See language-specific gotchas for implementation details. + +### Not Handling Activity Cancellation + +**The Problem**: Activities must opt in to receive cancellation. Without proper handling, a cancelled activity continues running to completion, wasting resources. + +**Requirements for activity cancellation**: +1. **Heartbeating** - Cancellation is delivered via heartbeat. Activities that don't heartbeat won't know they've been cancelled. +2. **Checking for cancellation** - Activity must explicitly check for cancellation or await a cancellation signal. + +**Symptoms**: +- Cancelled activities running to completion +- Wasted compute on work that will be discarded +- Delayed workflow cancellation + +**The Fix**: Heartbeat regularly and check for cancellation. See language-specific gotchas for implementation patterns. + +## Payload Size Limits + +**The Problem**: Temporal has built-in limits on payload sizes. Exceeding them causes workflows to fail. 
+ +**Limits**: +- Max 2MB per individual payload +- Max 4MB per gRPC message +- Max 50MB for complete workflow history (aim for <10MB in practice) + +**Symptoms**: +- Payload too large errors +- gRPC message size exceeded errors +- Workflow history growing unboundedly + +**The Fix**: Store large data externally (S3/GCS) and pass references, use compression codecs, or chunk data across multiple activities. See the Large Data Handling pattern in `references/core/patterns.md`. diff --git a/references/core/patterns.md b/references/core/patterns.md index 2f37465..93f774d 100644 --- a/references/core/patterns.md +++ b/references/core/patterns.md @@ -331,6 +331,80 @@ Run: This ensures that on replay, already-completed steps are skipped. +## Large Data Handling + +**Purpose**: Handle data that exceeds Temporal's payload limits without polluting workflow history. + +**Limits** (see `references/core/gotchas.md` for details): +- Max 2MB per individual payload +- Max 4MB per gRPC message +- Max 50MB for workflow history (aim for <10MB) + +**Key Principle**: Large data should never flow through workflow history. Activities read and write large data directly, passing only small references through the workflow. + +**Wrong Approach**: +``` +Workflow + │ + ├── downloadFromStorage(ref) ──▶ returns large data (enters history) + │ + ├── processData(largeData) ────▶ large data as argument (enters history AGAIN) + │ + └── uploadToStorage(result) ───▶ large data as argument (enters history AGAIN) +``` + +This defeats the purpose—large data enters workflow history multiple times. + +**Correct Approach**: +``` +Workflow + │ + └── processLargeData(inputRef) ──▶ returns outputRef (small string) + │ + └── Activity internally: + download(inputRef) → process → upload → return outputRef +``` + +The workflow only handles references (small strings). The activity does all large data operations internally. + +**Implementation Pattern**: +1. 
Accept a reference (URL, S3 key, database ID) as activity input +2. Download/fetch the large data inside the activity +3. Process the data inside the activity +4. Upload/store the result inside the activity +5. Return only a reference to the result + +**Other Strategies**: +- **Compression**: Use a PayloadCodec to compress data automatically +- **Chunking**: Split large collections across multiple activities, each handling a subset + +## Activity Heartbeating + +**Purpose**: Enable cancellation delivery and progress tracking for long-running activities. + +**Why Heartbeat**: +1. **Support activity cancellation** - Cancellations are delivered to activities via heartbeat. Activities that don't heartbeat won't know they've been cancelled. +2. **Resume progress after failure** - Heartbeat details persist across retries, allowing activities to resume where they left off. +3. **Detect stuck activities** - If an activity stops heartbeating, Temporal can time it out and retry. + +**How Cancellation Works**: +``` +Workflow requests activity cancellation + │ + ▼ +Temporal Service marks activity for cancellation + │ + ▼ +Activity calls heartbeat() + │ + ├── Not cancelled: heartbeat succeeds, continues + │ + └── Cancelled: heartbeat raises exception + Activity can catch this to perform cleanup +``` + +**Key Point**: If an activity never heartbeats, it will run to completion even if cancelled—it has no way to learn about the cancellation. + ## Local Activities **Purpose**: Reduce latency for short, lightweight operations by skipping the task queue. ONLY use these when necessary for performance. Do NOT use these by default, as they are not durable and distributed. 
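Returning to the Large Data Handling pattern above, the five reference-passing steps can be sketched as a plain async function, independent of the Temporal SDK. The `BlobStore` interface, the `.out` key convention, and the XOR "processing" step are hypothetical stand-ins for a real blob store and real work:

```typescript
// Sketch of the reference-passing pattern: only small reference strings cross
// the activity boundary, so no large payload ever enters workflow history.
interface BlobStore {
  download(ref: string): Promise<Uint8Array>;
  upload(ref: string, data: Uint8Array): Promise<void>;
}

// The activity body: fetch, process, and store the large data internally,
// returning only a small output reference for the workflow to pass along.
async function processLargeData(store: BlobStore, inputRef: string): Promise<string> {
  const data = await store.download(inputRef); // large bytes stay inside the activity
  const result = data.map((b) => b ^ 0xff);    // placeholder "processing"
  const outputRef = `${inputRef}.out`;         // small reference derived from the input
  await store.upload(outputRef, result);
  return outputRef;                            // only this string reaches history
}
```

In a real activity the `BlobStore` would be an S3/GCS client constructed inside the activity implementation, never passed through workflow arguments.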
diff --git a/references/core/versioning.md b/references/core/versioning.md index 5cbc34c..226bb83 100644 --- a/references/core/versioning.md +++ b/references/core/versioning.md @@ -53,17 +53,21 @@ else: ### When to Use -- Adding new activities or steps -- Changing activity parameters -- Reordering operations -- Any change that would cause non-determinism +- Adding, removing, or reordering activities/child workflows +- Changing which activity/child workflow is called +- Any change that alters the Command sequence ### When NOT to Use -- Changes to activity implementations (activities aren't replayed) -- Adding new signal/query handlers (additive changes are safe) +- Changing activity implementations (activities aren't replayed) +- Changing arguments passed to activities or child workflows +- Changing retry policies +- Changing timer durations +- Adding new signal/query/update handlers (additive changes are safe) - Bug fixes that don't change Command sequence +Unnecessary patching adds complexity and can make workflow code unmanageable. + ## Approach 2: Workflow Type Versioning ### Concept diff --git a/references/python/data-handling.md b/references/python/data-handling.md index d6a40d2..662101e 100644 --- a/references/python/data-handling.md +++ b/references/python/data-handling.md @@ -202,29 +202,6 @@ class OrderWorkflow: ... ``` -## Large Payloads - -For large data, consider: - -1. **Store externally**: Put large data in S3/GCS, pass references in workflows -2. **Use Payload Codec**: Compress payloads automatically -3. 
**Chunk data**: Split large lists across multiple activities - -```python -# Example: Reference pattern for large data -@activity.defn -async def upload_to_storage(data: bytes) -> str: - """Upload data and return reference.""" - key = f"data/{uuid.uuid4()}" - await storage_client.upload(key, data) - return key - -@activity.defn -async def download_from_storage(key: str) -> bytes: - """Download data by reference.""" - return await storage_client.download(key) -``` - ## Deterministic APIs for Values Use these APIs within workflows for deterministic random values and UUIDs: @@ -247,8 +224,7 @@ class MyWorkflow: ## Best Practices 1. Use Pydantic for input/output validation -2. Keep payloads small (< 2MB recommended) +2. Keep payloads small—see `references/core/gotchas.md` for limits 3. Encrypt sensitive data with PayloadCodec -4. Store large data externally with references -5. Use dataclasses for simple data structures -6. Use `workflow.uuid4()` and `workflow.random()` for deterministic values +4. Use dataclasses for simple data structures +5. Use `workflow.uuid4()` and `workflow.random()` for deterministic values diff --git a/references/python/gotchas.md b/references/python/gotchas.md index b0735b1..95ebe8a 100644 --- a/references/python/gotchas.md +++ b/references/python/gotchas.md @@ -161,6 +161,83 @@ await workflow.execute_activity( ) ``` +Set heartbeat timeout as high as acceptable for your use case — each heartbeat counts as an action. + +## Cancellation + +### Not Handling Workflow Cancellation + +```python +# BAD - Cleanup doesn't run on cancellation +@workflow.defn +class BadWorkflow: + @workflow.run + async def run(self) -> None: + await workflow.execute_activity( + acquire_resource, + start_to_close_timeout=timedelta(minutes=5), + ) + await workflow.execute_activity( + do_work, + start_to_close_timeout=timedelta(minutes=5), + ) + await workflow.execute_activity( + release_resource, # Never runs if cancelled! 
+ start_to_close_timeout=timedelta(minutes=5), + ) + +# GOOD - Use try/finally for cleanup +@workflow.defn +class GoodWorkflow: + @workflow.run + async def run(self) -> None: + await workflow.execute_activity( + acquire_resource, + start_to_close_timeout=timedelta(minutes=5), + ) + try: + await workflow.execute_activity( + do_work, + start_to_close_timeout=timedelta(minutes=5), + ) + finally: + # Runs even on cancellation + await workflow.execute_activity( + release_resource, + start_to_close_timeout=timedelta(minutes=5), + ) +``` + +### Not Handling Activity Cancellation + +Activities must **opt in** to receive cancellation. This requires: +1. **Heartbeating** - Cancellation is delivered via heartbeat +2. **Catching the cancellation exception** - Exception is raised when heartbeat detects cancellation + +**Cancellation exceptions:** +- Async activities: `asyncio.CancelledError` +- Sync threaded activities: `temporalio.exceptions.CancelledError` + +```python +# BAD - Activity ignores cancellation +@activity.defn +async def long_activity() -> None: + await do_expensive_work() # Runs to completion even if cancelled +``` + +```python +# GOOD - Heartbeat and catch cancellation +@activity.defn +async def long_activity() -> None: + try: + for item in items: + activity.heartbeat() + await process(item) + except asyncio.CancelledError: + await cleanup() + raise +``` + ## Testing ### Not Testing Failures diff --git a/references/python/patterns.md b/references/python/patterns.md index 018103f..762977b 100644 --- a/references/python/patterns.md +++ b/references/python/patterns.md @@ -26,7 +26,7 @@ class OrderWorkflow: ### Dynamic Signal Handlers -For handling signals with names not known at compile time: +For handling signals with names not known at compile time. Use cases for this pattern are rare — most workflows should use statically defined signal handlers. 
```python @workflow.defn @@ -75,6 +75,8 @@ class StatusWorkflow: ### Dynamic Query Handlers +For handling queries with names not known at compile time. Use cases for this pattern are rare — most workflows should use statically defined query handlers. + ```python @workflow.query(dynamic=True) def handle_query(self, name: str, args: Sequence[RawValue]) -> Any: @@ -295,10 +297,9 @@ class MyWorkflow: ## Waiting for All Handlers to Finish -### WHY: Ensure all signal/update handlers complete before workflow exits -### WHEN: -- **Workflows with async handlers** - Prevent data loss from in-flight handlers -- **Before continue-as-new** - Ensure handlers complete before resetting +Signal and update handlers should generally be non-async (avoid running activities from them). Otherwise, the workflow may complete before handlers finish their execution. However, making handlers non-async sometimes requires workarounds that add complexity. + +When async handlers are necessary, use `wait_condition(all_handlers_finished)` at the end of your workflow (or before continue-as-new) to prevent completion until all pending handlers complete. 
```python @workflow.defn @@ -312,31 +313,47 @@ class MyWorkflow: return "done" ``` -## Activity Heartbeat Details - Updatable side-data usable in long-running activities +## Activity Heartbeat Details + +### WHY: +- **Support activity cancellation** - Cancellations are delivered via heartbeat; activities that don't heartbeat won't know they've been cancelled +- **Resume progress after worker failure** - Heartbeat details persist across retries + +**Cancellation exceptions:** +- Async activities: `asyncio.CancelledError` +- Sync threaded activities: `temporalio.exceptions.CancelledError` -### WHY: Resume activity progress after worker failure ### WHEN: +- **Cancellable activities** - Any activity that should respond to cancellation - **Long-running activities** - Track progress for resumability - **Checkpointing** - Save progress periodically ```python +from temporalio.exceptions import CancelledError + @activity.defn def process_large_file(file_path: str) -> str: # Get heartbeat details from previous attempt (if any) heartbeat_details = activity.info().heartbeat_details start_line = heartbeat_details[0] if heartbeat_details else 0 - with open(file_path) as f: - for i, line in enumerate(f): - if i < start_line: - continue # Skip already processed lines + try: + with open(file_path) as f: + for i, line in enumerate(f): + if i < start_line: + continue # Skip already processed lines - process_line(line) + process_line(line) - # Heartbeat with progress - activity.heartbeat(i + 1) + # Heartbeat with progress + # If cancelled, heartbeat() raises CancelledError + activity.heartbeat(i + 1) - return "completed" + return "completed" + except CancelledError: + # Perform cleanup on cancellation + cleanup() + raise ``` ## Timers diff --git a/references/python/python.md b/references/python/python.md index e79d849..130b1eb 100644 --- a/references/python/python.md +++ b/references/python/python.md @@ -151,7 +151,8 @@ with workflow.unsafe.imports_passed_through(): 4. 
**Forgetting to heartbeat** - Long activities need `activity.heartbeat()` 5. **Using gevent** - Incompatible with SDK 6. **Using `print()` in workflows** - Use `workflow.logger` instead for replay-safe logging -7. **Mixing Workflows and Activities in same file** - Causes unnecessary reloads, hurts performance, bad structure. +7. **Mixing Workflows and Activities in same file** - Causes unnecessary reloads, hurts performance, bad structure +8. **Forgetting to wait on activity calls** - `workflow.execute_activity()` is async; you must eventually await it (directly or via `asyncio.gather()` for parallel execution) ## Writing Tests diff --git a/references/python/versioning.md b/references/python/versioning.md index 029eaa5..abd4445 100644 --- a/references/python/versioning.md +++ b/references/python/versioning.md @@ -34,8 +34,14 @@ class ShippingWorkflow: - For replay with the marker: `patched()` returns `True` (history includes this patch) - For replay without the marker: `patched()` returns `False` (history predates this patch) +**Python-specific behavior:** The `patched()` return value is memoized on first call. This means you cannot reliably use `patched()` in loops—it will return the same value every iteration. Workaround: append a sequence number to the patch ID for each iteration (e.g., `f"my-change-{i}"`). + ### Three-Step Patching Process +Patching is a three-step process for safely deploying changes. + +**Warning:** Failing to follow this process correctly will result in non-determinism errors for in-flight workflows. + **Step 1: Patch in New Code** Add the patch with both old and new code paths: diff --git a/references/typescript/advanced-features.md b/references/typescript/advanced-features.md new file mode 100644 index 0000000..17b7e61 --- /dev/null +++ b/references/typescript/advanced-features.md @@ -0,0 +1,150 @@ +# TypeScript SDK Advanced Features + +## Schedules + +Create recurring workflow executions. 
+ +```typescript +import { Client, ScheduleOverlapPolicy } from '@temporalio/client'; + +const client = new Client(); + +// Create a schedule +const schedule = await client.schedule.create({ + scheduleId: 'daily-report', + spec: { + intervals: [{ every: '1 day' }], + }, + action: { + type: 'startWorkflow', + workflowType: 'dailyReportWorkflow', + taskQueue: 'reports', + args: [], + }, + policies: { + overlap: ScheduleOverlapPolicy.SKIP, + }, +}); + +// Manage schedules +const handle = client.schedule.getHandle('daily-report'); +await handle.pause('Maintenance window'); +await handle.unpause(); +await handle.trigger(); // Run immediately +await handle.delete(); +``` + +## Async Activity Completion + +Complete an activity asynchronously from outside the activity function. Useful when the activity needs to wait for an external event. + +**In the activity - return the task token:** +```typescript +import { CompleteAsyncError, activityInfo } from '@temporalio/activity'; + +export async function doSomethingAsync(): Promise<string> { + const taskToken: Uint8Array = activityInfo().taskToken; + setTimeout(() => doSomeWork(taskToken), 1000); + throw new CompleteAsyncError(); +} +``` + +**External completion (from another process, machine, etc.):** +```typescript +import { Client } from '@temporalio/client'; + +async function doSomeWork(taskToken: Uint8Array): Promise<void> { + const client = new Client(); + // does some work...
+ await client.activity.complete(taskToken, "Job's done!"); +} +``` + +**When to use:** +- Waiting for human approval +- Waiting for external webhook callback +- Long-polling external systems + +## Worker Tuning + +Configure worker capacity for production workloads: + +```typescript +import { Worker, NativeConnection } from '@temporalio/worker'; + +const worker = await Worker.create({ + connection: await NativeConnection.connect({ address: 'temporal:7233' }), + taskQueue: 'my-queue', + workflowBundle: { codePath: require.resolve('./workflow-bundle.js') }, // Pre-bundled for production + activities, + + // Workflow execution concurrency (default: 40) + maxConcurrentWorkflowTaskExecutions: 100, + + // Activity execution concurrency (default: 100) + maxConcurrentActivityTaskExecutions: 200, + + // Graceful shutdown timeout (default: 0) + shutdownGraceTime: '30 seconds', + + // Max cached workflows (memory vs latency tradeoff) + maxCachedWorkflows: 1000, +}); +``` + +**Key settings:** +- `maxConcurrentWorkflowTaskExecutions`: Max workflows running simultaneously (default: 40) +- `maxConcurrentActivityTaskExecutions`: Max activities running simultaneously (default: 100) +- `shutdownGraceTime`: Time to wait for in-progress work before forced shutdown +- `maxCachedWorkflows`: Number of workflows to keep in cache (reduces replay on cache hit) + +## Sinks + +Sinks allow workflows to emit events for side effects (logging, metrics). 
+ +```typescript +import { proxySinks, Sinks } from '@temporalio/workflow'; + +// Define sink interface +export interface LoggerSinks extends Sinks { + logger: { + info(message: string, attrs: Record<string, unknown>): void; + error(message: string, attrs: Record<string, unknown>): void; + }; +} + +// Use in workflow +const { logger } = proxySinks<LoggerSinks>(); + +export async function myWorkflow(input: string): Promise<string> { + logger.info('Workflow started', { input }); + + const result = await someActivity(input); + + logger.info('Workflow completed', { result }); + return result; +} + +// Implement sink in worker +const worker = await Worker.create({ + workflowsPath: require.resolve('./workflows'), // Use workflowBundle for production + activities, + taskQueue: 'my-queue', + sinks: { + logger: { + info: { + fn(workflowInfo, message, attrs) { + console.log(`[${workflowInfo.workflowId}] ${message}`, attrs); + }, + callDuringReplay: false, // Don't log during replay + }, + error: { + fn(workflowInfo, message, attrs) { + console.error(`[${workflowInfo.workflowId}] ${message}`, attrs); + }, + callDuringReplay: false, + }, + }, + }, +}); +``` diff --git a/references/typescript/data-handling.md b/references/typescript/data-handling.md new file mode 100644 index 0000000..bfd4925 --- /dev/null +++ b/references/typescript/data-handling.md @@ -0,0 +1,253 @@ +# TypeScript SDK Data Handling + +## Overview + +The TypeScript SDK uses data converters to serialize/deserialize workflow inputs, outputs, and activity parameters. + +## Default Data Converter + +The default converter handles: +- `undefined` and `null` +- `Uint8Array` (as binary) +- JSON-serializable types + +Note: Protobuf support requires using a data converter (`DefaultPayloadConverterWithProtobufs`). See the Protobuf Support section below. + +## Custom Data Converter + +Create custom converters for special serialization needs.
+ +```typescript +// payload-converter.ts +import { + PayloadConverter, + Payload, + defaultPayloadConverter, +} from '@temporalio/common'; + +class CustomPayloadConverter implements PayloadConverter { + toPayload<T>(value: T): Payload | undefined { + // Custom serialization logic + return defaultPayloadConverter.toPayload(value); + } + + fromPayload<T>(payload: Payload): T { + // Custom deserialization logic + return defaultPayloadConverter.fromPayload(payload); + } +} + +export const payloadConverter = new CustomPayloadConverter(); +``` + +```typescript +// client.ts +import { Client } from '@temporalio/client'; + +const client = new Client({ + dataConverter: { + payloadConverterPath: require.resolve('./payload-converter'), + }, +}); +``` + +```typescript +// worker.ts +import { Worker } from '@temporalio/worker'; + +const worker = await Worker.create({ + dataConverter: { + payloadConverterPath: require.resolve('./payload-converter'), + }, + // ... +}); +``` + +## Composition of Payload Converters + +```typescript +import { CompositePayloadConverter } from '@temporalio/common'; + +// The order matters — converters are tried in sequence until one returns a non-null Payload +export const payloadConverter = new CompositePayloadConverter( + new PayloadConverterFoo(), + new PayloadConverterBar(), +); +``` + +## Protobuf Support + +Using Protocol Buffers for type-safe serialization. + +**Note:** JSON serialization (the default) is preferred for TypeScript applications—it's simpler and more performant. Use Protobuf only when interoperating with services that require it. + +```typescript +import { DefaultPayloadConverterWithProtobufs } from '@temporalio/common/lib/protobufs'; + +const dataConverter: DataConverter = { + payloadConverter: new DefaultPayloadConverterWithProtobufs({ + protobufRoot: myProtobufRoot, + }), +}; +``` + +## Payload Codec (Encryption) + +Encrypt sensitive workflow data.
+ +```typescript +import { PayloadCodec, Payload } from '@temporalio/common'; + +class EncryptionCodec implements PayloadCodec { + private readonly encryptionKey: Uint8Array; + + constructor(key: Uint8Array) { + this.encryptionKey = key; + } + + async encode(payloads: Payload[]): Promise<Payload[]> { + return Promise.all( + payloads.map(async (payload) => ({ + metadata: { + encoding: 'binary/encrypted', + }, + data: await this.encrypt(payload.data ?? new Uint8Array()), + })) + ); + } + + async decode(payloads: Payload[]): Promise<Payload[]> { + return Promise.all( + payloads.map(async (payload) => { + if (payload.metadata?.encoding === 'binary/encrypted') { + return { + ...payload, + data: await this.decrypt(payload.data ?? new Uint8Array()), + }; + } + return payload; + }) + ); + } + + private async encrypt(data: Uint8Array): Promise<Uint8Array> { + // Implement encryption (e.g., using Web Crypto API) + return data; + } + + private async decrypt(data: Uint8Array): Promise<Uint8Array> { + // Implement decryption + return data; + } +} + +// Apply codec +const dataConverter: DataConverter = { + payloadCodecs: [new EncryptionCodec(encryptionKey)], +}; +``` + +## Search Attributes + +Custom searchable fields for workflow visibility.
+ +### Setting Search Attributes at Start + +```typescript +import { Client } from '@temporalio/client'; + +const client = new Client(); + +await client.workflow.start('orderWorkflow', { + taskQueue: 'orders', + workflowId: `order-${orderId}`, + args: [order], + searchAttributes: { + OrderId: [orderId], + CustomerType: ['premium'], + OrderTotal: [99.99], + CreatedAt: [new Date()], + }, +}); +``` + +### Upserting Search Attributes from Workflow + +```typescript +import { upsertSearchAttributes, workflowInfo } from '@temporalio/workflow'; + +export async function orderWorkflow(order: Order): Promise<string> { + // Update status as workflow progresses + upsertSearchAttributes({ + OrderStatus: ['processing'], + }); + + await processOrder(order); + + upsertSearchAttributes({ + OrderStatus: ['completed'], + }); + + return 'done'; +} +``` + +### Reading Search Attributes + +```typescript +import { workflowInfo } from '@temporalio/workflow'; + +export async function orderWorkflow(): Promise<void> { + const info = workflowInfo(); + const searchAttrs = info.searchAttributes; + const orderId = searchAttrs?.OrderId?.[0]; + // ... +} +``` + +### Querying Workflows by Search Attributes + +```typescript +const client = new Client(); + +// List workflows using search attributes +for await (const workflow of client.workflow.list({ + query: 'OrderStatus = "processing" AND CustomerType = "premium"', +})) { + console.log(`Workflow ${workflow.workflowId} is still processing`); +} +``` + +## Workflow Memo + +Store arbitrary metadata with workflows (not searchable).
+ +```typescript +// Set memo at workflow start +await client.workflow.start('orderWorkflow', { + taskQueue: 'orders', + workflowId: `order-${orderId}`, + args: [order], + memo: { + customerName: order.customerName, + notes: 'Priority customer', + }, +}); + +// Read memo from workflow +import { workflowInfo } from '@temporalio/workflow'; + +export async function orderWorkflow(): Promise<void> { + const info = workflowInfo(); + const customerName = info.memo?.customerName; + // ... +} +``` + +## Best Practices + +1. Keep payloads small—see `references/core/gotchas.md` for limits +2. Use search attributes for business-level visibility and filtering +3. Encrypt sensitive data with PayloadCodec +4. Use memo for non-searchable metadata +5. Configure the same data converter on both client and worker diff --git a/references/typescript/determinism-protection.md b/references/typescript/determinism-protection.md new file mode 100644 index 0000000..54303ba --- /dev/null +++ b/references/typescript/determinism-protection.md @@ -0,0 +1,56 @@ +# TypeScript Workflow V8 Sandboxing + +## Overview + +The TypeScript SDK runs workflows in a V8 sandbox that provides automatic protection against non-deterministic operations, and replaces common non-deterministic function calls with deterministic variants. + +## Import Blocking + +The sandbox blocks imports of Node.js built-in modules such as `fs` and `https`. Workflow code can import any other package as long as it does not reference Node.js or DOM APIs.
+ +**Note**: If you must use a library that references a Node.js or DOM API and you are certain that those APIs are not used at runtime, add that module to the `ignoreModules` list: + +```ts +const worker = await Worker.create({ + workflowsPath: require.resolve('./workflows'), // bundlerOptions only apply with workflowsPath + activities: require('./activities'), + taskQueue: 'my-task-queue', + bundlerOptions: { + // These modules may be imported (directly or transitively), + // but will be excluded from the Workflow bundle. + ignoreModules: ['fs', 'http', 'crypto'], + }, +}); +``` + +**Important**: Excluded modules are completely unavailable at runtime. Any attempt to call functions from these modules will throw an error. Only exclude modules when you are certain the code paths using them will never execute during workflow execution. + +**Note**: Modules with the `node:` prefix (e.g., `node:fs`) require additional webpack configuration to ignore. You may need to configure the bundler's `externals` or use webpack `resolve.alias` to handle these imports. + +Use this with *extreme caution*. + + +## Function Replacement + +Functions like `Math.random()`, `Date`, and `setTimeout()` are replaced by deterministic versions. + +Date-related functions return the timestamp at which the current workflow task was initially executed. That timestamp remains the same when the workflow task is replayed, and only advances when a durable operation occurs (like `sleep()`). For example: + +```ts +import { sleep } from '@temporalio/workflow'; + +// this prints the *exact* same timestamp repeatedly +for (let x = 0; x < 10; ++x) { + console.log(Date.now()); +} + +// this prints timestamps increasing roughly 1s each iteration +for (let x = 0; x < 10; ++x) { + await sleep('1 second'); + console.log(Date.now()); +} +``` + +Generally, this is the behavior you want. + +Additionally, `FinalizationRegistry` and `WeakRef` are removed because v8's garbage collector is not deterministic. 
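To make the seeded replacement concrete, here is a toy seeded PRNG (mulberry32). It is not the SDK's actual generator, but it shows the property replay relies on: two generators started from the same seed produce identical sequences, so any `Math.random()`-based branch resolves the same way on replay as it did on the first execution.

```typescript
// Toy seeded PRNG (mulberry32). Illustration only: the Temporal SDK installs
// its own seeded replacement for Math.random(), not this function.
function mulberry32(seed: number): () => number {
  let a = seed >>> 0;
  return () => {
    a = (a + 0x6d2b79f5) >>> 0;
    let t = a;
    t = Math.imul(t ^ (t >>> 15), t | 1);
    t ^= t + Math.imul(t ^ (t >>> 7), t | 61);
    // 32-bit result mapped into [0, 1), like Math.random()
    return ((t ^ (t >>> 14)) >>> 0) / 4294967296;
  };
}

// "First execution" and "replay" are seeded identically, so a branch such as
// `if (rand() > 0.5)` is taken the same way both times.
const firstRun = mulberry32(42);
const replay = mulberry32(42);
```

The SDK applies the same idea: a random seed is associated with the workflow so that the same pseudo-random sequence is drawn when the history is replayed.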
diff --git a/references/typescript/determinism.md b/references/typescript/determinism.md
new file mode 100644
index 0000000..47f8948
--- /dev/null
+++ b/references/typescript/determinism.md
@@ -0,0 +1,51 @@
+# TypeScript SDK Determinism
+
+## Overview
+
+The TypeScript SDK runs workflows in an isolated V8 sandbox that automatically provides determinism.
+
+## Why Determinism Matters
+
+Temporal provides durable execution through **History Replay**. When a Worker needs to restore workflow state (after a crash, cache eviction, or to continue after a long timer), it re-executes the workflow code from the beginning, which requires the workflow code to be **deterministic**.
+
+## Temporal's V8 Sandbox
+
+The Temporal TypeScript SDK executes all workflow code in a sandbox that, among other things, replaces common non-deterministic functions with deterministic variants. As an example, consider the code below:
+
+```ts
+import { sleep } from '@temporalio/workflow';
+
+export async function myWorkflow(): Promise<string> {
+  await importData();
+
+  if (Math.random() > 0.5) {
+    await sleep('30 minutes');
+  }
+
+  return await sendReport();
+}
+```
+
+The Temporal workflow sandbox will use the same random seed when replaying a workflow, so the above code will **deterministically** generate pseudo-random numbers. For UUIDs, use `uuid4()` from `@temporalio/workflow`, which also uses the seeded PRNG.
+
+See `references/typescript/determinism-protection.md` for more information about the sandbox.
+
+## Forbidden Operations
+
+```typescript
+// DO NOT do these in workflows:
+import fs from 'fs'; // Node.js modules
+fetch('https://...'); // Network I/O
+```
+
+Most non-determinism and side effects, such as the above, should be wrapped in Activities.
+
+## Testing Replay Compatibility
+
+Use `Worker.runReplayHistory()` to verify your code changes are compatible with existing histories. See the Workflow Replay Testing section of `references/typescript/testing.md`.
+
+## Best Practices
+
+1.
Use type-only imports for activities in workflow files
+2. Match all @temporalio package versions
+3. Prefer `sleep()` from workflow package — `setTimeout` works but `sleep()` handles cancellation scopes more clearly
+4. Keep workflows focused on orchestration
+5. Test with replay to verify determinism
diff --git a/references/typescript/error-handling.md b/references/typescript/error-handling.md
new file mode 100644
index 0000000..7072fbd
--- /dev/null
+++ b/references/typescript/error-handling.md
@@ -0,0 +1,119 @@
+# TypeScript SDK Error Handling
+
+## Overview
+
+The TypeScript SDK uses `ApplicationFailure` for application errors, with support for marking failures as non-retryable.
+
+## Application Failures
+
+```typescript
+import { ApplicationFailure } from '@temporalio/workflow';
+
+export async function myWorkflow(): Promise<void> {
+  throw ApplicationFailure.create({
+    message: 'Invalid input',
+    type: 'ValidationError',
+    nonRetryable: true,
+  });
+}
+```
+
+## Activity Errors
+
+```typescript
+import { ApplicationFailure } from '@temporalio/activity';
+
+export async function validateActivity(input: string): Promise<void> {
+  if (!isValid(input)) {
+    throw ApplicationFailure.create({
+      message: `Invalid input: ${input}`,
+      type: 'ValidationError',
+      nonRetryable: true,
+    });
+  }
+}
+```
+
+## Handling Errors in Workflows
+
+Activity failures reach the workflow wrapped in an `ActivityFailure` whose `cause` is the underlying failure:
+
+```typescript
+import { proxyActivities, ActivityFailure, ApplicationFailure, log } from '@temporalio/workflow';
+import type * as activities from './activities';
+
+const { riskyActivity } = proxyActivities<typeof activities>({
+  startToCloseTimeout: '5 minutes',
+});
+
+export async function workflowWithErrorHandling(): Promise<string> {
+  try {
+    return await riskyActivity();
+  } catch (err) {
+    if (err instanceof ActivityFailure && err.cause instanceof ApplicationFailure) {
+      log.warn('Activity failed', { type: err.cause.type, message: err.cause.message });
+    }
+    throw err;
+  }
+}
+```
+
+## Retry Configuration
+
+```typescript
+const { myActivity } = proxyActivities<typeof activities>({
+  startToCloseTimeout: '10 minutes',
+  retry: {
+    initialInterval: '1s',
backoffCoefficient: 2,
+    maximumInterval: '1m',
+    maximumAttempts: 5,
+    nonRetryableErrorTypes: ['ValidationError', 'PaymentError'],
+  },
+});
+```
+
+**Note:** Only set retry options if you have a domain-specific reason to. The defaults are suitable for most use cases.
+
+## Timeout Configuration
+
+```typescript
+const { myActivity } = proxyActivities<typeof activities>({
+  startToCloseTimeout: '5 minutes', // Single attempt
+  scheduleToCloseTimeout: '30 minutes', // Including retries
+  heartbeatTimeout: '30 seconds', // Between heartbeats
+});
+```
+
+## Workflow Failure
+
+Workflows can throw errors to indicate failure:
+
+```typescript
+import { ApplicationFailure } from '@temporalio/workflow';
+
+export async function myWorkflow(): Promise<string> {
+  if (someCondition) {
+    throw ApplicationFailure.create({
+      message: 'Workflow failed due to invalid state',
+      type: 'InvalidStateError',
+    });
+  }
+  return 'success';
+}
+```
+
+**Warning:** Do NOT use `nonRetryable: true` for workflow failures in most cases. Unlike activities, workflow retries are controlled by the caller, not by retry policies. Use `nonRetryable` only for errors that are truly unrecoverable (e.g., invalid input that will never be valid).
+
+## Idempotency
+
+For idempotency patterns (using keys, making activities granular), see `core/patterns.md`.
+
+## Best Practices
+
+1. Use specific error types for different failure modes
+2. Set `nonRetryable: true` for permanent failures in activities
+3. Configure `nonRetryableErrorTypes` in the retry policy
+4. Log errors before re-throwing
+5. In workflows, inspect `ActivityFailure.cause` to identify the underlying `ApplicationFailure`
+6.
Use the appropriate `log` import for your context:
+   - In workflows: `import { log } from '@temporalio/workflow'` (replay-safe)
+   - In activities: `import { log } from '@temporalio/activity'`
diff --git a/references/typescript/gotchas.md b/references/typescript/gotchas.md
new file mode 100644
index 0000000..d234f74
--- /dev/null
+++ b/references/typescript/gotchas.md
@@ -0,0 +1,312 @@
+# TypeScript Gotchas
+
+TypeScript-specific mistakes and anti-patterns. See also [Common Gotchas](../core/gotchas.md) for language-agnostic concepts.
+
+## Activity Imports
+
+### Importing Implementations Instead of Types
+
+**The Problem**: Importing activity implementations brings Node.js code into the V8 workflow sandbox, causing bundling errors or runtime failures.
+
+```typescript
+// BAD - Brings actual code into workflow sandbox
+import * as activities from './activities';
+
+const { greet } = proxyActivities<typeof activities>({
+  startToCloseTimeout: '1 minute',
+});
+
+// GOOD - Type-only import
+import type * as activities from './activities';
+
+const { greet } = proxyActivities<typeof activities>({
+  startToCloseTimeout: '1 minute',
+});
+```
+
+### Importing Node.js Modules in Workflows
+
+```typescript
+// BAD - fs is not available in workflow sandbox
+import * as fs from 'fs';
+
+export async function myWorkflow(): Promise<void> {
+  const data = fs.readFileSync('file.txt'); // Will fail!
+}
+
+// GOOD - File I/O belongs in activities
+export async function myWorkflow(): Promise<void> {
+  const data = await activities.readFile('file.txt');
+}
+```
+
+## Bundling Issues
+
+### Using workflowsPath in Production
+
+`workflowsPath` runs the bundler at Worker startup, which is slow and not suitable for production. Use `workflowBundle` with pre-bundled code instead.
+
+```typescript
+// OK for development/testing, BAD for production - bundles at startup
+const worker = await Worker.create({
+  workflowsPath: require.resolve('./workflows'),
+  // ...
+}); + +// GOOD for production - use pre-bundled code +import { bundleWorkflowCode } from '@temporalio/worker'; + +// Build step (run once at build time) +const bundle = await bundleWorkflowCode({ + workflowsPath: require.resolve('./workflows'), +}); +await fs.promises.writeFile('./workflow-bundle.js', bundle.code); + +// Worker startup (fast, no bundling) +const worker = await Worker.create({ + workflowBundle: { + codePath: require.resolve('./workflow-bundle.js'), + }, + // ... +}); +``` + +### Missing Dependencies in Workflow Bundle + +```typescript +// If using external packages in workflows, ensure they're bundled + +// worker.ts +const worker = await Worker.create({ + workflowsPath: require.resolve('./workflows'), + bundlerOptions: { + // Exclude Node.js-only packages that cause bundling errors + // WARNING: Modules listed here will be completely unavailable + // at workflow runtime - any imports will fail + ignoreModules: ['some-node-only-package'], + }, +}); +``` + +### Package Version Mismatches + +All `@temporalio/*` packages must have the same version. This can be verified by running `npm ls` or the appropriate command for your package manager. + +### Package Version Constraints - Prod vs. Non-Prod + +For production apps, you should use ~ version constraints (bug fixes only) on Temporal packages. For non-production apps, you may use ^ constraints (the npm default) instead. + +## Wrong Retry Classification + +A common mistake is treating transient errors as permanent (or vice versa): + +- **Transient errors** (retry): network timeouts, temporary service unavailability, rate limits +- **Permanent errors** (don't retry): invalid input, authentication failure, resource not found + +```typescript +// BAD: Retrying a permanent error +throw ApplicationFailure.create({ message: 'User not found' }); +// This will retry indefinitely! 
+
+// GOOD: Mark permanent errors as non-retryable
+throw ApplicationFailure.nonRetryable('User not found');
+```
+
+For detailed guidance on error classification and retry policies, see `error-handling.md`.
+
+## Cancellation
+
+### Not Handling Workflow Cancellation
+
+```typescript
+// BAD - Cleanup doesn't run on cancellation
+export async function workflowWithCleanup(): Promise<void> {
+  await activities.acquireResource();
+  await activities.doWork();
+  await activities.releaseResource(); // Never runs if cancelled!
+}
+
+// GOOD - Use CancellationScope for cleanup
+import { CancellationScope } from '@temporalio/workflow';
+
+export async function workflowWithCleanup(): Promise<void> {
+  await activities.acquireResource();
+  try {
+    await activities.doWork();
+  } finally {
+    // Run cleanup even on cancellation
+    await CancellationScope.nonCancellable(async () => {
+      await activities.releaseResource();
+    });
+  }
+}
+```
+
+### Not Handling Activity Cancellation
+
+Activities must **opt in** to receive cancellation. This requires:
+1. **Heartbeating** - Cancellation is delivered via heartbeat
+2.
**Checking for cancellation** - Either await `Context.current().cancelled` or use `cancellationSignal()`
+
+```typescript
+// BAD - Activity ignores cancellation
+export async function longActivity(): Promise<void> {
+  await doExpensiveWork(); // Runs to completion even if cancelled
+}
+```
+
+```typescript
+// GOOD - Heartbeat in background and race work against cancellation promise
+import { Context, CancelledFailure } from '@temporalio/activity';
+
+export async function longActivity(): Promise<void> {
+  // Heartbeat in background so cancellation can be delivered
+  let heartbeatEnabled = true;
+  (async () => {
+    while (heartbeatEnabled) {
+      await Context.current().sleep(5000);
+      Context.current().heartbeat();
+    }
+  })().catch(() => {});
+
+  try {
+    await Promise.race([
+      Context.current().cancelled, // Rejects with CancelledFailure
+      doExpensiveWork(),
+    ]);
+  } catch (err) {
+    if (err instanceof CancelledFailure) {
+      await cleanup();
+    }
+    throw err;
+  } finally {
+    heartbeatEnabled = false;
+  }
+}
+```
+
+```typescript
+// GOOD - Use AbortSignal with libraries that support it
+import fetch from 'node-fetch';
+import { cancellationSignal, heartbeat } from '@temporalio/activity';
+import type { AbortSignal as FetchAbortSignal } from 'node-fetch/externals';
+
+export async function cancellableFetch(url: string): Promise<Buffer> {
+  const response = await fetch(url, { signal: cancellationSignal() as FetchAbortSignal });
+
+  const contentLength = parseInt(response.headers.get('Content-Length')!);
+  let bytesRead = 0;
+  const chunks: Buffer[] = [];
+
+  for await (const chunk of response.body) {
+    if (!(chunk instanceof Buffer)) throw new TypeError('Expected Buffer');
+    bytesRead += chunk.length;
+    chunks.push(chunk);
+    heartbeat(bytesRead / contentLength); // Heartbeat to keep cancellation delivery alive
+  }
+  return Buffer.concat(chunks);
+}
+```
+
+**Note:** `Promise.race` doesn't stop the losing promise—it continues running.
Use `cancellationSignal()` or explicitly abort sub-operations when cleanup requires stopping in-flight work.
+
+## Heartbeating
+
+### Forgetting to Heartbeat Long Activities
+
+```typescript
+// BAD - No heartbeat, can't detect stuck activities
+export async function processLargeFile(path: string): Promise<void> {
+  for await (const chunk of readChunks(path)) {
+    await processChunk(chunk); // Takes hours, no heartbeat
+  }
+}
+
+// GOOD - Regular heartbeats with progress
+import { heartbeat } from '@temporalio/activity';
+
+export async function processLargeFile(path: string): Promise<void> {
+  let i = 0;
+  for await (const chunk of readChunks(path)) {
+    heartbeat(`Processing chunk ${i++}`);
+    await processChunk(chunk);
+  }
+}
+```
+
+### Heartbeat Timeout Too Short
+
+```typescript
+// BAD - Heartbeat timeout shorter than processing time
+const { processChunk } = proxyActivities<typeof activities>({
+  startToCloseTimeout: '30 minutes',
+  heartbeatTimeout: '10 seconds', // Too short!
+});
+
+// GOOD - Heartbeat timeout allows for processing variance
+const { processChunk } = proxyActivities<typeof activities>({
+  startToCloseTimeout: '30 minutes',
+  heartbeatTimeout: '2 minutes',
+});
+```
+
+Set the heartbeat timeout as high as is acceptable for your use case — each heartbeat counts as an action.
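The resumability that heartbeats enable can be modeled in plain TypeScript: persist the index of the last completed item, and start from it on the next attempt. In this sketch the `details` store stands in for Temporal's persisted heartbeat details (it is not SDK code):

```typescript
// Conceptual sketch: `details` plays the role of Temporal's heartbeat
// details, surviving between attempts so a retry can resume part-way.
type HeartbeatStore = { lastIndex?: number };

async function processItems(
  items: string[],
  details: HeartbeatStore,
  crashAt?: number,
): Promise<string[]> {
  const done: string[] = [];
  for (let i = details.lastIndex ?? 0; i < items.length; i++) {
    if (crashAt === i) throw new Error(`worker crashed at item ${i}`);
    done.push(items[i].toUpperCase());
    details.lastIndex = i + 1; // analogous to heartbeat(i + 1)
  }
  return done;
}

async function demo(): Promise<string[]> {
  const store: HeartbeatStore = {};
  try {
    // First attempt crashes part-way through the batch
    await processItems(['a', 'b', 'c', 'd'], store, 2);
  } catch {
    // A real retry would be scheduled by the Temporal server
  }
  // The retry resumes from the checkpoint instead of reprocessing 'a', 'b'
  return processItems(['a', 'b', 'c', 'd'], store);
}
```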
+ +## Testing + +### Not Testing Failures + +```typescript +import { TestWorkflowEnvironment } from '@temporalio/testing'; +import { Worker } from '@temporalio/worker'; + +test('handles activity failure', async () => { + const env = await TestWorkflowEnvironment.createTimeSkipping(); + + const worker = await Worker.create({ + connection: env.nativeConnection, + taskQueue: 'test', + workflowsPath: require.resolve('./workflows'), + activities: { + // Activity that always fails + riskyOperation: async () => { + throw ApplicationFailure.nonRetryable('Simulated failure'); + }, + }, + }); + + await worker.runUntil(async () => { + await expect( + env.client.workflow.execute(riskyWorkflow, { + workflowId: 'test-failure', + taskQueue: 'test', + }) + ).rejects.toThrow('Simulated failure'); + }); + + await env.teardown(); +}); +``` + +### Not Testing Replay + +```typescript +import { Worker } from '@temporalio/worker'; +import * as fs from 'fs'; + +test('replay compatibility', async () => { + const history = JSON.parse(await fs.promises.readFile('./fixtures/workflow_history.json', 'utf8')); + + // Fails if current code is incompatible with history + await Worker.runReplayHistory( + { + workflowsPath: require.resolve('./workflows'), + }, + history, + ); +}); +``` + +## Timers and Sleep + +`setTimeout` works in workflows (the SDK mocks it), but `sleep()` from `@temporalio/workflow` is preferred because its interaction with cancellation scopes is more intuitive. See Timers in `references/typescript/patterns.md`. diff --git a/references/typescript/observability.md b/references/typescript/observability.md new file mode 100644 index 0000000..10244d7 --- /dev/null +++ b/references/typescript/observability.md @@ -0,0 +1,109 @@ +# TypeScript SDK Observability + +## Overview + +The TypeScript SDK provides replay-aware logging, metrics, and integrations for production observability. 
+
+## Replay-Aware Logging
+
+Temporal's logger automatically suppresses duplicate messages during replay, preventing log spam when workflows recover state.
+
+### Workflow Logging
+
+Workflows run in a sandboxed environment and cannot use regular Node.js loggers directly. Since SDK 1.8.0, the `@temporalio/workflow` package exports a `log` object that provides replay-aware logging. Internally, it uses Sinks to funnel messages to the Runtime's logger.
+
+```typescript
+import { log } from '@temporalio/workflow';
+
+export async function orderWorkflow(orderId: string): Promise<string> {
+  log.info('Processing order', { orderId });
+
+  const result = await processPayment(orderId);
+  log.debug('Payment processed', { orderId, result });
+
+  return result;
+}
+```
+
+**Log levels**: `log.debug()`, `log.info()`, `log.warn()`, `log.error()`
+
+The workflow logger automatically suppresses duplicate messages during replay and includes workflow context metadata (workflowId, runId, etc.) on every log entry.
+
+### Activity Logging
+
+```typescript
+import { log } from '@temporalio/activity';
+
+export async function processPayment(orderId: string): Promise<string> {
+  log.info('Processing payment', { orderId });
+  return 'payment-id-123';
+}
+```
+
+The activity logger adds contextual metadata (activity ID, type, namespace) and funnels messages to the runtime's logger for consistent collection.
+ +## Customizing the Logger + +### Basic Configuration + +```typescript +import { DefaultLogger, Runtime } from '@temporalio/worker'; + +const logger = new DefaultLogger('DEBUG', ({ level, message }) => { + console.log(`Custom logger: ${level} - ${message}`); +}); +Runtime.install({ logger }); +``` + +### Winston Integration + +```typescript +import winston from 'winston'; +import { DefaultLogger, Runtime } from '@temporalio/worker'; + +const winstonLogger = winston.createLogger({ + level: 'debug', + format: winston.format.json(), + transports: [ + new winston.transports.File({ filename: 'temporal.log' }) + ], +}); + +const logger = new DefaultLogger('DEBUG', (entry) => { + winstonLogger.log({ + label: entry.meta?.activityId ? 'activity' : entry.meta?.workflowId ? 'workflow' : 'worker', + level: entry.level.toLowerCase(), + message: entry.message, + timestamp: Number(entry.timestampNanos / 1_000_000n), + ...entry.meta, + }); +}); + +Runtime.install({ logger }); +``` + +## Metrics + +### Prometheus Metrics + +```typescript +import { Runtime } from '@temporalio/worker'; + +Runtime.install({ + telemetryOptions: { + metrics: { + prometheus: { + bindAddress: '127.0.0.1:9091', + }, + }, + }, +}); +``` + +## Best Practices + +1. Use `log` from `@temporalio/workflow` for production observability. For temporary print debugging, `console.log()` is fine—it's direct and immediate, whereas `log` goes through sinks which may lose messages on workflow errors +2. Include correlation IDs (orderId, customerId) in log messages +3. Configure Winston or similar for production log aggregation +4. Monitor Prometheus metrics for worker health +5. 
Use Event History for debugging workflow issues
diff --git a/references/typescript/patterns.md b/references/typescript/patterns.md
new file mode 100644
index 0000000..878f9f0
--- /dev/null
+++ b/references/typescript/patterns.md
@@ -0,0 +1,412 @@
+# TypeScript SDK Patterns
+
+## Signals
+
+```typescript
+import { defineSignal, setHandler, condition } from '@temporalio/workflow';
+
+const approveSignal = defineSignal<[boolean]>('approve');
+const addItemSignal = defineSignal<[string]>('addItem');
+
+export async function orderWorkflow(): Promise<string> {
+  let approved = false;
+  const items: string[] = [];
+
+  setHandler(approveSignal, (value) => {
+    approved = value;
+  });
+
+  setHandler(addItemSignal, (item) => {
+    items.push(item);
+  });
+
+  await condition(() => approved);
+  return `Processed ${items.length} items`;
+}
+```
+
+## Dynamic Signal Handlers
+
+For handling signals with names not known at compile time. Use cases for this pattern are rare — most workflows should use statically defined signal handlers.
+
+```typescript
+import { setDefaultSignalHandler, condition } from '@temporalio/workflow';
+
+export async function dynamicSignalWorkflow(): Promise<Record<string, unknown[][]>> {
+  const signals: Record<string, unknown[][]> = {};
+
+  setDefaultSignalHandler((signalName: string, ...args: unknown[]) => {
+    if (!signals[signalName]) {
+      signals[signalName] = [];
+    }
+    signals[signalName].push(args);
+  });
+
+  await condition(() => signals['done'] !== undefined);
+  return signals;
+}
+```
+
+## Queries
+
+**Important:** Queries must NOT modify workflow state or have side effects.
+
+```typescript
+import { defineQuery, setHandler } from '@temporalio/workflow';
+
+const statusQuery = defineQuery<string>('status');
+const progressQuery = defineQuery<number>('progress');
+
+export async function progressWorkflow(): Promise<void> {
+  let status = 'running';
+  let progress = 0;
+
+  setHandler(statusQuery, () => status);
+  setHandler(progressQuery, () => progress);
+
+  for (let i = 0; i < 100; i++) {
+    progress = i;
+    await doWork();
+  }
+  status = 'completed';
+}
+```
+
+## Dynamic Query Handlers
+
+For handling queries with names not known at compile time. Use cases for this pattern are rare — most workflows should use statically defined query handlers.
+
+```typescript
+import { setDefaultQueryHandler } from '@temporalio/workflow';
+
+export async function dynamicQueryWorkflow(): Promise<void> {
+  const state: Record<string, unknown> = {
+    status: 'running',
+    progress: 0,
+  };
+
+  setDefaultQueryHandler((queryName: string) => {
+    return state[queryName];
+  });
+
+  // ... workflow logic
+}
+```
+
+## Updates
+
+```typescript
+import { defineUpdate, setHandler, condition } from '@temporalio/workflow';
+
+// Define the update - specify return type and argument types
+export const addItemUpdate = defineUpdate<number, [string]>('addItem');
+export const addItemValidatedUpdate = defineUpdate<number, [string]>('addItemValidated');
+
+export async function orderWorkflow(): Promise<string> {
+  const items: string[] = [];
+  let completed = false; // set elsewhere, e.g. by a completion signal
+
+  // Simple update handler - returns new item count
+  setHandler(addItemUpdate, (item: string) => {
+    items.push(item);
+    return items.length;
+  });
+
+  // Update handler with validator - rejects invalid input before execution
+  setHandler(
+    addItemValidatedUpdate,
+    (item: string) => {
+      items.push(item);
+      return items.length;
+    },
+    {
+      validator: (item: string) => {
+        if (!item) throw new Error('Item cannot be empty');
+        if (items.length >= 100) throw new Error('Order is full');
+      },
+    }
+  );
+
+  await condition(() => completed);
+  return `Order with ${items.length} items completed`;
+}
+```
+
+## Child Workflows
+
+```typescript
+import { executeChild } from '@temporalio/workflow';
+
+export async function parentWorkflow(orders: Order[]): Promise<string[]> {
+  const results: string[] = [];
+
+  for (const order of orders) {
+    const result = await executeChild(processOrderWorkflow, {
+      args: [order],
+      workflowId: `order-${order.id}`,
+    });
+    results.push(result);
+  }
+
+  return results;
+}
+```
+
+### Child Workflow Options
+
+```typescript
+import { executeChild, ParentClosePolicy, ChildWorkflowCancellationType, workflowInfo } from '@temporalio/workflow';
+
+const result = await executeChild(childWorkflow, {
+  args: [input],
+  workflowId: `child-${workflowInfo().workflowId}`,
+
+  // ParentClosePolicy - what happens to child when parent closes
+  // TERMINATE (default), ABANDON, REQUEST_CANCEL
+  parentClosePolicy: ParentClosePolicy.TERMINATE,
+
+  // ChildWorkflowCancellationType - how cancellation is handled
+  // WAIT_CANCELLATION_COMPLETED (default), WAIT_CANCELLATION_REQUESTED, TRY_CANCEL, ABANDON
+  cancellationType: ChildWorkflowCancellationType.WAIT_CANCELLATION_COMPLETED,
+});
+```
+
+## Handles to External Workflows
+
+```typescript
+import { getExternalWorkflowHandle } from '@temporalio/workflow';
+import { mySignal } from './other-workflows';
+
+export async function coordinatorWorkflow(targetWorkflowId: string): Promise<void> {
+  const handle = getExternalWorkflowHandle(targetWorkflowId);
+
+  // Signal the external workflow
+  await handle.signal(mySignal, { data: 'payload' });
+
+  // Or cancel it
+  await handle.cancel();
+}
+```
+
+## Parallel Execution
+
+```typescript
+export async function parallelWorkflow(items: string[]): Promise<string[]> {
+  return await Promise.all(
+    items.map((item) => processItem(item))
+  );
+}
+```
+
+## Continue-as-New
+
+```typescript
+import { continueAsNew, workflowInfo } from '@temporalio/workflow';
+
+export async function longRunningWorkflow(state: State): Promise<string> {
+  while (true) {
+    state = await processNextBatch(state);
+
+    if
(state.isComplete) {
+      return 'done';
+    }
+
+    const info = workflowInfo();
+    if (info.continueAsNewSuggested || info.historyLength > 10000) {
+      await continueAsNew<typeof longRunningWorkflow>(state);
+    }
+  }
+}
+```
+
+## Saga Pattern
+
+**Important:** Compensation activities should be idempotent.
+
+```typescript
+import { log } from '@temporalio/workflow';
+
+export async function sagaWorkflow(order: Order): Promise<string> {
+  const compensations: Array<() => Promise<void>> = [];
+
+  try {
+    // IMPORTANT: Register the compensation BEFORE calling the activity.
+    // If the activity fails after completing its work but before reporting
+    // back, the compensation must still run (idempotency makes it safe to
+    // compensate work that never happened).
+    compensations.push(() => releaseInventory(order));
+    await reserveInventory(order);
+
+    compensations.push(() => refundPayment(order));
+    await chargePayment(order);
+
+    await shipOrder(order);
+    return 'Order completed';
+  } catch (err) {
+    for (const compensate of compensations.reverse()) {
+      try {
+        await compensate();
+      } catch (compErr) {
+        log.warn('Compensation failed', { error: compErr });
+      }
+    }
+    throw err;
+  }
+}
+```
+
+## Cancellation Scopes
+
+Cancellation scopes control how cancellation propagates to activities and child workflows. Use them for cleanup logic, timeouts, and manual cancellation.
+
+```typescript
+import { CancellationScope } from '@temporalio/workflow';
+
+export async function scopedWorkflow(): Promise<void> {
+  // Non-cancellable scope - runs even if workflow cancelled
+  await CancellationScope.nonCancellable(async () => {
+    await cleanupActivity();
+  });
+
+  // Timeout scope
+  await CancellationScope.withTimeout('5 minutes', async () => {
+    await longRunningActivity();
+  });
+
+  // Manual cancellation
+  const scope = new CancellationScope();
+  const promise = scope.run(() => someActivity());
+  scope.cancel();
+}
+```
+
+## Triggers (Promise-like Signals)
+
+**WHY**: Triggers provide a one-shot promise that resolves when a signal is received. Cleaner than condition() for single-value signals.
+
+**WHEN to use**:
+- Waiting for a single response (approval, completion notification)
+- Converting signal-based events into awaitable promises
+
+```typescript
+import { Trigger, setHandler } from '@temporalio/workflow';
+
+// approveSignal as defined in the Signals section above
+export async function triggerWorkflow(): Promise<string> {
+  const approvalTrigger = new Trigger<boolean>();
+
+  setHandler(approveSignal, (approved) => {
+    approvalTrigger.resolve(approved);
+  });
+
+  const approved = await approvalTrigger;
+  return approved ? 'Approved' : 'Rejected';
+}
+```
+
+## Wait Condition with Timeout
+
+```typescript
+import { condition, setHandler } from '@temporalio/workflow';
+
+export async function approvalWorkflow(): Promise<string> {
+  let approved = false;
+
+  setHandler(approveSignal, () => {
+    approved = true;
+  });
+
+  // Wait for approval with 24-hour timeout
+  const gotApproval = await condition(() => approved, '24 hours');
+
+  if (gotApproval) {
+    return 'approved';
+  } else {
+    return 'auto-rejected due to timeout';
+  }
+}
+```
+
+## Waiting for All Handlers to Finish
+
+Signal and update handlers should generally be non-async (avoid running activities from them). Otherwise, the workflow may complete before handlers finish their execution. However, making handlers non-async sometimes requires workarounds that add complexity.
+
+When async handlers are necessary, use `condition(allHandlersFinished)` at the end of your workflow (or before continue-as-new) to prevent completion until all pending handlers complete.
+
+```typescript
+import { condition, allHandlersFinished } from '@temporalio/workflow';
+
+export async function handlerAwareWorkflow(): Promise<string> {
+  // ... main workflow logic ...
+
+  // Before exiting, wait for all handlers to finish
+  await condition(allHandlersFinished);
+  return 'done';
+}
+```
+
+## Activity Heartbeat Details
+
+### WHY:
+- **Support activity cancellation** - Cancellations are delivered via heartbeat; activities that don't heartbeat won't know they've been cancelled
+- **Resume progress after worker failure** - Heartbeat details persist across retries
+
+### WHEN:
+- **Cancellable activities** - Any activity that should respond to cancellation
+- **Long-running activities** - Track progress for resumability
+- **Checkpointing** - Save progress periodically
+
+```typescript
+import { Context, heartbeat, activityInfo, CancelledFailure } from '@temporalio/activity';
+
+export async function processLargeFile(filePath: string): Promise<string> {
+  const info = activityInfo();
+  // Get heartbeat details from previous attempt (if any)
+  const startLine: number = (info.heartbeatDetails as number | undefined) ?? 0;
+
+  const lines = await readFileLines(filePath);
+
+  try {
+    for (let i = startLine; i < lines.length; i++) {
+      // Racing against the cancelled promise lets the loop observe
+      // cancellation, which is delivered in response to heartbeats
+      await Promise.race([Context.current().cancelled, processLine(lines[i])]);
+      // Heartbeat with progress; details persist for the next attempt
+      heartbeat(i + 1);
+    }
+    return 'completed';
+  } catch (e) {
+    if (e instanceof CancelledFailure) {
+      // Perform cleanup on cancellation
+      await cleanup();
+    }
+    throw e;
+  }
+}
+```
+
+## Timers
+
+```typescript
+import { sleep } from '@temporalio/workflow';
+
+export async function timerWorkflow(): Promise<string> {
+  await sleep('1 hour');
+  return 'Timer fired';
+}
+```
+
+## Local Activities
+
+**Purpose**: Reduce latency for short, lightweight operations by skipping the task queue. ONLY use these when necessary for performance. Do NOT use these by default, as they give up some of the durability and observability guarantees of regular activities.
+
+```typescript
+import { proxyLocalActivities } from '@temporalio/workflow';
+import type * as activities from './activities';
+
+const { quickLookup } = proxyLocalActivities<typeof activities>({
+  startToCloseTimeout: '5 seconds',
+});
+
+export async function localActivityWorkflow(): Promise<string> {
+  const result = await quickLookup('key');
+  return result;
+}
+```
diff --git a/references/typescript/testing.md b/references/typescript/testing.md
new file mode 100644
index 0000000..e945ed8
--- /dev/null
+++ b/references/typescript/testing.md
@@ -0,0 +1,222 @@
+# TypeScript SDK Testing
+
+## Overview
+
+The TypeScript SDK provides `TestWorkflowEnvironment` for testing workflows with time-skipping and activity mocking support. Use `createTimeSkipping()` for automatic time advancement when testing workflows with timers, or `createLocal()` for a full local server without time-skipping.
+
+**Note:** Prefer `createLocal()` for full-featured support. Only use `createTimeSkipping()` if you genuinely need time skipping to test your workflow.
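Time-skipping can be pictured as a virtual clock that jumps straight to the next pending timer instead of waiting in real time. A minimal standalone sketch of that idea (a toy model, not the time-skipping test server's implementation):

```typescript
// Toy virtual clock: firing a '24 hour' timer takes no wall-clock time,
// which is what lets time-skipping tests of long workflows finish instantly.
class VirtualClock {
  private now = 0;
  private timers: Array<{ at: number; resolve: () => void }> = [];

  sleep(ms: number): Promise<void> {
    return new Promise((resolve) => this.timers.push({ at: this.now + ms, resolve }));
  }

  // Advance virtual time to the earliest pending timer and fire it
  skipToNextTimer(): void {
    this.timers.sort((a, b) => a.at - b.at);
    const next = this.timers.shift();
    if (next) {
      this.now = next.at;
      next.resolve();
    }
  }

  currentTime(): number {
    return this.now;
  }
}

const clock = new VirtualClock();
const workflow = clock.sleep(24 * 60 * 60 * 1000).then(() => 'timer fired');
clock.skipToNextTimer(); // resolves the 24h timer immediately
```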
+ +## Test Environment Setup + +```typescript +import { TestWorkflowEnvironment } from '@temporalio/testing'; +import { Worker } from '@temporalio/worker'; + +describe('Workflow', () => { + let testEnv: TestWorkflowEnvironment; + + before(async () => { + testEnv = await TestWorkflowEnvironment.createLocal(); + }); + + after(async () => { + await testEnv?.teardown(); + }); + + it('runs workflow', async () => { + const { client, nativeConnection } = testEnv; + + const worker = await Worker.create({ + connection: nativeConnection, + taskQueue: 'test', + workflowsPath: require.resolve('./workflows'), + activities: require('./activities'), + }); + + await worker.runUntil(async () => { + const result = await client.workflow.execute(greetingWorkflow, { + taskQueue: 'test', + workflowId: 'test-workflow', + args: ['World'], + }); + expect(result).toEqual('Hello, World!'); + }); + }); +}); +``` + +## Activity Mocking + +```typescript +const worker = await Worker.create({ + connection: nativeConnection, + taskQueue: 'test', + workflowsPath: require.resolve('./workflows'), + activities: { + // Mock activity implementation + greet: async (name: string) => `Mocked: ${name}`, + }, +}); +``` + +## Testing Signals and Queries + +```typescript +import { defineQuery, defineSignal } from '@temporalio/workflow'; + +// Define query and signal (typically in a shared file) +const getStatusQuery = defineQuery('getStatus'); +const approveSignal = defineSignal('approve'); + +it('handles signals and queries', async () => { + await worker.runUntil(async () => { + const handle = await client.workflow.start(approvalWorkflow, { + taskQueue: 'test', + workflowId: 'approval-test', + }); + + // Query current state + const status = await handle.query(getStatusQuery); + expect(status).toEqual('pending'); + + // Send signal + await handle.signal(approveSignal); + + // Wait for completion + const result = await handle.result(); + expect(result).toEqual('Approved!'); + }); +}); +``` + +## Testing Failure 
Cases + +Test that workflows handle errors correctly: + +```typescript +import { TestWorkflowEnvironment } from '@temporalio/testing'; +import { Worker } from '@temporalio/worker'; +import { WorkflowFailedError } from '@temporalio/client'; +import assert from 'assert'; + +describe('Failure handling', () => { + let testEnv: TestWorkflowEnvironment; + + before(async () => { + testEnv = await TestWorkflowEnvironment.createLocal(); + }); + + after(async () => { + await testEnv?.teardown(); + }); + + it('handles activity failure', async () => { + const { client, nativeConnection } = testEnv; + + const worker = await Worker.create({ + connection: nativeConnection, + taskQueue: 'test', + workflowsPath: require.resolve('./workflows'), + activities: { + // Mock activity that always fails + myActivity: async () => { + throw new Error('Activity failed'); + }, + }, + }); + + await worker.runUntil(async () => { + try { + await client.workflow.execute(myWorkflow, { + workflowId: 'test-failure', + taskQueue: 'test', + }); + assert.fail('Expected workflow to fail'); + } catch (err) { + assert(err instanceof WorkflowFailedError); + } + }); + }); +}); +``` + +## Replay Testing + +```typescript +import { Worker } from '@temporalio/worker'; +import { Client, Connection } from '@temporalio/client'; +import fs from 'fs'; + +describe('Replay', () => { + it('replays workflow history from JSON file', async () => { + // Load history from a JSON file (exported from Web UI or Temporal CLI) + const filePath = './history_file.json'; + const history = JSON.parse(await fs.promises.readFile(filePath, 'utf8')); + + await Worker.runReplayHistory( + { + workflowsPath: require.resolve('./workflows'), + }, + history, + 'my-workflow-id' // Optional: provide workflowId if your workflow depends on it + ); + }); + + it('replays workflow history from server', async () => { + // Fetch history programmatically using the client + const connection = await Connection.connect({ address: 'localhost:7233' }); + 
const client = new Client({ connection, namespace: 'default' });
+    const handle = client.workflow.getHandle('my-workflow-id');
+    const history = await handle.fetchHistory();
+
+    await Worker.runReplayHistory(
+      {
+        workflowsPath: require.resolve('./workflows'),
+      },
+      history,
+      'my-workflow-id'
+    );
+  });
+});
+```
+
+## Activity Testing
+
+Test activities in isolation without running a workflow:
+
+```typescript
+import { MockActivityEnvironment } from '@temporalio/testing';
+import { CancelledFailure } from '@temporalio/activity';
+import { myActivity, longRunningActivity } from './activities';
+import assert from 'assert';
+
+describe('Activity tests', () => {
+  it('completes successfully', async () => {
+    const env = new MockActivityEnvironment();
+    const result = await env.run(myActivity, 'input');
+    assert.equal(result, 'expected output');
+  });
+
+  it('handles cancellation', async () => {
+    const env = new MockActivityEnvironment();
+    // Cancel the activity after a short delay
+    setTimeout(() => env.cancel(), 100);
+    try {
+      await env.run(longRunningActivity, 'input');
+      assert.fail('Expected cancellation');
+    } catch (err) {
+      assert(err instanceof CancelledFailure);
+    }
+  });
+});
+```
+
+**Note:** `MockActivityEnvironment` provides `heartbeat()` and cancellation support for testing activity behavior.
+
+## Best Practices
+
+1. Use time-skipping for workflows with timers
+2. Mock external dependencies in activities
+3. Test replay compatibility when changing workflow code
+4. Use unique workflow IDs per test
+5. Clean up test environment after tests
diff --git a/references/typescript/typescript.md b/references/typescript/typescript.md
new file mode 100644
index 0000000..9918ee7
--- /dev/null
+++ b/references/typescript/typescript.md
@@ -0,0 +1,172 @@
+# Temporal TypeScript SDK Reference
+
+## Overview
+
+The Temporal TypeScript SDK provides a modern Promise-based approach to building durable workflows.
Workflows are bundled and run in an isolated runtime with automatic replacements for determinism protection.
+
+**CRITICAL**: All `@temporalio/*` packages must have the same version number.
+
+## Understanding Replay
+
+Temporal workflows are durable through history replay. For details on how this works, see `references/core/determinism.md`.
+
+## Quick Start
+
+**Add Dependencies:** Install the Temporal SDK packages (use the package manager appropriate for your project):
+```bash
+npm install @temporalio/client @temporalio/worker @temporalio/workflow @temporalio/activity
+```
+
+Note: in production, it is strongly advised to use `~` version constraints, e.g. `npm install ... --save-prefix='~'` with npm.
+
+**activities.ts** - Activity definitions (separate file to distinguish workflow vs activity code):
+```typescript
+export async function greet(name: string): Promise<string> {
+  return `Hello, ${name}!`;
+}
+```
+
+**workflows.ts** - Workflow definition (use type-only imports for activities):
+```typescript
+import { proxyActivities } from '@temporalio/workflow';
+import type * as activities from './activities';
+
+const { greet } = proxyActivities<typeof activities>({
+  startToCloseTimeout: '1 minute',
+});
+
+export async function greetingWorkflow(name: string): Promise<string> {
+  return await greet(name);
+}
+```
+
+**worker.ts** - Worker setup (imports activities and workflows, runs indefinitely):
+```typescript
+import { Worker } from '@temporalio/worker';
+import * as activities from './activities';
+
+async function run() {
+  const worker = await Worker.create({
+    workflowsPath: require.resolve('./workflows'), // For production, use workflowBundle instead
+    activities,
+    taskQueue: 'greeting-queue',
+  });
+  await worker.run();
+}
+
+run().catch(console.error);
+```
+
+**Start the dev server:** Start `temporal server start-dev` in the background.
+
+**Start the worker:** Run `npx ts-node worker.ts` in the background.
+ +**client.ts** - Start a workflow execution: +```typescript +import { Client } from '@temporalio/client'; +import { greetingWorkflow } from './workflows'; +import { v4 as uuid } from 'uuid'; + +async function run() { + const client = new Client(); + + const result = await client.workflow.execute(greetingWorkflow, { + workflowId: uuid(), + taskQueue: 'greeting-queue', + args: ['my name'], + }); + + console.log(`Result: ${result}`); +} + +run().catch(console.error); +``` + +**Run the workflow:** Run `npx ts-node client.ts`. Should output: `Result: Hello, my name!`. + +## Key Concepts + +### Workflow Definition +- Async functions exported from workflow file +- Use `proxyActivities()` with type-only imports +- Use `defineSignal()`, `defineQuery()`, `defineUpdate()`, `setHandler()` for handlers + +### Activity Definition +- Regular async functions +- Can perform I/O, network calls, etc. +- Use `heartbeat()` for long operations + +### Worker Setup +- Use `Worker.create()` with `workflowsPath` (dev) or `workflowBundle` (production) - see `references/typescript/gotchas.md` +- Import activities directly (not via proxy) + +## File Organization Best Practice + +**Keep Workflow definitions in separate files from Activity definitions.** The TypeScript SDK bundles workflow files separately. Minimizing workflow file contents improves Worker startup time. 
+
+```
+my_temporal_app/
+├── workflows/
+│   └── greeting.ts       # Only Workflow functions
+├── activities/
+│   └── translate.ts      # Only Activity functions
+├── worker.ts             # Worker setup, imports both
+└── client.ts             # Client code to start workflows
+```
+
+**In the Workflow file, use type-only imports for activities:**
+```typescript
+// workflows/greeting.ts
+import { proxyActivities } from '@temporalio/workflow';
+import type * as activities from '../activities/translate';
+
+const { translate } = proxyActivities<typeof activities>({
+  startToCloseTimeout: '1 minute',
+});
+```
+
+## Determinism Rules
+
+The TypeScript SDK runs workflows in an isolated V8 sandbox.
+
+**Automatic replacements:**
+- `Math.random()` → deterministic seeded PRNG
+- `Date.now()` → deterministic workflow time
+- `setTimeout` → deterministic timer
+
+**Safe to use:**
+- `sleep()` from `@temporalio/workflow`
+- `condition()` for waiting
+- Standard JavaScript operations
+
+See `references/typescript/determinism.md` for detailed rules.
+
+## Common Pitfalls
+
+1. **Importing activities without `type`** - Use `import type * as activities`
+2. **Version mismatch** - All @temporalio packages must match
+3. **Direct I/O in workflows** - Use activities for external calls
+4. **Missing `proxyActivities`** - Required to call activities from workflows
+5. **Forgetting to bundle workflows** - Worker needs `workflowsPath` or `workflowBundle`
+6. **Using workflowsPath in production** - Use `workflowBundle` for production (see `references/typescript/gotchas.md`)
+7. **Forgetting to heartbeat** - Long-running activities need `heartbeat()` calls
+8. **Logging in workflows** - For observability, use `import { log } from '@temporalio/workflow'` (routes through sinks). For temporary print debugging, `console.log()` is fine: it's direct and immediate, whereas `log` may lose messages on workflow errors.
+9.
**Forgetting to wait on activity calls** - Activity calls return Promises; you must eventually await them (directly or via `Promise.all()` for parallel execution) + +## Writing Tests + +See `references/typescript/testing.md` for info on writing tests. + +## Additional Resources + +### Reference Files +- **`references/typescript/patterns.md`** - Signals, queries, child workflows, saga pattern, etc. +- **`references/typescript/determinism.md`** - Essentials of determinism in TypeScript +- **`references/typescript/gotchas.md`** - TypeScript-specific mistakes and anti-patterns +- **`references/typescript/error-handling.md`** - ApplicationFailure, retry policies, non-retryable errors +- **`references/typescript/observability.md`** - Logging, metrics, tracing +- **`references/typescript/testing.md`** - TestWorkflowEnvironment, time-skipping, activity mocking +- **`references/typescript/advanced-features.md`** - Schedules, worker tuning, and more +- **`references/typescript/data-handling.md`** - Data converters, payload encryption, etc. +- **`references/typescript/versioning.md`** - Patching API, workflow type versioning, Worker Versioning +- **`references/typescript/determinism-protection.md`** - V8 sandbox and bundling diff --git a/references/typescript/versioning.md b/references/typescript/versioning.md new file mode 100644 index 0000000..a9f57a2 --- /dev/null +++ b/references/typescript/versioning.md @@ -0,0 +1,211 @@ +# TypeScript SDK Versioning + +For conceptual overview and guidance on choosing an approach, see `references/core/versioning.md`. + +## Patching API + +The Patching API lets you change Workflow Definitions without causing non-deterministic behavior in running Workflows. 
+
+### The patched() Function
+
+The `patched()` function takes a `patchId` string and returns a boolean:
+
+```typescript
+import { patched } from '@temporalio/workflow';
+
+export async function myWorkflow(): Promise<void> {
+  if (patched('my-change-id')) {
+    // New code path
+    await newImplementation();
+  } else {
+    // Old code path (for replay of existing executions)
+    await oldImplementation();
+  }
+}
+```
+
+**How it works:**
+- If the Workflow is running for the first time, `patched()` returns `true` and inserts a marker into the Event History
+- During replay, if the history contains a marker with the same `patchId`, `patched()` returns `true`
+- During replay, if no matching marker exists, `patched()` returns `false`
+
+**TypeScript-specific behavior:** Unlike Python/.NET/Ruby, `patched()` is not memoized when it returns `false`. This means you can use `patched()` in loops. However, if a single patch requires coordinated behavioral changes at different points in your workflow, you may need to manually memoize the result:
+
+```typescript
+const useNewBehavior = patched('my-change');
+// Use useNewBehavior at multiple points in workflow
+```
+
+### Three-Step Patching Process
+
+Patching is a three-step process for safely deploying changes.
+
+**Warning:** Failing to follow this process correctly will result in non-determinism errors for in-flight workflows.
+
+#### Step 1: Patch in New Code
+
+Add the patch alongside the old code:
+
+```typescript
+import { patched, sleep } from '@temporalio/workflow';
+
+// Original code sent fax notifications
+export async function shippingConfirmation(): Promise<void> {
+  if (patched('changedNotificationType')) {
+    await sendEmail(); // New code
+  } else {
+    await sendFax(); // Old code for replay
+  }
+  await sleep('1 day');
+}
+```
+
+#### Step 2: Deprecate the Patch
+
+Once all Workflows using the old code have completed, deprecate the patch:
+
+```typescript
+import { deprecatePatch, sleep } from '@temporalio/workflow';
+
+export async function shippingConfirmation(): Promise<void> {
+  deprecatePatch('changedNotificationType');
+  await sendEmail();
+  await sleep('1 day');
+}
```
+
+The `deprecatePatch()` function records a marker while allowing replay to succeed for histories that do not contain it, providing a transition period.
+
+#### Step 3: Remove the Patch
+
+After all Workflows using `deprecatePatch` have completed, remove it entirely:
+
+```typescript
+import { sleep } from '@temporalio/workflow';
+
+export async function shippingConfirmation(): Promise<void> {
+  await sendEmail();
+  await sleep('1 day');
+}
+```
+
+### Query Filters for Versioned Workflows
+
+Use List Filters to find Workflows by version:
+
+```
+# Find running Workflows with a specific patch
+WorkflowType = "shippingConfirmation" AND ExecutionStatus = "Running" AND TemporalChangeVersion = "changedNotificationType"
+
+# Find running Workflows without the patch (started before patching)
+WorkflowType = "shippingConfirmation" AND ExecutionStatus = "Running" AND TemporalChangeVersion IS NULL
+```
+
+## Workflow Type Versioning
+
+An alternative to patching is creating new Workflow functions for incompatible changes:
+
+```typescript
+// Original Workflow
+export async function pizzaWorkflow(order: PizzaOrder): Promise<string> {
+  // Original implementation
+}
+
+// New version with incompatible changes
+export async function pizzaWorkflowV2(order: PizzaOrder): Promise<string> {
+  // Updated implementation
+}
+``` + +Register both Workflows with the Worker: + +```typescript +const worker = await Worker.create({ + workflowsPath: require.resolve('./workflows'), // Use workflowBundle for production + taskQueue: 'pizza-queue', +}); +``` + +Update client code to start new Workflows with the new type: + +```typescript +// Start new executions with V2 +await client.workflow.start(pizzaWorkflowV2, { + workflowId: 'order-123', + taskQueue: 'pizza-queue', + args: [order], +}); +``` + +Use List Filters to check for remaining V1 executions: + +``` +WorkflowType = "pizzaWorkflow" AND ExecutionStatus = "Running" +``` + +After all V1 executions complete, remove the old Workflow function. + +## Worker Versioning + +Worker Versioning allows multiple Worker versions to run simultaneously, routing Workflows to specific versions without code-level patching. Workflows are pinned to the Worker Deployment Version they started on. + +> **Note:** Worker Versioning is currently in Public Preview. The legacy Worker Versioning API (before 2025) will be removed from Temporal Server in March 2026. + +### Key Concepts + +- **Worker Deployment**: A logical name for your application (e.g., "order-service") +- **Worker Deployment Version**: A specific build identified by deployment name + Build ID +- **Workflow Pinning**: Workflows complete on the Worker Deployment Version they started on + +### Configuring Workers for Versioning + +```typescript +import { Worker, NativeConnection } from '@temporalio/worker'; + +const worker = await Worker.create({ + workflowsPath: require.resolve('./workflows'), // Use workflowBundle for production + taskQueue: 'my-queue', + connection: await NativeConnection.connect({ address: 'temporal:7233' }), + workerDeploymentOptions: { + useWorkerVersioning: true, + version: { + deploymentName: 'order-service', + buildId: '1.0.0', // Git hash, semver, build number, etc. 
+ }, + }, +}); +``` + +**Configuration options:** +- `useWorkerVersioning`: Enables Worker Versioning +- `version.deploymentName`: Logical name for your service (consistent across versions) +- `version.buildId`: Unique identifier for this build + +### Deployment Workflow + +1. Deploy new Worker version with a new `buildId` +2. Use the Temporal CLI to set the new version as current: + ```bash + temporal worker deployment set-current-version \ + --deployment-name order-service \ + --build-id 2.0.0 + ``` +3. New Workflows start on the new version +4. Existing Workflows continue on their original version until completion +5. Decommission old Workers once all their Workflows complete + +### When to Use Worker Versioning + +Worker Versioning is best suited for: +- **Short-running Workflows**: Old Workers only need to run briefly during deployment transitions +- **Frequent deployments**: Eliminates the need for code-level patching on every change +- **Blue-green deployments**: Run old and new versions simultaneously with traffic control + +For long-running Workflows, consider combining Worker Versioning with the Patching API, or use Continue-as-New to move Workflows to newer versions. + +## Best Practices + +1. Use descriptive `patchId` names that explain the change +2. Follow the three-step patching process completely before removing patches +3. Use List Filters to verify no running Workflows before removing version support +4. Keep Worker Deployment names consistent across all versions +5. Use unique, traceable Build IDs (git hashes, semver, timestamps) +6. Test version transitions with replay tests before deploying
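+
+Best practice 6 depends on having Event Histories to replay. As a sketch (assuming the modern `temporal` CLI, a reachable server, and illustrative workflow ID and file names), a history can be exported to JSON for use with `Worker.runReplayHistory` in a replay test:
+
+```shell
+# Export the Event History of an existing execution to JSON
+# (requires a running Temporal server; the workflow ID and file name are placeholders)
+temporal workflow show \
+  --workflow-id my-workflow-id \
+  --output json > history_file.json
+```
+
+Run this against executions started on the old code before deploying a patch or a new Worker Deployment Version; replaying the exported histories against the new workflow code (see `references/typescript/testing.md`) surfaces non-determinism errors before they reach production.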