
Commit 766f888

donald-pinckney, mjameswh, and chris-olszewski authored
Add TypeScript (#31)
Adds initial support for TypeScript to the skill

---------

Co-authored-by: James Watkins-Harvey <mjameswh@users.noreply.github.com>
Co-authored-by: Chris Olszewski <chrisdolszewski@gmail.com>
1 parent 87fe92e commit 766f888

21 files changed

Lines changed: 2320 additions & 52 deletions

SKILL.md

Lines changed: 4 additions & 3 deletions
@@ -1,14 +1,14 @@
 ---
 name: temporal-developer
-description: This skill should be used when the user asks to "create a Temporal workflow", "write a Temporal activity", "debug stuck workflow", "fix non-determinism error", "Temporal Python", "workflow replay", "activity timeout", "signal workflow", "query workflow", "worker not starting", "activity keeps retrying", "Temporal heartbeat", "continue-as-new", "child workflow", "saga pattern", "workflow versioning", "durable execution", "reliable distributed systems", or mentions Temporal SDK development.
+description: This skill should be used when the user asks to "create a Temporal workflow", "write a Temporal activity", "debug stuck workflow", "fix non-determinism error", "Temporal Python", "Temporal TypeScript", "workflow replay", "activity timeout", "signal workflow", "query workflow", "worker not starting", "activity keeps retrying", "Temporal heartbeat", "continue-as-new", "child workflow", "saga pattern", "workflow versioning", "durable execution", "reliable distributed systems", or mentions Temporal SDK development.
 version: 1.0.0
 ---
 
 # Skill: temporal-developer
 
 ## Overview
 
-Temporal is a durable execution platform that makes workflows survive failures automatically. This skill provides guidance for building Temporal applications in Python.
+Temporal is a durable execution platform that makes workflows survive failures automatically. This skill provides guidance for building Temporal applications in Python and TypeScript.
 
 ## Core Architecture
 
@@ -91,6 +91,7 @@ Once you've downloaded the file, extract the downloaded archive and add the temp
 
 1. First, read the getting started guide for the language you are working in:
    - Python -> read `references/python/python.md`
+   - TypeScript -> read `references/typescript/typescript.md`
 2. Second, read appropriate `core` and language-specific references for the task at hand.
 
 
@@ -108,7 +109,7 @@ Once you've downloaded the file, extract the downloaded archive and add the temp
 - **`references/core/interactive-workflows.md`** - Testing signals, updates, queries
 - **`references/core/dev-management.md`** - Dev cycle & management of server and workers
 - **`references/core/ai-patterns.md`** - AI/LLM pattern concepts
-  + Langauge-specific info at `references/{your_language}/determinism.md`
+  + Language-specific info at `references/{your_language}/determinism.md`, if available. Currently Python only.
 
 ## Additional Topics
 - **`references/{your_langauge}/observability.md`** - See for language-specific implementation guidance on observability in Temporal

references/core/determinism.md

Lines changed: 1 addition & 0 deletions
@@ -79,6 +79,7 @@ For a few simple cases, like timestamps, random values, UUIDs, etc. the Temporal
 Each Temporal SDK language provides a protection mechanism to make it easier to catch non-determinism errors earlier in development:
 
 - Python: The Python SDK runs workflows in a sandbox that intercepts and aborts non-deterministic calls at runtime.
+- TypeScript: The TypeScript SDK runs workflows in an isolated V8 sandbox, intercepting many common sources of non-determinism and replacing them automatically with deterministic variants.
 
 
 ## Detecting Non-Determinism
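The replay-safety idea behind both sandboxes can be illustrated outside Temporal. The sketch below is hypothetical plain Python, not SDK code: a step that reads the wall clock directly produces different values on every replay, while a step that derives its output from recorded workflow state is replay-stable.

```python
import datetime

# Hypothetical illustration (not the Temporal API): replay is only safe
# when the same recorded inputs always yield the same outputs.

def bad_step() -> str:
    # Non-deterministic: different on every replay; this is the kind of
    # call the SDK sandboxes intercept.
    return datetime.datetime.now().isoformat()

def good_step(workflow_time: datetime.datetime) -> str:
    # Deterministic: the value comes from workflow history, so replaying
    # the same history reproduces the same result.
    return workflow_time.isoformat()
```

In real workflow code the SDK-provided equivalents (e.g. `workflow.now()` in the Python SDK) play the role of `good_step`'s recorded input.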

references/core/gotchas.md

Lines changed: 44 additions & 0 deletions
@@ -150,3 +150,47 @@ See language-specific gotchas for details.
 **The Fix**:
 - **Retryable**: Network errors, timeouts, rate limits, temporary unavailability
 - **Non-retryable**: Invalid input, authentication failures, business rule violations, resource not found
+
+## Cancellation Handling
+
+### Not Handling Workflow Cancellation
+
+**The Problem**: When a workflow is cancelled, cleanup code after the cancellation point doesn't run unless explicitly protected.
+
+**Symptoms**:
+- Resources not released after cancellation
+- Incomplete compensation/rollback
+- Leaked state
+
+**The Fix**: Use language-specific cancellation scopes or try/finally blocks to ensure cleanup runs even on cancellation. See language-specific gotchas for implementation details.
+
+### Not Handling Activity Cancellation
+
+**The Problem**: Activities must opt in to receive cancellation. Without proper handling, a cancelled activity continues running to completion, wasting resources.
+
+**Requirements for activity cancellation**:
+1. **Heartbeating** - Cancellation is delivered via heartbeat. Activities that don't heartbeat won't know they've been cancelled.
+2. **Checking for cancellation** - Activity must explicitly check for cancellation or await a cancellation signal.
+
+**Symptoms**:
+- Cancelled activities running to completion
+- Wasted compute on work that will be discarded
+- Delayed workflow cancellation
+
+**The Fix**: Heartbeat regularly and check for cancellation. See language-specific gotchas for implementation patterns.
+
+## Payload Size Limits
+
+**The Problem**: Temporal has built-in limits on payload sizes. Exceeding them causes workflows to fail.
+
+**Limits**:
+- Max 2MB per individual payload
+- Max 4MB per gRPC message
+- Max 50MB for complete workflow history (aim for <10MB in practice)
+
+**Symptoms**:
+- Payload too large errors
+- gRPC message size exceeded errors
+- Workflow history growing unboundedly
+
+**The Fix**: Store large data externally (S3/GCS) and pass references, use compression codecs, or chunk data across multiple activities. See the Large Data Handling pattern in `references/core/patterns.md`.

references/core/patterns.md

Lines changed: 74 additions & 0 deletions
@@ -331,6 +331,80 @@ Run:
 
 This ensures that on replay, already-completed steps are skipped.
 
+## Large Data Handling
+
+**Purpose**: Handle data that exceeds Temporal's payload limits without polluting workflow history.
+
+**Limits** (see `references/core/gotchas.md` for details):
+- Max 2MB per individual payload
+- Max 4MB per gRPC message
+- Max 50MB for workflow history (aim for <10MB)
+
+**Key Principle**: Large data should never flow through workflow history. Activities read and write large data directly, passing only small references through the workflow.
+
+**Wrong Approach**:
+```
+Workflow
+
+  ├── downloadFromStorage(ref) ──▶ returns large data (enters history)
+
+  ├── processData(largeData) ────▶ large data as argument (enters history AGAIN)
+
+  └── uploadToStorage(result) ───▶ large data as argument (enters history AGAIN)
+```
+
+This defeats the purpose—large data enters workflow history multiple times.
+
+**Correct Approach**:
+```
+Workflow
+
+  └── processLargeData(inputRef) ──▶ returns outputRef (small string)
+
+        └── Activity internally:
+              download(inputRef) → process → upload → return outputRef
+```
+
+The workflow only handles references (small strings). The activity does all large data operations internally.
+
+**Implementation Pattern**:
+1. Accept a reference (URL, S3 key, database ID) as activity input
+2. Download/fetch the large data inside the activity
+3. Process the data inside the activity
+4. Upload/store the result inside the activity
+5. Return only a reference to the result
+
+**Other Strategies**:
+- **Compression**: Use a PayloadCodec to compress data automatically
+- **Chunking**: Split large collections across multiple activities, each handling a subset
+
+## Activity Heartbeating
+
+**Purpose**: Enable cancellation delivery and progress tracking for long-running activities.
+
+**Why Heartbeat**:
+1. **Support activity cancellation** - Cancellations are delivered to activities via heartbeat. Activities that don't heartbeat won't know they've been cancelled.
+2. **Resume progress after failure** - Heartbeat details persist across retries, allowing activities to resume where they left off.
+3. **Detect stuck activities** - If an activity stops heartbeating, Temporal can time it out and retry.
+
+**How Cancellation Works**:
+```
+Workflow requests activity cancellation
+        │
+        ▼
+Temporal Service marks activity for cancellation
+        │
+        ▼
+Activity calls heartbeat()
+
+  ├── Not cancelled: heartbeat succeeds, continues
+
+  └── Cancelled: heartbeat raises exception
+        Activity can catch this to perform cleanup
+```
+
+**Key Point**: If an activity never heartbeats, it will run to completion even if cancelled—it has no way to learn about the cancellation.
+
 ## Local Activities
 
 **Purpose**: Reduce latency for short, lightweight operations by skipping the task queue. ONLY use these when necessary for performance. Do NOT use these by default, as they are not durable and distributed.
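The resume-from-heartbeat behavior described in the Activity Heartbeating hunk can be sketched without a Temporal server. This is plain Python with illustrative names: `heartbeat` stands in for `activity.heartbeat(...)`, and `heartbeat_details` for the details Temporal hands back to a retried attempt.

```python
# Hypothetical sketch: a retried activity resumes from the last checkpoint
# recorded via heartbeat, instead of redoing completed work.
def process_items(items, heartbeat_details, heartbeat):
    # On a retry, Temporal delivers the previous attempt's heartbeat details.
    start = heartbeat_details[0] if heartbeat_details else 0
    processed = []
    for i in range(start, len(items)):
        processed.append(items[i].upper())  # stands in for the real work
        heartbeat(i + 1)  # checkpoint progress; also the cancellation channel
    return processed
```

In real SDK code the heartbeat call is also where cancellation surfaces: if the activity has been cancelled, the next heartbeat raises, which is why a non-heartbeating activity never learns it was cancelled.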

references/core/versioning.md

Lines changed: 10 additions & 6 deletions
@@ -53,17 +53,21 @@ else:
 
 ### When to Use
 
-- Adding new activities or steps
-- Changing activity parameters
-- Reordering operations
-- Any change that would cause non-determinism
+- Adding, removing, or reordering activities/child workflows
+- Changing which activity/child workflow is called
+- Any change that alters the Command sequence
 
 ### When NOT to Use
 
-- Changes to activity implementations (activities aren't replayed)
-- Adding new signal/query handlers (additive changes are safe)
+- Changing activity implementations (activities aren't replayed)
+- Changing arguments passed to activities or child workflows
+- Changing retry policies
+- Changing timer durations
+- Adding new signal/query/update handlers (additive changes are safe)
 - Bug fixes that don't change Command sequence
 
+Unnecessary patching adds complexity and can make workflow code unmanageable.
+
 ## Approach 2: Workflow Type Versioning
 
 ### Concept
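The revised "When to Use" criterion above hinges on whether a change alters the Command sequence. That test can be illustrated with plain Python (not Temporal API; `command_sequence` is an invented name standing in for the commands a workflow would schedule):

```python
# Hypothetical sketch: the Command sequence is the ordered list of
# operations a workflow schedules. Replay breaks only when a new version
# would emit a different sequence than the one recorded in history.
def command_sequence(activity_names):
    return [("ScheduleActivityTask", name) for name in activity_names]

v1 = command_sequence(["validate", "charge", "ship"])
v2 = command_sequence(["validate", "ship", "charge"])   # reordered: needs patching
v3 = command_sequence(["validate", "charge", "ship"])   # same sequence: no patch needed
```

Changing what `charge` does internally, its arguments, or its retry policy leaves this sequence untouched, which is why those changes appear under "When NOT to Use".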

references/python/data-handling.md

Lines changed: 3 additions & 27 deletions
@@ -202,29 +202,6 @@ class OrderWorkflow:
         ...
 ```
 
-## Large Payloads
-
-For large data, consider:
-
-1. **Store externally**: Put large data in S3/GCS, pass references in workflows
-2. **Use Payload Codec**: Compress payloads automatically
-3. **Chunk data**: Split large lists across multiple activities
-
-```python
-# Example: Reference pattern for large data
-@activity.defn
-async def upload_to_storage(data: bytes) -> str:
-    """Upload data and return reference."""
-    key = f"data/{uuid.uuid4()}"
-    await storage_client.upload(key, data)
-    return key
-
-@activity.defn
-async def download_from_storage(key: str) -> bytes:
-    """Download data by reference."""
-    return await storage_client.download(key)
-```
-
 ## Deterministic APIs for Values
 
 Use these APIs within workflows for deterministic random values and UUIDs:
@@ -247,8 +224,7 @@ class MyWorkflow:
 ## Best Practices
 
 1. Use Pydantic for input/output validation
-2. Keep payloads small (< 2MB recommended)
+2. Keep payloads small—see `references/core/gotchas.md` for limits
 3. Encrypt sensitive data with PayloadCodec
-4. Store large data externally with references
-5. Use dataclasses for simple data structures
-6. Use `workflow.uuid4()` and `workflow.random()` for deterministic values
+4. Use dataclasses for simple data structures
+5. Use `workflow.uuid4()` and `workflow.random()` for deterministic values
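Why `workflow.uuid4()` and `workflow.random()` are replay-safe can be shown with a conceptual sketch. This is not the SDK's implementation; it only demonstrates the principle that values drawn from a deterministically seeded source are reproduced identically on replay (`replay_safe_uuid` and `run_seed` are invented names).

```python
import random
import uuid

# Conceptual sketch (not temporalio internals): a seed fixed per workflow
# run makes "random" values reproducible across replays of that run.
def replay_safe_uuid(run_seed: int) -> uuid.UUID:
    rng = random.Random(run_seed)  # hypothetically derived from the run
    return uuid.UUID(int=rng.getrandbits(128), version=4)
```

Calling plain `uuid.uuid4()` inside a workflow would instead yield a new value on every replay, which is exactly the non-determinism the sandbox guards against.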

references/python/gotchas.md

Lines changed: 77 additions & 0 deletions
@@ -161,6 +161,83 @@ await workflow.execute_activity(
 )
 ```
 
+Set heartbeat timeout as high as acceptable for your use case — each heartbeat counts as an action.
+
+## Cancellation
+
+### Not Handling Workflow Cancellation
+
+```python
+# BAD - Cleanup doesn't run on cancellation
+@workflow.defn
+class BadWorkflow:
+    @workflow.run
+    async def run(self) -> None:
+        await workflow.execute_activity(
+            acquire_resource,
+            start_to_close_timeout=timedelta(minutes=5),
+        )
+        await workflow.execute_activity(
+            do_work,
+            start_to_close_timeout=timedelta(minutes=5),
+        )
+        await workflow.execute_activity(
+            release_resource,  # Never runs if cancelled!
+            start_to_close_timeout=timedelta(minutes=5),
+        )
+
+# GOOD - Use try/finally for cleanup
+@workflow.defn
+class GoodWorkflow:
+    @workflow.run
+    async def run(self) -> None:
+        await workflow.execute_activity(
+            acquire_resource,
+            start_to_close_timeout=timedelta(minutes=5),
+        )
+        try:
+            await workflow.execute_activity(
+                do_work,
+                start_to_close_timeout=timedelta(minutes=5),
+            )
+        finally:
+            # Runs even on cancellation
+            await workflow.execute_activity(
+                release_resource,
+                start_to_close_timeout=timedelta(minutes=5),
+            )
+```
+
+### Not Handling Activity Cancellation
+
+Activities must **opt in** to receive cancellation. This requires:
+1. **Heartbeating** - Cancellation is delivered via heartbeat
+2. **Catching the cancellation exception** - Exception is raised when heartbeat detects cancellation
+
+**Cancellation exceptions:**
+- Async activities: `asyncio.CancelledError`
+- Sync threaded activities: `temporalio.exceptions.CancelledError`
+
+```python
+# BAD - Activity ignores cancellation
+@activity.defn
+async def long_activity() -> None:
+    await do_expensive_work()  # Runs to completion even if cancelled
+```
+
+```python
+# GOOD - Heartbeat and catch cancellation
+@activity.defn
+async def long_activity() -> None:
+    try:
+        for item in items:
+            activity.heartbeat()
+            await process(item)
+    except asyncio.CancelledError:
+        await cleanup()
+        raise
+```
+
 ## Testing
 
 ### Not Testing Failures
