Merged
Commits
50 commits
5292b3a
Add back in TypeScript directory - not yet edited at all.
donald-pinckney Feb 23, 2026
24a3eaa
Revert "Remove TypeScript hints"
donald-pinckney Feb 23, 2026
225b628
work through typescript determinism
donald-pinckney Feb 25, 2026
4792946
improve cross-references
donald-pinckney Feb 25, 2026
e8a3bdd
align with python patterns.
donald-pinckney Feb 25, 2026
1b207f8
bulk alignment analysis + editing prep
donald-pinckney Feb 25, 2026
41556ed
bulk apply edits
donald-pinckney Feb 25, 2026
ed9d2fc
update tracking file
donald-pinckney Feb 26, 2026
2dd0711
clean up tmp files
donald-pinckney Feb 26, 2026
51b8133
more tmp file cleanup and reorg
donald-pinckney Feb 26, 2026
52126fe
x
donald-pinckney Feb 26, 2026
3f8b790
Various edits to TypeScript
donald-pinckney Feb 27, 2026
e0aec97
versioning cleanup
donald-pinckney Feb 27, 2026
6455a09
revert python files back to dev
donald-pinckney Feb 27, 2026
c64b0d5
fix link
donald-pinckney Feb 27, 2026
b3de6c7
Typo fix
donald-pinckney Feb 27, 2026
37afc11
Update references/typescript/typescript.md
donald-pinckney Mar 3, 2026
e19b37d
Update references/typescript/typescript.md
donald-pinckney Mar 3, 2026
c17e9e9
remove otel for now
donald-pinckney Mar 3, 2026
a53bec5
fix console.log
donald-pinckney Mar 3, 2026
18686b9
remove more otel
donald-pinckney Mar 3, 2026
12601e1
package manager agnostic
donald-pinckney Mar 3, 2026
2d3e66f
improve bundleing
donald-pinckney Mar 3, 2026
da15fef
cut uuid section
donald-pinckney Mar 3, 2026
6dc6493
oops, restore python
donald-pinckney Mar 3, 2026
e5e6caa
Clean up Date.now()
donald-pinckney Mar 3, 2026
366792d
Fix "bad" workflow thats not bad at all
donald-pinckney Mar 3, 2026
aa40658
soften sleep
donald-pinckney Mar 3, 2026
8a627f9
improve timestamp language
donald-pinckney Mar 3, 2026
8678055
Caveat protobuf
donald-pinckney Mar 4, 2026
1dc5220
Add size limits
donald-pinckney Mar 4, 2026
cf15c53
re-work large payload handling
donald-pinckney Mar 4, 2026
b11b28f
don't always re-throw
donald-pinckney Mar 4, 2026
81bbcc6
improve logs
donald-pinckney Mar 4, 2026
2d10075
Add missing await gotcha
donald-pinckney Mar 4, 2026
b59282f
caveat dynamic singnal handlers
donald-pinckney Mar 4, 2026
340bca3
caveat dynamic query handlers
donald-pinckney Mar 4, 2026
21c3b86
warning on patching process
donald-pinckney Mar 4, 2026
56e5bc4
fix activity logger rationale
donald-pinckney Mar 4, 2026
fe82cb9
Simplify timers
donald-pinckney Mar 4, 2026
6841d58
Heartbeat info
donald-pinckney Mar 4, 2026
08b6957
fix wrong setTimeout
donald-pinckney Mar 4, 2026
a2c2f1c
correct when to use patching
donald-pinckney Mar 5, 2026
36a1197
add context to waiting on handlers
donald-pinckney Mar 5, 2026
4ca3291
~ version constraints
donald-pinckney Mar 5, 2026
0d22b88
npm caveat
donald-pinckney Mar 5, 2026
048ce45
Add understanding of patching memoization
donald-pinckney Mar 5, 2026
168a6bf
heartbeat -> handle cancellation
donald-pinckney Mar 5, 2026
b72f188
Improve activity cancellation in gotchas
donald-pinckney Mar 5, 2026
1ebcf2b
fix async completion client
donald-pinckney Mar 6, 2026
7 changes: 4 additions & 3 deletions SKILL.md
@@ -1,14 +1,14 @@
---
name: temporal-developer
description: This skill should be used when the user asks to "create a Temporal workflow", "write a Temporal activity", "debug stuck workflow", "fix non-determinism error", "Temporal Python", "workflow replay", "activity timeout", "signal workflow", "query workflow", "worker not starting", "activity keeps retrying", "Temporal heartbeat", "continue-as-new", "child workflow", "saga pattern", "workflow versioning", "durable execution", "reliable distributed systems", or mentions Temporal SDK development.
description: This skill should be used when the user asks to "create a Temporal workflow", "write a Temporal activity", "debug stuck workflow", "fix non-determinism error", "Temporal Python", "Temporal TypeScript", "workflow replay", "activity timeout", "signal workflow", "query workflow", "worker not starting", "activity keeps retrying", "Temporal heartbeat", "continue-as-new", "child workflow", "saga pattern", "workflow versioning", "durable execution", "reliable distributed systems", or mentions Temporal SDK development.
version: 1.0.0
---

# Skill: temporal-developer

## Overview

Temporal is a durable execution platform that makes workflows survive failures automatically. This skill provides guidance for building Temporal applications in Python.
Temporal is a durable execution platform that makes workflows survive failures automatically. This skill provides guidance for building Temporal applications in Python and TypeScript.

## Core Architecture

@@ -91,6 +91,7 @@ Once you've downloaded the file, extract the downloaded archive and add the temp

1. First, read the getting started guide for the language you are working in:
- Python -> read `references/python/python.md`
- TypeScript -> read `references/typescript/typescript.md`
2. Second, read appropriate `core` and language-specific references for the task at hand.


@@ -108,7 +109,7 @@ Once you've downloaded the file, extract the downloaded archive and add the temp
- **`references/core/interactive-workflows.md`** - Testing signals, updates, queries
- **`references/core/dev-management.md`** - Dev cycle & management of server and workers
- **`references/core/ai-patterns.md`** - AI/LLM pattern concepts
+ Langauge-specific info at `references/{your_language}/determinism.md`
+ Language-specific info at `references/{your_language}/determinism.md`, if available. Currently Python only.

## Additional Topics
- **`references/{your_language}/observability.md`** - Language-specific implementation guidance on observability in Temporal
1 change: 1 addition & 0 deletions references/core/determinism.md
@@ -79,6 +79,7 @@ For a few simple cases, like timestamps, random values, UUIDs, etc. the Temporal
Each Temporal SDK language provides a protection mechanism to make it easier to catch non-determinism errors earlier in development:

- Python: The Python SDK runs workflows in a sandbox that intercepts and aborts non-deterministic calls at runtime.
- TypeScript: The TypeScript SDK runs workflows in an isolated V8 sandbox, intercepting many common sources of non-determinism and replacing them automatically with deterministic variants.
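The "deterministic variants" idea can be illustrated in plain Python, with no SDK involved: a value source derived only from a fixed seed returns the identical sequence on first execution and on replay, which is exactly the property any sandbox replacement for `random()` or `uuid4()` must have. The generator below is a hypothetical illustration, not the SDKs' actual mechanism.

```python
import random

def deterministic_uuid_stream(seed: int):
    """Yield pseudo-UUID hex strings derived only from the seed.

    Replaying with the same seed reproduces the exact same values,
    which is the property a deterministic replacement for
    random()/uuid4() inside a workflow sandbox must have.
    """
    rng = random.Random(seed)
    while True:
        yield f"{rng.getrandbits(128):032x}"

first_run = deterministic_uuid_stream(42)
replay = deterministic_uuid_stream(42)
assert [next(first_run) for _ in range(3)] == [next(replay) for _ in range(3)]
```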


## Detecting Non-Determinism
44 changes: 44 additions & 0 deletions references/core/gotchas.md
@@ -150,3 +150,47 @@ See language-specific gotchas for details.
**The Fix**:
- **Retryable**: Network errors, timeouts, rate limits, temporary unavailability
- **Non-retryable**: Invalid input, authentication failures, business rule violations, resource not found

## Cancellation Handling

### Not Handling Workflow Cancellation

**The Problem**: When a workflow is cancelled, cleanup code after the cancellation point doesn't run unless explicitly protected.

**Symptoms**:
- Resources not released after cancellation
- Incomplete compensation/rollback
- Leaked state

**The Fix**: Use language-specific cancellation scopes or try/finally blocks to ensure cleanup runs even on cancellation. See language-specific gotchas for implementation details.

### Not Handling Activity Cancellation

**The Problem**: Activities must opt in to receive cancellation. Without proper handling, a cancelled activity continues running to completion, wasting resources.

**Requirements for activity cancellation**:
1. **Heartbeating** - Cancellation is delivered via heartbeat. Activities that don't heartbeat won't know they've been cancelled.
2. **Checking for cancellation** - Activity must explicitly check for cancellation or await a cancellation signal.

**Symptoms**:
- Cancelled activities running to completion
- Wasted compute on work that will be discarded
- Delayed workflow cancellation

**The Fix**: Heartbeat regularly and check for cancellation. See language-specific gotchas for implementation patterns.

## Payload Size Limits

**The Problem**: Temporal has built-in limits on payload sizes. Exceeding them causes workflows to fail.

**Limits**:
- Max 2MB per individual payload
- Max 4MB per gRPC message
- Max 50MB for complete workflow history (aim for <10MB in practice)

**Symptoms**:
- Payload too large errors
- gRPC message size exceeded errors
- Workflow history growing unboundedly

**The Fix**: Store large data externally (S3/GCS) and pass references, use compression codecs, or chunk data across multiple activities. See the Large Data Handling pattern in `references/core/patterns.md`.
74 changes: 74 additions & 0 deletions references/core/patterns.md
@@ -331,6 +331,80 @@ Run:

This ensures that on replay, already-completed steps are skipped.

## Large Data Handling

**Purpose**: Handle data that exceeds Temporal's payload limits without polluting workflow history.

**Limits** (see `references/core/gotchas.md` for details):
- Max 2MB per individual payload
- Max 4MB per gRPC message
- Max 50MB for workflow history (aim for <10MB)

**Key Principle**: Large data should never flow through workflow history. Activities read and write large data directly, passing only small references through the workflow.

**Wrong Approach**:
```
Workflow
├── downloadFromStorage(ref) ──▶ returns large data (enters history)
├── processData(largeData) ────▶ large data as argument (enters history AGAIN)
└── uploadToStorage(result) ───▶ large data as argument (enters history AGAIN)
```

This defeats the purpose—large data enters workflow history multiple times.

**Correct Approach**:
```
Workflow
└── processLargeData(inputRef) ──▶ returns outputRef (small string)
└── Activity internally:
download(inputRef) → process → upload → return outputRef
```

The workflow only handles references (small strings). The activity does all large data operations internally.

**Implementation Pattern**:
1. Accept a reference (URL, S3 key, database ID) as activity input
2. Download/fetch the large data inside the activity
3. Process the data inside the activity
4. Upload/store the result inside the activity
5. Return only a reference to the result

**Other Strategies**:
- **Compression**: Use a PayloadCodec to compress data automatically
- **Chunking**: Split large collections across multiple activities, each handling a subset
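The chunking strategy can be as simple as slicing the collection before fanning out activities. An illustrative helper:

```python
def chunk(items: list, size: int) -> list:
    """Split a large collection into fixed-size pieces, each small enough
    to pass as a single activity's input payload."""
    return [items[i:i + size] for i in range(0, len(items), size)]

assert chunk(list(range(10)), 4) == [[0, 1, 2, 3], [4, 5, 6, 7], [8, 9]]
```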

## Activity Heartbeating

**Purpose**: Enable cancellation delivery and progress tracking for long-running activities.

**Why Heartbeat**:
1. **Support activity cancellation** - Cancellations are delivered to activities via heartbeat. Activities that don't heartbeat won't know they've been cancelled.
2. **Resume progress after failure** - Heartbeat details persist across retries, allowing activities to resume where they left off.
3. **Detect stuck activities** - If an activity stops heartbeating, Temporal can time it out and retry.

**How Cancellation Works**:
```
Workflow requests activity cancellation
Temporal Service marks activity for cancellation
Activity calls heartbeat()
├── Not cancelled: heartbeat succeeds, continues
└── Cancelled: heartbeat raises exception
Activity can catch this to perform cleanup
```

**Key Point**: If an activity never heartbeats, it will run to completion even if cancelled—it has no way to learn about the cancellation.
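The flow above can be simulated in plain Python. The `heartbeat` helper below is a hypothetical stand-in for the SDK call (in the Python SDK, the cancellation surfaces as an exception raised from `activity.heartbeat()` itself):

```python
import asyncio

def heartbeat(server_cancelled: bool) -> None:
    """Hypothetical stand-in for activity.heartbeat(): raises once the
    service has marked the activity for cancellation."""
    if server_cancelled:
        raise asyncio.CancelledError

def long_activity(items, cancel_at=None):
    processed = []
    try:
        for i, item in enumerate(items):
            # Cancellation is observed only at the heartbeat call
            heartbeat(server_cancelled=(i == cancel_at))
            processed.append(item)
    except asyncio.CancelledError:
        # Cleanup point; a real activity would re-raise after cleaning up
        return ("cancelled", processed)
    return ("completed", processed)

assert long_activity([1, 2, 3]) == ("completed", [1, 2, 3])
assert long_activity([1, 2, 3], cancel_at=2) == ("cancelled", [1, 2])
```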

## Local Activities

**Purpose**: Reduce latency for short, lightweight operations by skipping the task queue. ONLY use these when necessary for performance. Do NOT use them by default, as they lack the durability and distribution guarantees of regular activities.
16 changes: 10 additions & 6 deletions references/core/versioning.md
@@ -53,17 +53,21 @@ else:

### When to Use

- Adding new activities or steps
- Changing activity parameters
- Reordering operations
- Any change that would cause non-determinism
- Adding, removing, or reordering activities/child workflows
- Changing which activity/child workflow is called
- Any change that alters the Command sequence

### When NOT to Use

- Changes to activity implementations (activities aren't replayed)
- Adding new signal/query handlers (additive changes are safe)
- Changing activity implementations (activities aren't replayed)
- Changing arguments passed to activities or child workflows
- Changing retry policies
- Changing timer durations
- Adding new signal/query/update handlers (additive changes are safe)
- Bug fixes that don't change Command sequence

Unnecessary patching adds complexity and can make workflow code unmanageable.

## Approach 2: Workflow Type Versioning

### Concept
30 changes: 3 additions & 27 deletions references/python/data-handling.md
@@ -202,29 +202,6 @@ class OrderWorkflow:
...
```

## Large Payloads

For large data, consider:

1. **Store externally**: Put large data in S3/GCS, pass references in workflows
2. **Use Payload Codec**: Compress payloads automatically
3. **Chunk data**: Split large lists across multiple activities

```python
# Example: Reference pattern for large data
@activity.defn
async def upload_to_storage(data: bytes) -> str:
"""Upload data and return reference."""
key = f"data/{uuid.uuid4()}"
await storage_client.upload(key, data)
return key

@activity.defn
async def download_from_storage(key: str) -> bytes:
"""Download data by reference."""
return await storage_client.download(key)
```

## Deterministic APIs for Values

Use these APIs within workflows for deterministic random values and UUIDs:
@@ -247,8 +224,7 @@ class MyWorkflow:
## Best Practices

1. Use Pydantic for input/output validation
2. Keep payloads small (< 2MB recommended)
2. Keep payloads small—see `references/core/gotchas.md` for limits
3. Encrypt sensitive data with PayloadCodec
4. Store large data externally with references
5. Use dataclasses for simple data structures
6. Use `workflow.uuid4()` and `workflow.random()` for deterministic values
4. Use dataclasses for simple data structures
5. Use `workflow.uuid4()` and `workflow.random()` for deterministic values
77 changes: 77 additions & 0 deletions references/python/gotchas.md
@@ -161,6 +161,83 @@ await workflow.execute_activity(
)
```

Set heartbeat timeout as high as acceptable for your use case — each heartbeat counts as an action.
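One way to honor this is to throttle heartbeats client-side so a tight processing loop does not record one per iteration. The wrapper below is an illustrative sketch, not an SDK API (the SDKs also perform some throttling internally); `now` is injectable so the behavior can be tested with a fake clock:

```python
import time

class ThrottledHeartbeat:
    """Send at most one heartbeat per interval.

    Hypothetical helper: `send` would wrap activity.heartbeat() in real
    code; `now` defaults to a monotonic clock but can be faked in tests.
    """

    def __init__(self, send, interval_s: float, now=time.monotonic):
        self._send = send
        self._interval = interval_s
        self._now = now
        self._last = float("-inf")

    def beat(self) -> bool:
        """Heartbeat if the interval has elapsed; return True if sent."""
        t = self._now()
        if t - self._last >= self._interval:
            self._send()
            self._last = t
            return True
        return False

times = iter([0.0, 10.0, 31.0])
sent = []
hb = ThrottledHeartbeat(lambda: sent.append(1), interval_s=30.0,
                        now=lambda: next(times))
hb.beat()  # first call always sends
hb.beat()  # suppressed at t=10 (interval not elapsed)
hb.beat()  # sends again at t=31
assert len(sent) == 2
```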

## Cancellation

### Not Handling Workflow Cancellation

```python
# BAD - Cleanup doesn't run on cancellation
@workflow.defn
class BadWorkflow:
@workflow.run
async def run(self) -> None:
await workflow.execute_activity(
acquire_resource,
start_to_close_timeout=timedelta(minutes=5),
)
await workflow.execute_activity(
do_work,
start_to_close_timeout=timedelta(minutes=5),
)
await workflow.execute_activity(
release_resource, # Never runs if cancelled!
start_to_close_timeout=timedelta(minutes=5),
)

# GOOD - Use try/finally for cleanup
@workflow.defn
class GoodWorkflow:
@workflow.run
async def run(self) -> None:
await workflow.execute_activity(
acquire_resource,
start_to_close_timeout=timedelta(minutes=5),
)
try:
await workflow.execute_activity(
do_work,
start_to_close_timeout=timedelta(minutes=5),
)
finally:
# Runs even on cancellation
await workflow.execute_activity(
release_resource,
start_to_close_timeout=timedelta(minutes=5),
)
```

### Not Handling Activity Cancellation

Activities must **opt in** to receive cancellation. This requires:
1. **Heartbeating** - Cancellation is delivered via heartbeat
2. **Catching the cancellation exception** - Exception is raised when heartbeat detects cancellation

**Cancellation exceptions:**
- Async activities: `asyncio.CancelledError`
- Sync threaded activities: `temporalio.exceptions.CancelledError`

```python
# BAD - Activity ignores cancellation
@activity.defn
async def long_activity() -> None:
await do_expensive_work() # Runs to completion even if cancelled
```

```python
# GOOD - Heartbeat and catch cancellation
@activity.defn
async def long_activity() -> None:
try:
for item in items:
activity.heartbeat()
await process(item)
except asyncio.CancelledError:
await cleanup()
raise
```

## Testing

### Not Testing Failures