diff --git a/website/blog/posts/2026-03-31-data-primitive-for-the-agent-loop.md b/website/blog/posts/2026-03-31-data-primitive-for-the-agent-loop.md
new file mode 100644
index 0000000000..0ab163ce21
--- /dev/null
+++ b/website/blog/posts/2026-03-31-data-primitive-for-the-agent-loop.md
@@ -0,0 +1,469 @@
+---
+title: "Durable Streams — the data primitive for the agent loop"
+description: >-
+ Agents accumulate state in the agent loop — messages, tool calls, results,
+ observations. This is a new kind of data with demands that existing
+ infrastructure wasn't designed for. Durable Streams is the purpose-built
+ primitive that meets them.
+excerpt: >-
+ Agents are stateful. The agent loop accumulates a new kind of data that
+ needs a new kind of primitive. Durable Streams is that primitive.
+authors: [thruflo]
+image: /img/blog/durable-streams-data-primitive-for-the-agent-loop/header.jpg
+tags: [durable-streams, agents, sync]
+outline: [2, 3]
+post: true
+published: true
+---
+
+
+
+Durable Streams is the data primitive for the agent loop. Agents are
+stateful — they accumulate messages, tool calls, results and observations
+as they execute. This state is a new kind of data with demands that existing
+infrastructure wasn't designed for.
+
+A durable stream is a persistent, addressable, append-only log that's
+reactive, subscribable, replayable, forkable and structured. It's the
+primitive this data needs.
+
+> [!Tip] Durable Streams — docs and deployment
+> See the [Durable Streams docs](https://durablestreams.com) and [deploy now on Electric Cloud](https://dashboard.electric-sql.cloud/?intent=create&serviceType=streams).
+
+
+
+## The agent loop
+
+
+
+
+
+- agents are being deployed at massive and accelerating scale — projections
+ run to trillions of agent instances
+- the core execution pattern behind them is the agent loop: a cycle of
+ observe → think → act → observe
+- the agent receives a task, reasons about what to do, selects and executes
+ an action (API call, code execution, search, file edit), then feeds the
+ result back into its own context as a new observation
+- this loop repeats — each iteration is a full inference call where the model
+ decides whether to continue acting or return a final answer
+- with every iteration, state accumulates: messages, tool calls, tool call
+ results, observations, artifacts
+- this accumulated state is the value — the longer the loop runs, the more
+ work gets done, the more automation, the more value for the organization
+- this is a genuinely new kind of data — it didn't used to exist; as an
+ industry we're still figuring out what it is and how to work with it
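+
+The loop above can be sketched as a small TypeScript function. Here `think`
+and `act` are hypothetical stand-ins for a model inference call and a tool
+executor — not any particular framework's API — and the point is that state
+accumulates with every iteration:
+
+```typescript
+// Toy sketch of the agent loop: observe → think → act → observe.
+// State accumulates in `history` on every iteration.
+type LoopEvent =
+  | { type: "message"; content: string }
+  | { type: "tool_call"; name: string; args: unknown }
+  | { type: "observation"; result: string };
+
+type Decision = { done: boolean; action?: { name: string; args: unknown } };
+
+async function agentLoop(
+  task: string,
+  think: (history: LoopEvent[]) => Promise<Decision>,
+  act: (name: string, args: unknown) => Promise<string>,
+): Promise<LoopEvent[]> {
+  const history: LoopEvent[] = [{ type: "message", content: task }];
+  while (true) {
+    // each iteration is a full inference call over the accumulated state
+    const decision = await think(history);
+    if (decision.done || !decision.action) return history;
+    history.push({ type: "tool_call", ...decision.action });
+    // execute the action and feed the result back as a new observation
+    const result = await act(decision.action.name, decision.action.args);
+    history.push({ type: "observation", result });
+  }
+}
+```
+
+The accumulated `history` is exactly the state this post is about — it is
+what a durable stream persists.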
+
+
+
+
+
+
+
+- we've seen this firsthand — we built reactive sync infrastructure and
+ some of the best agentic teams and frameworks built on it
+- we were prototyping AI SDK transport integrations on our Postgres sync
+ service; we did the math: 50 tokens per second across a thousand
+ concurrent sessions is 50,000 writes per second; Postgres maxes out
+ around 20k; factor in centralized latency and it didn't add up
+- through working with pioneers like `` and agentic products like
+ ``, we saw that previous-generation databases
+ and object storage were not fit for purpose — even when we'd made them
+  reactive and synced up, they weren't what agents needed
+- but we realized the delivery protocol was fine — it was going through the
+ database that was the problem; we wanted to write directly into the back
+ of a "shape" (our subscribable partial-replication primitive); a shape was
+ already an addressable, subscribable log; strip out the database, let
+ agents write directly into the log, and you have a durable stream
+- that's why we generalized our sync protocol into
+ [Durable Streams](/primitives/durable-streams) — this post shares
+ what we learned about what agent state demands and what we built to meet it
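+
+The back-of-the-envelope math in the bullet above is simple multiplication,
+but worth making explicit:
+
+```typescript
+// Back-of-the-envelope check: token-level writes across concurrent
+// sessions quickly exceed a single Postgres instance's write ceiling.
+const tokensPerSecondPerSession = 50;
+const concurrentSessions = 1_000;
+const writesPerSecond = tokensPerSecondPerSession * concurrentSessions; // 50,000
+const roughPostgresCeiling = 20_000; // approximate figure cited above
+
+console.log(writesPerSecond > roughPostgresCeiling); // true — it doesn't fit
+```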
+
+
+
+## What agent state demands
+
+
+
+
+
+### How humans need to use it
+
+- real-time collaboration — multiple users and agents working on the same
+ session at the same time; not the current pattern of one person driving a
+ [Claude Code](https://claude.com/claude-code) or
+ [Codex](https://openai.com/index/codex) session while others watch on a
+ screen share
+- async collaboration — your colleague picks up where you left off; your
+ boss reviews what happened tomorrow; governance teams trace decisions back
+ to their source
+- observability — what happened in this session? why did the agent make that
+ choice? where did it go wrong? how do I restart it from that good
+ checkpoint before it went off the rails?
+
+### How agents need to use it
+
+- spawning hierarchies — parent agents spawn children, each running their
+ own loop, reporting results back up; is that child stuck? do I need to
+ talk to it directly?
+- forking — branch a session to explore alternatives; go back to a known
+ good point and try a different path
+- time travel — replay from any position; restart from a checkpoint
+- compaction and memory — compress and summarize what happened; capture
+ observational memory across sessions; what did the children do?
+
+
+
+### How organizations need to use it
+
+- if work is done through agents, agent sessions become how work gets
+ done — so they inherit the governance requirements of the work itself
+- this state must wire into collaboration, reporting, access control
+ and compliance — the same systems the organization already runs on
+
+### The requirements
+
+
+
+- **persistent** — survives disconnects, restarts, crashes
+- **addressable** — sessions have their own URL; other systems can find and
+ subscribe to them
+- **reactive and subscribable** — real-time updates as state changes
+- **replayable** — consume from any position in the log
+- **forkable** — branch sessions for exploration
+- **lightweight and low-latency** — co-located with agents, not a round-trip
+ to a centralized store
+- **directly writable** — agents write to the stream as they execute
+- **structured** — schema support for typed, multiplexed data
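+
+One way to picture these requirements together is as a client interface.
+This is a hypothetical sketch of the surface such a primitive exposes — the
+names and signatures are illustrative, not the actual Durable Streams
+client API:
+
+```typescript
+// Hypothetical shape of a client for the primitive described above.
+interface DurableStream {
+  url: string; // addressable — sessions have their own URL
+  // directly writable — append and get back the next offset
+  append(data: Uint8Array): Promise<string>;
+  // replayable — consume from any position in the log
+  read(fromOffset?: string): AsyncIterable<{ offset: string; data: Uint8Array }>;
+  // reactive and subscribable — real-time updates; returns an unsubscribe fn
+  subscribe(handler: (data: Uint8Array) => void): () => void;
+  // forkable — branch a new stream from a position in this one
+  fork(atOffset: string): Promise<DurableStream>;
+}
+```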
+
+## Why existing infrastructure doesn't serve it
+
+
+
+### Ad-hoc and single-machine solutions
+
+- right now agent state is everything from ephemeral and lost, to markdown
+ files in hidden folders, to over-provisioned database tables
+- much of today's agent tooling is single-machine harnesses — powerful, but
+ hard for other users to access in real-time
+- the move into the cloud, into online systems wired into teams, is
+ happening fast — the OS-level primitives that work on a single machine
+ (files, signals, process watching) need web-scale equivalents
+
+
+
+### Databases
+
+- too heavy and centralized for this use case — you don't run an AI token
+ stream through your main Postgres
+- designed for structured queries and transactions, not append-only streaming
+ with real-time subscriptions
+- the latency and overhead of a centralized database doesn't match the
+ co-located, low-latency demands of agent state
+
+### Object storage
+
+- provides underlying durability — good as a backing store for agent sessions
+- but not reactive, not subscribable, not structured
+- a storage layer, not a data primitive with the affordances agents need
+
+### Redis
+
+- closer — in-memory, low-latency, pub/sub capabilities
+- but still centralized; a generalized data structure server, not
+ agent-specific
+- schema support has to be built on top; reactivity is not first-class —
+ pub/sub is fire-and-forget with no replay or persistence guarantees
+
+### The gap
+
+
+
+- each covers some of the requirements; none covers the full set
+- the agent loop needs a purpose-built primitive
+
+## Why Durable Streams are the solution
+
+
+
+### What is a durable stream
+
+- a persistent, addressable, append-only log with its own URL
+- write directly, subscribe in real-time, replay from any position
+- at the core, extremely simple: an append-only binary log
+- built on standard HTTP — works everywhere, cacheable, scalable through
+ existing CDN infrastructure
+- a generalization of the battle-tested
+ [Electric sync protocol](/docs/api/http) that delivers millions of state
+ changes daily
+
+
+
+### How it maps to the requirements
+
+
+
+- **persistent** — streams have their own durable storage; the data
+ survives anything
+- **addressable** — every stream has a URL, every position has an opaque
+ monotonic offset
+- **reactive and subscribable** — long-polling or SSE for real-time tailing;
+ clients subscribe and get updates as they're written
+- **replayable** — read from any offset; catch up from any point in history;
+ clients track their own position
+- **forkable** — create new streams from positions in existing streams
+- **lightweight and low-latency** — minimal protocol overhead; single-digit
+ ms at CDN edge; co-locatable with agents
+- **directly writable** — append with POST, get the next offset in the
+ response header
+- **structured** — wrapper protocols layer typed schemas on top of the binary
+ stream using [Standard Schema](https://standardschema.dev) for end-to-end
+ type safety
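+
+A toy in-memory model makes the offset semantics above concrete: appends
+return a monotonic offset, reads replay from any position, and forks branch
+the log at a known point. This is an illustration of the semantics only,
+not the real implementation:
+
+```typescript
+// Minimal in-memory model of append / replay / fork semantics.
+class ToyStream {
+  private log: string[] = [];
+
+  append(entry: string): number {
+    this.log.push(entry);
+    return this.log.length; // next offset, as the POST response header carries
+  }
+
+  read(fromOffset = 0): string[] {
+    return this.log.slice(fromOffset); // replay from any position
+  }
+
+  fork(atOffset: number): ToyStream {
+    const child = new ToyStream();
+    child.log = this.log.slice(0, atOffset); // branch from a known good point
+    return child;
+  }
+}
+```
+
+Forking copies the prefix into a new stream: the child can diverge from
+that point while the parent stays untouched.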
+
+### Wrapper protocols and structured sessions
+
+
+
+- **durable state** — schema-aware structured state sync over a durable
+ stream; multiplexes messages, presence, agent registration, tool calls
+ over a single stream
+- **AI SDK transports** — drop-in adapters for
+ [Vercel AI SDK](/blog/2026/03/24/durable-transport-ai-sdks) and
+ [TanStack AI](/blog/2026/01/12/durable-sessions-for-collaborative-ai)
+ that make existing AI apps durable and collaborative
+- **binary, JSON, proxy modes** — different encodings for different
+ use cases
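+
+As a sketch of the multiplexing idea, plain discriminated unions can stand
+in for a real schema layer. The actual wrapper protocol uses
+[Standard Schema](https://standardschema.dev); the types and channel names
+here are illustrative only:
+
+```typescript
+// One append-only stream carries every channel of a session;
+// consumers filter and narrow by channel.
+type SessionEvent =
+  | { channel: "message"; role: "user" | "assistant"; content: string }
+  | { channel: "presence"; userId: string; online: boolean }
+  | { channel: "tool_call"; name: string; args: unknown };
+
+function filterChannel<C extends SessionEvent["channel"]>(
+  events: SessionEvent[],
+  channel: C,
+): Extract<SessionEvent, { channel: C }>[] {
+  return events.filter(
+    (e): e is Extract<SessionEvent, { channel: C }> => e.channel === channel,
+  );
+}
+```
+
+Consumers get typed access to each channel while the underlying transport
+remains a single binary log.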
+
+
+
+### What this unlocks
+
+
+
+- resilient agent sessions — disconnect, reconnect, resume without
+ re-running expensive work
+- multi-user collaboration — multiple people working on the same agentic
+ session in real-time
+- multi-agent collaboration — agents subscribing to and building on each
+ other's work
+- spawning and forking — hierarchies of agents with durable state at
+ every level
+- async governance — full history of every agent action, available for
+ audit and compliance
+- massive fan-out — scale to millions of concurrent subscribers through
+ CDN infrastructure
+
+> [!Info] See it in action
+> See the [Durable Sessions](/blog/2026/01/12/durable-sessions-for-collaborative-ai)
+> post for a working reference implementation with
+> [demo video](https://youtu.be/81KXwxld7dw) and
+> [source code](https://github.com/electric-sql/transport), and the
+> [Durable Transports](/blog/2026/03/24/durable-transport-ai-sdks)
+> post for AI SDK integration.
+
+
+
+## Next steps
+
+- explore [DurableStreams.com](https://durablestreams.com) — docs,
+ quickstart, protocol spec
+- deploy on [Electric Cloud](/cloud) — hosted durable streams
+- join [Discord](https://discord.electric-sql.com) — to define and
+ explore this space together
+
+
+
+Agents are stateful. The agent loop accumulates state with every iteration.
+This state is the value — more state, more automation, more value. It needs
+a new primitive: one that's native to the agent loop and designed for its
+unique requirements.
+
+That's a [Durable Stream](https://durablestreams.com).
+
+***
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+