Commit 84f8a60

yaron2 and msfussell authored
Add CrewAI and workflow documentation (#4973)
* Add crewAI and workflow documentation
* typos
* review comments
* switch order of .env and deps

Signed-off-by: yaron2 <schneider.yaron@live.com>
Co-authored-by: Mark Fussell <markfussell@gmail.com>
1 parent 803744d commit 84f8a60

File tree: 2 files changed, +223 −0 lines changed

Lines changed: 11 additions & 0 deletions
---
type: docs
title: "CrewAI"
linkTitle: "CrewAI"
weight: 25
description: "Dapr first-class integrations with CrewAI Agents"
---

### What is the Dapr CrewAI integration?

Dapr provides first-class integrations for CrewAI agents, ranging from agent session management to connecting agents via pub/sub and orchestrating agentic workflows.
Lines changed: 212 additions & 0 deletions
---
type: docs
title: "CrewAI Workflows"
linkTitle: "CrewAI Workflows"
weight: 25
description: "How to run CrewAI agents with durable, fault-tolerant execution using Dapr Workflows"
---

## Overview

Dapr Workflows make it possible to run CrewAI agents **reliably**, **durably**, and **with built-in resiliency**.
By orchestrating CrewAI tasks with the Dapr Workflow engine, developers can:

- Ensure long-running CrewAI work survives crashes and restarts.
- Get automatic checkpoints, retries, and state recovery.
- Run each CrewAI task as a durable activity.
- Observe execution through tracing, metrics, and structured logs.

This guide walks through orchestrating multiple CrewAI tasks using Dapr Workflows, ensuring each step runs *exactly once* even if the process restarts.
## Getting Started

Initialize Dapr locally to set up a self-hosted environment for development. This process installs the Dapr sidecar binaries, provisions the workflow engine, and prepares a default components directory. For full details, see the [guide on initializing Dapr locally]({{% ref install-dapr-selfhost.md %}}).

Initialize Dapr:

```bash
dapr init
```

Verify that the `daprio/dapr`, `openzipkin/zipkin`, and `redis` containers are running:

```bash
docker ps
```
### Install Python

{{% alert title="Note" color="info" %}}
Make sure you have Python `>=3.10` installed. For installation instructions, visit the official [Python installation guide](https://www.python.org/downloads/).
{{% /alert %}}
### Create a Python Virtual Environment (recommended)

```bash
python -m venv .venv
source .venv/bin/activate  # Windows: .venv\Scripts\activate
```
### Install Dependencies

```bash
pip install dapr dapr-ext-workflow crewai
```
### Create a Workflow to Run CrewAI Tasks

Create a file named `crewai_workflow.py` and paste the following:

```python
from dapr.ext.workflow import (
    WorkflowRuntime,
    DaprWorkflowContext,
    WorkflowActivityContext,
    DaprWorkflowClient,
)
from crewai import Agent, Task, Crew

wfr = WorkflowRuntime()

# ------------------------------------------------------------
# 1. Define Agent, Tasks, and Task Dictionary
# ------------------------------------------------------------
agent = Agent(
    role="Research Analyst",
    goal="Research and summarize impactful technology updates.",
    backstory="A skilled analyst who specializes in researching and summarizing technology topics.",
)

tasks = {
    "latest_ai_news": Task(
        description="Find the latest news about artificial intelligence.",
        expected_output="A 3-paragraph summary of the top 3 stories.",
        agent=agent,
    ),
    "ai_startup_launches": Task(
        description="Summarize the most impactful AI startup launches in the last 6 months.",
        expected_output="A list summarizing 2 AI startups with links.",
        agent=agent,
    ),
    "ai_policy_updates": Task(
        description="Summarize the newest AI government policy and regulation updates.",
        expected_output="A bullet-point list summarizing the latest policy changes.",
        agent=agent,
    ),
}

# ------------------------------------------------------------
# 2. Activity — runs ONE task by name
# ------------------------------------------------------------
@wfr.activity(name="run_task")
def run_task_activity(ctx: WorkflowActivityContext, task_name: str):
    print(f"Running CrewAI task: {task_name}", flush=True)

    task = tasks[task_name]

    # Create a Crew for just this one task
    temp_crew = Crew(agents=[agent], tasks=[task])

    # kickoff() works across CrewAI versions
    result = temp_crew.kickoff()

    return str(result)

# ------------------------------------------------------------
# 3. Workflow — orchestrates tasks durably
# ------------------------------------------------------------
@wfr.workflow(name="crewai_multi_task_workflow")
def crewai_workflow(ctx: DaprWorkflowContext):
    print("Starting multi-task CrewAI workflow", flush=True)

    latest_news = yield ctx.call_activity(run_task_activity, input="latest_ai_news")
    startup_summary = yield ctx.call_activity(run_task_activity, input="ai_startup_launches")
    policy_updates = yield ctx.call_activity(run_task_activity, input="ai_policy_updates")

    return {
        "latest_news": latest_news,
        "startup_summary": startup_summary,
        "policy_updates": policy_updates,
    }

# ------------------------------------------------------------
# 4. Runtime + Client (entry point)
# ------------------------------------------------------------
if __name__ == "__main__":
    wfr.start()

    client = DaprWorkflowClient()
    instance_id = "crewai-multi-01"

    client.schedule_new_workflow(
        workflow=crewai_workflow,
        input=None,
        instance_id=instance_id,
    )

    state = client.wait_for_workflow_completion(instance_id, timeout_in_seconds=60)
    print(state.serialized_output)

    wfr.shutdown()
```
This program starts a workflow that runs three CrewAI tasks as durable activities, gathering and summarizing news about AI, AI startups, and AI policy.
### Create the Workflow Database Component

Dapr Workflows persist durable state using any [Dapr state store]({{% ref supported-state-stores %}}) that supports workflows.
Create a directory named `components`, then create the file `workflowstore.yaml`:

```yaml
apiVersion: dapr.io/v1alpha1
kind: Component
metadata:
  name: workflowstore
spec:
  type: state.redis
  version: v1
  metadata:
  - name: redisHost
    value: localhost:6379
  - name: redisPassword
    value: ""
  - name: actorStateStore
    value: "true"
```
This component stores:

* Code execution checkpoints
* Execution history
* Deterministic resumption state
* Final output data
### Set a CrewAI LLM Provider

CrewAI needs an LLM configuration and API token to run. See the [CrewAI LLM setup instructions](https://docs.crewai.com/en/concepts/llms#setting-up-your-llm).

For example, to set up OpenAI:

```bash
export OPENAI_API_KEY=sk-...
```
### Run the Workflow

Launch the CrewAI workflow using the Dapr CLI:

```bash
dapr run \
  --app-id crewaiwf \
  --dapr-grpc-port 50001 \
  --resources-path ./components \
  -- python3 ./crewai_workflow.py
```
As the workflow runs, each CrewAI task is executed as a durable activity.
If the process crashes, the workflow resumes exactly where it left off. You can try this by killing the process after the first activity completes and then re-running the command above with the same app ID.

Open Zipkin to view workflow traces:

```
http://localhost:9411
```
