RADICAL AsyncFlow (RAF) is a fast asynchronous scripting library built on top of asyncio for composing powerful async/sync workflows on HPC systems, clusters, and local machines. It supports pluggable execution backends with intuitive task dependencies and workflow composition.
- ⚡ Powerful asynchronous workflows — Compose complex async and sync workflows easily, with intuitive task dependencies and campaign orchestration.
- 🌐 Portable across environments — Run seamlessly on HPC systems, clusters, and local machines with pluggable execution backends.
- 🧩 Flexible and extensible — Supports campaign management and advanced workflow patterns, built on Python's asyncio and RADICAL Cybertools expertise.
AsyncFlow ships with the following built-in execution backends:
- `LocalExecutionBackend` — local execution using Python's `concurrent.futures` (`ThreadPoolExecutor` / `ProcessPoolExecutor`)
- `NoopExecutionBackend` — no-op backend for testing and `dry_run` mode
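Since `LocalExecutionBackend` wraps a standard `concurrent.futures` executor, the usual Python trade-off applies when choosing a pool: thread pools suit I/O-bound or subprocess-launching tasks, while process pools side-step the GIL for CPU-bound work. A minimal stdlib-only sketch of the two pool types (no AsyncFlow API involved):

```python
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor


def square(x):
    # CPU-bound stand-in; a process pool would run this on a separate core
    return x * x


if __name__ == "__main__":
    # Thread pool: lightweight, shares memory, good for I/O-bound tasks
    with ThreadPoolExecutor(max_workers=4) as pool:
        print(list(pool.map(square, range(5))))  # [0, 1, 4, 9, 16]

    # Process pool: separate interpreters, bypasses the GIL for CPU-bound work
    with ProcessPoolExecutor(max_workers=2) as pool:
        print(list(pool.map(square, range(5))))  # [0, 1, 4, 9, 16]
```

Either executor instance can be handed to `LocalExecutionBackend` as shown in the quickstart below.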
For HPC execution, install RHAPSODY which provides additional backends that plug directly into AsyncFlow:
- Radical.Pilot — distributed HPC execution across supercomputers and clusters
- Dask — parallel computing with Dask distributed
- Concurrent — thread/process pool execution with extended HPC support
- Dragon — high-performance distributed execution
The Radical AsyncFlow package is available on PyPI:
```shell
pip install radical-asyncflow
```

For HPC execution via RHAPSODY:

```shell
pip install rhapsody-py
```

For developers:

```shell
git clone https://github.com/radical-cybertools/radical.asyncflow
cd radical.asyncflow
pip install -e .[dev,lint,doc]
```

👉 AsyncFlow Documentation and API References
```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

from radical.asyncflow import WorkflowEngine, LocalExecutionBackend


async def main():
    # Create backend and workflow
    backend = await LocalExecutionBackend(ThreadPoolExecutor())
    flow = await WorkflowEngine.create(backend=backend)

    @flow.executable_task
    async def task1():
        return "/bin/echo 5"

    @flow.function_task
    async def task2(t1_result):
        return int(t1_result.strip()) * 2 * 2

    # Create the workflow: task2 depends on task1 (waits for its output)
    t1_fut = task1()
    t2_result = await task2(t1_fut)

    print(t2_result)

    # Shut down the execution backend
    await flow.shutdown()


if __name__ == "__main__":
    asyncio.run(main())
```

- AI & LLM Workflows - Build complex AI agent systems and orchestrate multiple language-model calls in parallel with automatic dependency resolution.
- Data Processing Pipelines - Build data-science pipelines and real-time analytics with async task coordination.
- High-Performance Computing - Execute scientific computing workflows and distributed simulations at scale on HPC clusters.
- Cross-Platform Execution - Develop and test workflows locally, then deploy the same code to HPC infrastructure without changes.
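The parallel-with-dependencies pattern behind these use cases can be pictured with plain asyncio (purely illustrative, no AsyncFlow API): independent tasks fan out concurrently, and a dependent task awaits all their results — the behavior that AsyncFlow's task futures automate. The `fetch` and `combine` names here are hypothetical stand-ins.

```python
import asyncio


async def fetch(source: str) -> str:
    # Stand-in for an independent task (e.g. an LLM call or a data pull)
    await asyncio.sleep(0.01)
    return f"result from {source}"


async def combine(parts: list) -> str:
    # Dependent task: runs only after all upstream results are available
    return " | ".join(parts)


async def main() -> str:
    # Fan out: the three fetches run concurrently
    parts = await asyncio.gather(fetch("a"), fetch("b"), fetch("c"))
    # Fan in: combine depends on all three results
    return await combine(list(parts))


if __name__ == "__main__":
    print(asyncio.run(main()))  # result from a | result from b | result from c
```

With AsyncFlow, the same shape is expressed by passing task futures as arguments (as in the quickstart above), letting the engine resolve the dependency graph and schedule independent tasks in parallel on the chosen backend.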