1 change: 1 addition & 0 deletions assets/contributors.csv
Original file line number Diff line number Diff line change
@@ -116,3 +116,4 @@ Richard Burton,Arm,Burton2000,,,
Brendan Long,Arm,bccbrendan,https://www.linkedin.com/in/brendan-long-5817924/,,
Asier Arranz,NVIDIA,,asierarranz,,asierarranz.com
Prince Agyeman,Arm,,,,
Kavya Sri Chennoju,Arm,kavya-chennoju,kavya-sri-chennoju,,
@@ -50,6 +50,14 @@ further_reading:
title: Device Connect integration guide
link: https://github.com/atsyplikhin/robots/blob/feat/device-connect-integration-draft/strands_robots/device_connect/GUIDE.md
type: website
- resource:
title: device-connect-agent-tools on PyPI
link: https://pypi.org/project/device-connect-agent-tools
type: website
- resource:
title: device-connect-sdk on PyPI
link: https://pypi.org/project/device-connect-sdk
type: website

### FIXED, DO NOT MODIFY
# ================================================================================
@@ -6,25 +6,25 @@ weight: 2
layout: learningpathall
---

## Why connect AI agents to edge devices?
## Physical AI starts with connectivity

Arm processors are at the heart of a remarkable range of systems - from Cortex-M microcontrollers in industrial sensors to Neoverse servers running in the cloud. That breadth of hardware is one of Arm's greatest strengths, but it raises a practical question for AI developers: how do you give an agent structured, safe access to devices that are physically distributed and built on different software stacks?

Device Connect is Arm's answer to that question. It's a platform layer that handles device registration, discovery, and remote procedure calls across a network of devices, with no bespoke networking code required. Strands is an open-source agent SDK from AWS that takes a model-driven approach to building AI agents - an LLM calls Python tools in a structured reasoning loop, and the SDK handles the rest. When you combine them, an agent can ask "which devices are online and what can they do?" and then invoke a function on a specific device, turning natural language intent into physical action.
Natural language is becoming more than a software interface. Physical AI systems — robots, sensors, actuators — can now sense, decide, and act based on instructions from an LLM agent. But for that to work, the devices need to be reachable. They need a shared infrastructure for discovery, communication, and coordination.

This Learning Path puts both tools through their paces. It starts with a single machine, for example a laptop, where a simulated robot and an agent discover each other automatically, then extends to a two-machine setup where a Raspberry Pi joins the same device mesh over the network.
[Device Connect](https://github.com/arm/device-connect) is Arm's device-aware framework for exactly that. Once devices register through a shared mesh, agents can discover and command any of them without caring where they run. A fleet of robot arms, a network of sensors, or a mix of physical and simulated devices all become equally reachable.

## Device Connect architecture layers
[Strands Robots](https://github.com/strands-labs/robots) is a robot SDK that integrates Device Connect with the [AWS Strands Agents SDK](https://strandsagents.com/). Using Strands, an LLM can query the device mesh ("who's available?"), understand what each device can do, and dynamically invoke actions — turning natural language intent into real-world outcomes.

**Device layer**
This Learning Path starts on a single machine, where a simulated robot and an agent discover each other automatically, then optionally extends to a Raspberry Pi joining the same device mesh over the network.

A device is any process that registers itself on the mesh and exposes callable functions. In this Learning Path you'll create a simulated robot arm, namely the simulated robotic arm SO-100 from Hugging Face, from the `strands-robots` SDK. The moment this object is created, it registers on the local network under a unique device ID (for example, `so100_sim-abc23`) and begins publishing a presence heartbeat. No explicit registration call is required. Device Connect uses Zenoh as its underlying messaging transport, which handles low-level connectivity and routing automatically.
## How the pieces fit together

**Agent layer**
Two packages make this work.

Two interfaces sit at this layer. The `device-connect-agent-tools` package exposes `discover_devices()` and `invoke_device()` as plain Python functions you can call directly from a script or REPL, with no LLM involved. The `robot_mesh` tool from `robots` wraps the same capabilities as a Strands agent tool, which means an LLM can also call them during a reasoning loop. Both share the same underlying Device Connect transport, so anything you can do with one you can do with the other.
**`device-connect-sdk`** is the device-side runtime. Any process that wraps a robot in a `DeviceDriver` and starts a `DeviceRuntime` joins the mesh: it registers itself, announces what it can do, and starts publishing state events. Zenoh handles low-level connectivity; in device-to-device mode no broker or environment configuration is needed.

The diagram below shows how these layers communicate at runtime:
**`device-connect-agent-tools`** is the agent-side runtime. It exposes `discover_devices()` and `invoke_device()` as plain Python functions. The `robot_mesh` tool in Strands Robots wraps the same interface as a Strands tool, so an LLM can call it too. Both use the same underlying transport, so the same calls work whether the caller is a script or an agent.

```
┌──────────────────────────────────────┐
@@ -42,21 +42,15 @@ The diagram below shows how these layers communicate at runtime:
└──────────────────────────────────────┘
```

## How device discovery works
## What a device exposes

When the `SO-100 arm` instance starts, Device Connect automatically announces the device on the local network. Any process running `discover_devices()` or `robot_mesh(action='peers')` on the same network will hear the announcement and add the device to its live table of available hardware.
This Learning Path uses the SO-100, a simulated robot arm from Hugging Face. When `Robot('so100').run()` starts, the robot registers on the mesh and exposes three callable functions. These are the targets for `invoke_device()` on the agent side: calling `invoke_device("so100-abc123", "execute", {...})` routes the request over Zenoh to the robot process, executes the function there, and returns the result to the caller:

## What the simulated robot provides
- `execute` — send a natural language instruction and a policy provider to the robot
- `getStatus` — query what the robot is currently doing
- `stop` — halt the current task, or `emergency_stop` to halt every device on the mesh at once

When you run `Robot('so100')`, the SDK downloads the MuJoCo physics model for the SO-100 arm (this happens once on first run) and starts a local simulation. The robot exposes three functions that any agent can call via RPC:

- `execute` - start a task with a given instruction and policy provider
- `getStatus` - query the current task state
- `stop` - halt the current task

For this Learning Path, the `policy_provider='mock'` argument is used, which means `execute` accepts the call and returns `{'status': 'accepted'}` without actually running a motion policy. This keeps the focus on the connectivity and invocation patterns rather than robotics.

Once you have the flow working end to end, replacing `'mock'` with a real policy is a one-line change.
A motion policy is the component that translates a high-level instruction like "pick up the cube" into a sequence of joint movements. Different policy providers connect to different backends — from local model inference to remote policy servers. For this Learning Path, `policy_provider='mock'` is used, so `execute` accepts the task and returns immediately without running real motion. Replacing `'mock'` with a real provider like `'lerobot_local'` or `'groot'` is a one-line change once you have the connectivity working.
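To make that seam concrete, here is a minimal sketch of what a policy provider boils down to. The class and method names are hypothetical illustrations, not the real `strands-robots` API:

```python
# Hypothetical sketch of the policy-provider seam; illustrative names only,
# not the real strands-robots API.
class MockPolicyProvider:
    """Accepts any instruction without running a motion policy."""

    def execute(self, instruction: str) -> dict:
        # A real provider would plan and stream joint movements here;
        # the mock only acknowledges the task.
        return {"status": "accepted", "instruction": instruction}


class LocalPolicyProvider(MockPolicyProvider):
    """Placeholder for a real backend: same interface, real motion."""

    def execute(self, instruction: str) -> dict:
        # Same call shape as the mock, so the caller's code stays identical.
        return {"status": "started", "instruction": instruction}
```

Because both classes share one interface, the caller never changes - only the provider name does, which is why swapping `'mock'` for a real provider is a one-line change.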

## What you'll learn in this Learning Path

@@ -12,7 +12,7 @@ This section runs Device Connect's device-to-device discovery. There are two way

### Option 1: run on a single machine

For a proof-of-concept, follow the steps using two terminal windows on your machine with the virtual environment set up.
For a conceptual implementation, follow the steps using two terminal windows on your machine with the virtual environment set up.

### Option 2: run with real hardware

@@ -45,12 +45,22 @@

```python
python <<'PY'
import logging
logging.basicConfig(level=logging.INFO)
from strands_robots import Robot
r = Robot('so100')
r = Robot('so100', peer_id='so100-abc123')
r.run()
PY
```

When the `Robot('so100')` object is created, the SDK downloads the MuJoCo physics model for the SO-100 arm. This download happens only on the first run and takes a minute or two. After that, it starts the simulation and registers the robot on the Device Connect device mesh. The robot publishes a presence heartbeat every 0.5 seconds under a unique device ID, for example `so100-abc123`.
Two things happen when this script runs.

**`Robot('so100')`** calls the factory function, which checks for USB servo hardware and, finding none, creates a MuJoCo `Simulation` instance. On the first run it downloads the SO-100 MJCF physics model from Hugging Face — this can take up to 20 minutes depending on your connection. Subsequent runs use the local cache and start immediately.

**`r.run()`** calls `init_device_connect_sync()`, which does the following:

1. Creates a `SimulationDeviceDriver` — a Device Connect `DeviceDriver` adapter that wraps the simulation and maps its methods to structured RPCs.
2. Starts a `DeviceRuntime` with the Zenoh D2D backend. No broker or environment variables are needed; devices discover each other on the LAN via Zenoh multicast scouting.
3. Subscribes the device to its command topic: `device-connect.default.<PEER_ID>.cmd`.
4. Registers RPC handlers: `execute`, `getStatus`, `getFeatures`, `step`, `reset`, and `stop`.
5. Starts a 10Hz background loop that emits `stateUpdate` and `observationUpdate` events to any listener on the mesh.
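The command topic in step 3 follows a fixed layout. A small helper (illustrative only, not part of the SDK) shows how the pieces compose:

```python
def command_topic(peer_id: str, namespace: str = "default") -> str:
    # Topic layout described in step 3: device-connect.<namespace>.<peer_id>.cmd
    return f"device-connect.{namespace}.{peer_id}.cmd"
```

For the example device ID used in this Learning Path, `command_topic("so100-abc123")` returns `device-connect.default.so100-abc123.cmd`.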

You should see INFO-level log output similar to:

@@ -64,6 +74,17 @@

```output
device_connect_sdk.device.so100-abc123 - INFO - Subscribed to commands on device
```

Leave this process running. The simulated robot is only discoverable as long as this process is alive.

## Open a second terminal

Leave terminal 1 running with the robot process. Open a new terminal window, navigate to the repository, and activate the virtual environment:

```bash
cd ~/strands-device-connect/robots
source .venv/bin/activate
```

Run all remaining commands in this section from this second terminal.

## Control the robot using the robot_mesh Strands tool

The `robot_mesh` tool wraps the same discovery and invocation primitives as a Strands agent tool. You can call it directly from a Python script or attach it to an LLM agent; the API is identical either way.
@@ -79,6 +100,8 @@

```python
print(robot_mesh(action='peers'))
PY
```

When this runs, `robot_mesh` calls `_ensure_connected()`, which sets `MESSAGING_BACKEND=zenoh` (if the environment variable is not already set) and opens the agent-side Zenoh connection via `device_connect_agent_tools`. It then calls `conn.list_devices()`, which queries the Zenoh network for all registered Device Connect devices. Each device reports its `device_id`, its `device_type`, its availability (from its `DeviceStatus`), and the list of RPC functions it exposes. The tool formats this into the human-readable summary below.
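As an illustration, the per-device records behind that summary can be pictured as plain dictionaries. The field names below are assumptions based on the description above, not the exact wire format:

```python
# Illustrative device records, roughly as a list_devices()-style call
# might return them. Field names are assumptions, not the real schema.
devices = [
    {"device_id": "so100-abc123", "device_type": "so100",
     "available": True, "functions": ["execute", "getStatus", "stop"]},
]

def summarize(devs):
    # Render one human-readable line per device, loosely mirroring
    # the summary that robot_mesh prints.
    lines = []
    for d in devs:
        state = "available" if d["available"] else "busy"
        fns = ", ".join(d["functions"])
        lines.append(f"{d['device_id']} ({d['device_type']}, {state}): {fns}")
    return "\n".join(lines)
```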

The output is similar to:

```output
@@ -105,6 +128,8 @@ print(robot_mesh(
PY
```

Under the hood, `robot_mesh` calls `conn.invoke(target, "execute", params)`, which serializes the arguments and routes them over Zenoh to the device's command topic: `device-connect.default.<PEER_ID>.cmd`. The `SimulationDeviceDriver.execute()` RPC handler on the robot side receives the call, resolves the robot name inside the simulation world, and calls `sim.start_policy(instruction=..., policy_provider='mock', ...)`. With the mock policy provider, the handler returns immediately with a success result without running real motion — the call is a connectivity and RPC round-trip test. The `stateUpdate` events you'll see in terminal 1 are published by the separate 10Hz background loop that was started by `r.run()`.
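The device-side half of that round trip is essentially a dispatch table mapping RPC names to handler methods. A minimal sketch, with hypothetical names rather than the real `SimulationDeviceDriver`:

```python
# Minimal sketch of device-side RPC dispatch; hypothetical names,
# not the real Device Connect SDK classes.
class DriverSketch:
    def __init__(self):
        self.policy_running = False

    def execute(self, instruction: str, policy_provider: str = "mock", **_):
        if policy_provider == "mock":
            # Accept the call without running motion, as described above.
            return {"status": "accepted"}
        self.policy_running = True
        return {"status": "started"}

    def get_status(self):
        return {"policy_running": self.policy_running}

    def handle(self, fn: str, params: dict):
        # Route an incoming RPC name to the matching handler method.
        handlers = {"execute": self.execute, "getStatus": self.get_status}
        return handlers[fn](**params)
```

With `policy_provider='mock'` the handler returns immediately, which matches the round-trip behavior described above.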

You will see the following output:

```output
@@ -135,6 +160,8 @@ print(robot_mesh(action='emergency_stop'))
PY
```

This doesn't send a single broadcast message. Instead, `robot_mesh` first calls `conn.list_devices()` to enumerate every device currently on the mesh, then calls `conn.invoke(device_id, "stop", timeout=3.0)` on each one in sequence. On the device side, `SimulationDeviceDriver.stop()` sets `policy_running = False` for every robot in the simulation world and returns immediately. Failures per device are swallowed so that a single unresponsive device doesn't block the rest from stopping.
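That enumerate-then-stop pattern can be sketched in a few lines. The helper below is a hypothetical illustration of the described behavior, not the real implementation:

```python
def emergency_stop(list_devices, invoke):
    """Fan out 'stop' to every device; one failure doesn't block the rest."""
    results = {}
    for device_id in list_devices():
        try:
            results[device_id] = invoke(device_id, "stop", timeout=3.0)
        except Exception as exc:
            # Swallow per-device failures so the loop keeps going.
            results[device_id] = {"error": str(exc)}
    return results

# Exercise with fakes: one device times out, the other stops cleanly.
def fake_invoke(device_id, fn, timeout):
    if device_id == "flaky-01":
        raise TimeoutError("no response")
    return {"status": "stopped"}

print(emergency_stop(lambda: ["so100-abc123", "flaky-01"], fake_invoke))
```

The try/except inside the loop is the key design choice: an unresponsive device costs at most one timeout, and every other device still receives its stop call.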

The output is similar to:

```output
```
@@ -26,7 +26,6 @@ docker compose version
```bash
cd ~/strands-device-connect
git clone --depth 1 https://github.com/arm/device-connect.git

```

## Machine and terminal layout
@@ -46,7 +45,6 @@ In host terminal 1, bring up the Device Connect infrastructure stack. The Compos
```bash
cd ~/strands-device-connect/device-connect/packages/device-connect-server
docker compose -f infra/docker-compose-dev.yml up -d
cd ../../..
```

Confirm the services are healthy:
@@ -96,7 +94,7 @@ Replace `HOST_IP` with the address you noted in Step 2. `DEVICE_CONNECT_ALLOW_IN

## Step 4 - start the robot on the Raspberry Pi

On the Raspberry Pi, with the environment active and the variables set, start the simulated SO-100 robot:
On the Raspberry Pi, with the environment active and the variables set in your `robots` directory, start the simulated SO-100 robot:

```python
python <<'PY'
```
@@ -15,26 +15,25 @@

```bash
python3.12 --version
git --version
```

These instructions are tested on Python 3.12. Earlier versions of Python 3 may work but are not validated against the `feat/device-connect-integration-draft` branch used in this Learning Path.
These instructions are tested on Python 3.12. Earlier versions of Python 3 may work but are not validated against the `dev` branch used in this Learning Path.

## Clone the repository

The code run in this Learning Path sits in the `robots` repository. It contains the robot runtime and the `robot_mesh` Strands tool.
The code used in this Learning Path sits in a branch of the `robots` repository. It contains the robot runtime and the `robot_mesh` Strands tool.

```bash
mkdir ~/strands-device-connect
cd ~/strands-device-connect
git clone https://github.com/atsyplikhin/robots.git
git clone https://github.com/strands-labs/robots.git
```

## Check out the integration branch

The Device Connect integration code for `robots` lives on the `feat/device-connect-integration-draft` branch. This branch adds the `RobotDeviceDriver` adapter and the updated `robot_mesh` tool that routes calls through the Device Connect SDK rather than the raw Zenoh mesh.
The Device Connect integration code for `robots` lives on the `dev` branch. This branch adds the `RobotDeviceDriver` adapter and the updated `robot_mesh` tool that routes calls through the Device Connect SDK rather than the raw Zenoh mesh.

```bash
cd ~/strands-device-connect/robots
git checkout feat/device-connect-integration-draft
cd ..
git switch dev
```

## Create a Python virtual environment
@@ -63,7 +62,7 @@ This means discovery works as long as the device process and the agent process a

At this point you've:

- Cloned `robots` with the `feat/device-connect-integration-draft` branch checked out.
- Cloned `robots` with the `dev` branch checked out.
- Created a Python 3.12 virtual environment with the Device Connect SDK, agent tools, and robot simulation runtime all installed.

The next section walks you through starting a simulated robot and invoking it from both the agent tools and the `robot_mesh` Strands tool.