From 30775ce608c8a8ae079242ba6d3d848dbcf97bab Mon Sep 17 00:00:00 2001 From: Annie Tallund Date: Tue, 17 Mar 2026 16:36:55 -0700 Subject: [PATCH 1/7] Update framing, incorporate feedback --- assets/contributors.csv | 1 + .../device-connect-strands/background.md | 38 ++++++++----------- .../device-connect-strands/run-example.md | 33 +++++++++++++++- .../run-infra-example.md | 1 - .../device-connect-strands/setup.md | 1 - 5 files changed, 47 insertions(+), 27 deletions(-) diff --git a/assets/contributors.csv b/assets/contributors.csv index d0ce9ef7a2..5af85583e1 100644 --- a/assets/contributors.csv +++ b/assets/contributors.csv @@ -116,3 +116,4 @@ Richard Burton,Arm,Burton2000,,, Brendan Long,Arm,bccbrendan,https://www.linkedin.com/in/brendan-long-5817924/,, Asier Arranz,NVIDIA,,asierarranz,,asierarranz.com Prince Agyeman,Arm,,,, +Kavya Sri Chennoju,Arm,kavya-chennoju,kavya-sri-chennoju,, diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md index 5f5b1e9328..3258ccd083 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md +++ b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md @@ -6,25 +6,23 @@ weight: 2 layout: learningpathall --- -## Why connect AI agents to edge devices? +## Physical AI starts with connectivity -Arm processors are at the heart of a remarkable range of systems - from Cortex-M microcontrollers in industrial sensors to Neoverse servers running in the cloud. That breadth of hardware is one of Arm's greatest strengths, but it raises a practical question for AI developers: how do you give an agent structured, safe access to devices that are physically distributed and built on different software stacks? +Natural language is becoming more than a software interface. 
Physical AI systems — robots, sensors, actuators — can now sense, decide, and act based on instructions from an LLM agent. But for that to work, the devices need to be reachable. They need a shared infrastructure for discovery, communication, and coordination. -Device Connect is Arm's answer to that question. It's a platform layer that handles device registration, discovery, and remote procedure calls across a network of devices, with no bespoke networking code required. Strands is an open-source agent SDK from AWS that takes a model-driven approach to building AI agents - an LLM calls Python tools in a structured reasoning loop, and the SDK handles the rest. When you combine them, an agent can ask "which devices are online and what can they do?" and then invoke a function on a specific device, turning natural language intent into physical action. +[Device Connect](https://github.com/arm/device-connect) is Arm's device-aware framework for exactly that. Once devices register through a shared mesh, agents can discover and command any of them without caring where they run. A fleet of robot arms, a network of sensors, or a mix of physical and simulated devices all become equally reachable. -This Learning Path puts both tools through their paces. It starts with a single machine, for example a laptop, where a simulated robot and an agent discover each other automatically, then extends to a two-machine setup where a Raspberry Pi joins the same device mesh over the network. +[Strands Robots](https://github.com/strands-labs/robots) is a robot SDK that integrates Device Connect with the [AWS Strands Agents SDK](https://github.com/strands-labs/sdk). Using Strands, an LLM can query the device mesh ("who's available?"), understand what each device can do, and dynamically invoke actions — turning natural language intent into real-world outcomes. 
-## Device Connect architecture layers +This Learning Path starts on a single machine, where a simulated robot and an agent discover each other automatically, then optionally extends to a Raspberry Pi joining the same device mesh over the network. -**Device layer** +## How the pieces fit together -A device is any process that registers itself on the mesh and exposes callable functions. In this Learning Path you'll create a simulated robot arm, namely the simulated robotic arm SO-100 from Hugging Face, from the `strands-robots` SDK. The moment this object is created, it registers on the local network under a unique device ID (for example, `so100_sim-abc23`) and begins publishing a presence heartbeat. No explicit registration call is required. Device Connect uses Zenoh as its underlying messaging transport, which handles low-level connectivity and routing automatically. +Two packages make this work. -**Agent layer** +**`device_connect_sdk`** is the device-side runtime. Any process that wraps a robot in a `DeviceDriver` and starts a `DeviceRuntime` joins the mesh: it registers itself, announces what it can do, and starts publishing state events. Zenoh handles low-level connectivity; in device-to-device mode no broker or environment configuration is needed. -Two interfaces sit at this layer. The `device-connect-agent-tools` package exposes `discover_devices()` and `invoke_device()` as plain Python functions you can call directly from a script or REPL, with no LLM involved. The `robot_mesh` tool from `robots` wraps the same capabilities as a Strands agent tool, which means an LLM can also call them during a reasoning loop. Both share the same underlying Device Connect transport, so anything you can do with one you can do with the other. - -The diagram below shows how these layers communicate at runtime: +**`device-connect-agent-tools`** is the agent-side runtime. It exposes `discover_devices()` and `invoke_device()` as plain Python functions. 
The `robot_mesh` tool in Strands Robots wraps the same interface as a Strands tool, so an LLM can call it too. Both use the same underlying transport, so the same calls work whether the caller is a script or an agent. ``` ┌──────────────────────────────────────┐ @@ -42,21 +40,15 @@ The diagram below shows how these layers communicate at runtime: └──────────────────────────────────────┘ ``` -## How device discovery works - -When the `SO-100 arm` instance starts, Device Connect automatically announces the device on the local network. Any process running `discover_devices()` or `robot_mesh(action='peers')` on the same network will hear the announcement and add the device to its live table of available hardware. - -## What the simulated robot provides - -When you run `Robot('so100')`, the SDK downloads the MuJoCo physics model for the SO-100 arm (this happens once on first run) and starts a local simulation. The robot exposes three functions that any agent can call via RPC: +## What a device exposes -- `execute` - start a task with a given instruction and policy provider -- `getStatus` - query the current task state -- `stop` - halt the current task +This Learning Path uses the SO-100 arm, a simulated robot arm from Hugging Face. When a `Robot('so100').run()` is initiated in the Device Connect framework, it registers on the mesh and exposes three key callable functions to any agent: -For this Learning Path, the `policy_provider='mock'` argument is used, which means `execute` accepts the call and returns `{'status': 'accepted'}` without actually running a motion policy. This keeps the focus on the connectivity and invocation patterns rather than robotics. 
+- `execute` — send a natural language instruction and a policy provider to the robot +- `getStatus` — query what the robot is currently doing +- `stop` — halt the current task, or `emergency_stop` to halt every device on the mesh at once -Once you have the flow working end to end, replacing `'mock'` with a real policy is a one-line change. +For this Learning Path, `policy_provider='mock'` is used, so `execute` accepts the task and returns immediately without running real motion. Replacing `'mock'` with a real provider like `'lerobot_local'` or `'groot'` is a one-line change once you have the connectivity working. ## What you'll learn in this Learning Path diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-example.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-example.md index 759306eac5..9a796f4d02 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-example.md +++ b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-example.md @@ -45,12 +45,24 @@ python <<'PY' import logging logging.basicConfig(level=logging.INFO) from strands_robots import Robot -r = Robot('so100') +r = Robot('so100', peer_id='so100-abc123') r.run() PY ``` -When the `Robot('so100')` object is created, the SDK downloads the MuJoCo physics model for the SO-100 arm. This download happens only on the first run and takes a minute or two. After that, it starts the simulation and registers the robot on the Device Connect device mesh. The robot publishes a presence heartbeat every 0.5 seconds under a unique device ID, for example `so100-abc123`. +Two things happen when this script runs. + +**`Robot('so100')`** calls the factory function, which checks for USB servo hardware and, finding none, creates a MuJoCo `Simulation` instance. 
On the first run it downloads the SO-100 MJCF physics model from Hugging Face — this can take up to 20 minutes depending on your connection. Subsequent runs use the local cache and start immediately. + +**`r.run()`** calls `init_device_connect_sync()`, which does the following: + +1. Creates a `SimulationDeviceDriver` — a Device Connect `DeviceDriver` adapter that wraps the simulation and maps its methods to structured RPCs. +2. Starts a `DeviceRuntime` with the Zenoh D2D backend. No broker or environment variables are needed; devices discover each other on the LAN via Zenoh multicast scouting. +3. Subscribes the device to its command topic: `device-connect.default..cmd`. +4. Registers RPC handlers: `execute`, `getStatus`, `getFeatures`, `step`, `reset`, and `stop`. +5. Starts a 10Hz background loop that emits `stateUpdate` and `observationUpdate` events to any listener on the mesh. + +The process then blocks in a loop, keeping the device registered and reachable. The robot is only discoverable for as long as this process is running. You should see INFO-level log output similar to: @@ -64,6 +76,17 @@ device_connect_sdk.device.so100-abc123 - INFO - Subscribed to commands on device Leave this process running. The simulated robot is only discoverable as long as this process is alive. +## Open a second terminal + +Leave terminal 1 running with the robot process. Open a new terminal window, navigate to the repository, and activate the virtual environment: + +```bash +cd ~/strands-device-connect/robots +source .venv/bin/activate +``` + +Run all remaining commands in this section from this second terminal. + ## Control the robot using the robot_mesh Strands tool The `robot_mesh` tool wraps the same discovery and invocation primitives as a Strands agent tool. You can call it directly from a Python script or attach it to an LLM agent; the API is identical either way. 
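+The dual-use design (one function, two kinds of callers) can be sketched in plain Python. Everything below, `robot_mesh_stub` and `ToyAgent`, is a hypothetical stand-in for illustration, not the real `strands_robots` or Strands Agents API:
+
+```python
+from typing import Optional
+
+def robot_mesh_stub(action: str, target: Optional[str] = None) -> str:
+    """Toy mesh tool: dispatch on `action`, as the real tool does."""
+    if action == "peers":
+        return "1 device online: so100_sim"
+    if action == "execute" and target is not None:
+        return f"accepted task on {target}"
+    return f"unknown action: {action}"
+
+# Caller 1: a script or REPL invokes the function directly.
+print(robot_mesh_stub(action="peers"))
+
+# Caller 2: a toy agent loop resolves the same function from a tool table
+# by name; an LLM-driven loop does the same via structured tool calls.
+class ToyAgent:
+    def __init__(self, tools):
+        self.tools = {tool.__name__: tool for tool in tools}
+
+    def call(self, tool_name: str, **kwargs):
+        return self.tools[tool_name](**kwargs)
+
+agent = ToyAgent([robot_mesh_stub])
+print(agent.call("robot_mesh_stub", action="execute", target="so100_sim"))
+```
+
+Because the tool is a plain function, the direct call and the agent-mediated call exercise identical code, which is why the same invocation patterns work in either mode.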
@@ -79,6 +102,8 @@ print(robot_mesh(action='peers')) PY ``` +When this runs, `robot_mesh` calls `_ensure_connected()`, which sets `MESSAGING_BACKEND=zenoh` (if the environment variable is not already set) and opens the agent-side Zenoh connection via `device_connect_agent_tools`. It then calls `conn.list_devices()`, which queries the Zenoh network for all registered Device Connect devices. Each device registers its `device_id`, `device_type`, availability from its `DeviceStatus`, and the list of RPC functions it exposes. The tool formats this into the human-readable summary below. + The output is similar to: ```output @@ -105,6 +130,8 @@ print(robot_mesh( PY ``` +Under the hood, `robot_mesh` calls `conn.invoke(target, "execute", params)`, which serializes the arguments and routes them over Zenoh to the device's command topic: `device-connect.default..cmd`. The `SimulationDeviceDriver.execute()` RPC handler on the robot side receives the call, resolves the robot name inside the simulation world, and calls `sim.start_policy(instruction=..., policy_provider='mock', ...)`. With the mock policy provider, the handler returns immediately with a success result without running real motion — the call is a connectivity and RPC round-trip test. The `stateUpdate` events you'll see in terminal 1 are published by the separate 10Hz background loop that was started by `r.run()`. + You will see the following output: ```output @@ -135,6 +162,8 @@ print(robot_mesh(action='emergency_stop')) PY ``` +This doesn't send a single broadcast message. Instead, `robot_mesh` first calls `conn.list_devices()` to enumerate every device currently on the mesh, then calls `conn.invoke(device_id, "stop", timeout=3.0)` on each one in sequence. On the device side, `SimulationDeviceDriver.stop()` sets `policy_running = False` for every robot in the simulation world and returns immediately. Failures per device are swallowed so that a single unresponsive device doesn't block the rest from stopping. 
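+As a minimal sketch (assuming a hypothetical connection object with `list_devices()` and `invoke()`; the real agent-tools signatures may differ), the fan-out looks like this:
+
+```python
+def emergency_stop_all(conn, timeout: float = 3.0) -> dict:
+    """Invoke 'stop' on every device in turn; record failures instead of
+    raising so one unresponsive device cannot block the rest."""
+    results = {}
+    for device_id in conn.list_devices():
+        try:
+            results[device_id] = conn.invoke(device_id, "stop", timeout=timeout)
+        except Exception as exc:
+            results[device_id] = f"failed: {exc}"
+    return results
+
+# Toy in-memory connection to exercise the pattern.
+class FakeConn:
+    def list_devices(self):
+        return ["so100-abc123", "so100-dead99"]
+
+    def invoke(self, device_id, fn, timeout=3.0):
+        if device_id == "so100-dead99":
+            raise TimeoutError("no response")
+        return {"status": "stopped"}
+
+print(emergency_stop_all(FakeConn()))
+# {'so100-abc123': {'status': 'stopped'}, 'so100-dead99': 'failed: no response'}
+```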
+ The output is similar to: ```output diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-infra-example.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-infra-example.md index 7d3adbc131..12d9984bb4 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-infra-example.md +++ b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-infra-example.md @@ -26,7 +26,6 @@ docker compose version ```bash cd ~/strands-device-connect git clone --depth 1 https://github.com/arm/device-connect.git - ``` ## Machine and terminal layout diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md index 9730e2282a..0b8a7baf1f 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md +++ b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md @@ -34,7 +34,6 @@ The Device Connect integration code for `robots` lives on the `feat/device-conne ```bash cd ~/strands-device-connect/robots git checkout feat/device-connect-integration-draft -cd .. 
``` ## Create a Python virtual environment From 26d58efff3c2eccb64b26314a336adf56cba8c5d Mon Sep 17 00:00:00 2001 From: Annie Tallund Date: Tue, 17 Mar 2026 16:43:00 -0700 Subject: [PATCH 2/7] Address more feedback --- .../device-connect-strands/background.md | 4 ++-- .../device-connect-strands/run-example.md | 2 +- .../device-connect-strands/run-infra-example.md | 3 +-- 3 files changed, 4 insertions(+), 5 deletions(-) diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md index 3258ccd083..f3cb919bec 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md +++ b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md @@ -42,13 +42,13 @@ Two packages make this work. ## What a device exposes -This Learning Path uses the SO-100 arm, a simulated robot arm from Hugging Face. When a `Robot('so100').run()` is initiated in the Device Connect framework, it registers on the mesh and exposes three key callable functions to any agent: +This Learning Path uses the SO-100 arm, a simulated robot arm from Hugging Face. When `Robot('so100').run()` starts, it registers on the mesh and exposes three callable functions. These are what `invoke_device()` on the agent side targets — calling `invoke_device("so100-abc123", "execute", {...})` routes a request over Zenoh to the robot process and executes the function there, returning the result back to the caller: - `execute` — send a natural language instruction and a policy provider to the robot - `getStatus` — query what the robot is currently doing - `stop` — halt the current task, or `emergency_stop` to halt every device on the mesh at once -For this Learning Path, `policy_provider='mock'` is used, so `execute` accepts the task and returns immediately without running real motion. 
Replacing `'mock'` with a real provider like `'lerobot_local'` or `'groot'` is a one-line change once you have the connectivity working. +A motion policy is the component that translates a high-level instruction like "pick up the cube" into a sequence of joint movements. Different policy providers connect to different backends — from local model inference to remote policy servers. For this Learning Path, `policy_provider='mock'` is used, so `execute` accepts the task and returns immediately without running real motion. Replacing `'mock'` with a real provider like `'lerobot_local'` or `'groot'` is a one-line change once you have the connectivity working.

## What you'll learn in this Learning Path

diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-example.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-example.md index 9a796f4d02..8be254550a 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-example.md +++ b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-example.md @@ -12,7 +12,7 @@ This section runs Device Connect's device-to-device discovery. There are two way

### Option 1: run on a single machine

-For a proof-of-concept, follow the steps using two terminal windows on your machine with the virtual environment set up. +To try the flow quickly, follow the steps using two terminal windows on your machine with the virtual environment set up. 
### Option 2: run with real hardware diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-infra-example.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-infra-example.md index 12d9984bb4..a466aeaf54 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-infra-example.md +++ b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-infra-example.md @@ -45,7 +45,6 @@ In host terminal 1, bring up the Device Connect infrastructure stack. The Compos ```bash cd ~/strands-device-connect/device-connect/packages/device-connect-server docker compose -f infra/docker-compose-dev.yml up -d -cd ../../.. ``` Confirm the services are healthy: @@ -95,7 +94,7 @@ Replace `HOST_IP` with the address you noted in Step 2. `DEVICE_CONNECT_ALLOW_IN ## Step 4 - start the robot on the Raspberry Pi -On the Raspberry Pi, with the environment active and the variables set, start the simulated SO-100 robot: +On the Raspberry Pi, with the environment active and the variables set in your `robots` directory, start the simulated SO-100 robot: ```python python <<'PY' From 9a7715871f04d6ac176317b07c96be348612a321 Mon Sep 17 00:00:00 2001 From: Annie Tallund Date: Wed, 18 Mar 2026 10:26:06 -0700 Subject: [PATCH 3/7] Additional resources, final edits --- .../device-connect-strands/_index.md | 8 ++++++++ .../device-connect-strands/background.md | 2 +- .../device-connect-strands/run-example.md | 2 -- .../device-connect-strands/setup.md | 2 +- 4 files changed, 10 insertions(+), 4 deletions(-) diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/_index.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/_index.md index 34ab1ff490..2764dfec9d 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/_index.md +++ 
b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/_index.md @@ -50,6 +50,14 @@ further_reading: title: Device Connect integration guide link: https://github.com/atsyplikhin/robots/blob/feat/device-connect-integration-draft/strands_robots/device_connect/GUIDE.md type: website + - resource: + title: device-connect-agent-tools on PyPI + link: https://pypi.org/project/device-connect-agent-tools + type: website + - resource: + title: device-connect-sdk on PyPI + link: https://pypi.org/project/device-connect-sdk + type: website ### FIXED, DO NOT MODIFY # ================================================================================ diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md index f3cb919bec..4084dea5f0 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md +++ b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md @@ -20,7 +20,7 @@ This Learning Path starts on a single machine, where a simulated robot and an ag Two packages make this work. -**`device_connect_sdk`** is the device-side runtime. Any process that wraps a robot in a `DeviceDriver` and starts a `DeviceRuntime` joins the mesh: it registers itself, announces what it can do, and starts publishing state events. Zenoh handles low-level connectivity; in device-to-device mode no broker or environment configuration is needed. +**`device-connect-sdk`** is the device-side runtime. Any process that wraps a robot in a `DeviceDriver` and starts a `DeviceRuntime` joins the mesh: it registers itself, announces what it can do, and starts publishing state events. Zenoh handles low-level connectivity; in device-to-device mode no broker or environment configuration is needed. **`device-connect-agent-tools`** is the agent-side runtime. 
It exposes `discover_devices()` and `invoke_device()` as plain Python functions. The `robot_mesh` tool in Strands Robots wraps the same interface as a Strands tool, so an LLM can call it too. Both use the same underlying transport, so the same calls work whether the caller is a script or an agent. diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-example.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-example.md index 8be254550a..38cf9f9440 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-example.md +++ b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/run-example.md @@ -62,8 +62,6 @@ Two things happen when this script runs. 4. Registers RPC handlers: `execute`, `getStatus`, `getFeatures`, `step`, `reset`, and `stop`. 5. Starts a 10Hz background loop that emits `stateUpdate` and `observationUpdate` events to any listener on the mesh. -The process then blocks in a loop, keeping the device registered and reachable. The robot is only discoverable for as long as this process is running. - You should see INFO-level log output similar to: ```output diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md index 0b8a7baf1f..5ff6e9ae6a 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md +++ b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md @@ -19,7 +19,7 @@ These instructions are tested on Python 3.12. Earlier versions of Python 3 may w ## Clone the repository -The code run in this Learning Path sits in the `robots` repository. It contains the robot runtime and the `robot_mesh` Strands tool. +The code run in this Learning Path sits in a fork of the `robots` repository. 
It contains the robot runtime and the `robot_mesh` Strands tool. ```bash mkdir ~/strands-device-connect From b293ff9f0bd0005ec53008cce8972bf8bd607d9f Mon Sep 17 00:00:00 2001 From: Annie Tallund Date: Wed, 18 Mar 2026 12:59:46 -0700 Subject: [PATCH 4/7] Update link --- .../device-connect-strands/background.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md index 4084dea5f0..f57cd6824f 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md +++ b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md @@ -12,7 +12,7 @@ Natural language is becoming more than a software interface. Physical AI systems [Device Connect](https://github.com/arm/device-connect) is Arm's device-aware framework for exactly that. Once devices register through a shared mesh, agents can discover and command any of them without caring where they run. A fleet of robot arms, a network of sensors, or a mix of physical and simulated devices all become equally reachable. -[Strands Robots](https://github.com/strands-labs/robots) is a robot SDK that integrates Device Connect with the [AWS Strands Agents SDK](https://github.com/strands-labs/sdk). Using Strands, an LLM can query the device mesh ("who's available?"), understand what each device can do, and dynamically invoke actions — turning natural language intent into real-world outcomes. +[Strands Robots](https://github.com/strands-labs/robots) is a robot SDK that integrates Device Connect with the [AWS Strands Agents SDK](https://strandsagents.com/). Using Strands, an LLM can query the device mesh ("who's available?"), understand what each device can do, and dynamically invoke actions — turning natural language intent into real-world outcomes. 
This Learning Path starts on a single machine, where a simulated robot and an agent discover each other automatically, then optionally extends to a Raspberry Pi joining the same device mesh over the network. From 054c59aaee9832f60345e820d5a5c825dd73c2a5 Mon Sep 17 00:00:00 2001 From: Annie Tallund Date: Wed, 18 Mar 2026 13:29:48 -0700 Subject: [PATCH 5/7] Add back blurb --- .../device-connect-strands/background.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md index f57cd6824f..70152d90db 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md +++ b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/background.md @@ -8,6 +8,8 @@ layout: learningpathall ## Physical AI starts with connectivity +Arm processors are at the heart of a remarkable range of systems - from Cortex-M microcontrollers in industrial sensors to Neoverse servers running in the cloud. That breadth of hardware is one of Arm’s greatest strengths, but it raises a practical question for AI developers: how do you give an agent structured, safe access to devices that are physically distributed and built on different software stacks? + Natural language is becoming more than a software interface. Physical AI systems — robots, sensors, actuators — can now sense, decide, and act based on instructions from an LLM agent. But for that to work, the devices need to be reachable. They need a shared infrastructure for discovery, communication, and coordination. [Device Connect](https://github.com/arm/device-connect) is Arm's device-aware framework for exactly that. Once devices register through a shared mesh, agents can discover and command any of them without caring where they run. 
A fleet of robot arms, a network of sensors, or a mix of physical and simulated devices all become equally reachable.

From 87cd667cacf6cf169636265b453a400a29d69f52 Mon Sep 17 00:00:00 2001 From: Annie Tallund Date: Wed, 18 Mar 2026 14:08:12 -0700 Subject: [PATCH 6/7] Update URL --- .../device-connect-strands/setup.md | 4 ++-- 1 file changed, 2 insertions(+), 2 deletions(-)

diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md index 5ff6e9ae6a..9d67c3eabf 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md +++ b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md @@ -24,7 +24,7 @@ The code run in this Learning Path sits in a fork of the `robots` repository. It ```bash mkdir ~/strands-device-connect cd strands-device-connect -git clone https://github.com/atsyplikhin/robots.git +git clone https://github.com/strands-labs/robots.git ``` ## Check out the integration branch @@ -33,7 +33,7 @@ The Device Connect integration code for `robots` lives on the `feat/device-conne ```bash cd ~/strands-device-connect/robots -git checkout feat/device-connect-integration-draft +git switch dev ``` ## Create a Python virtual environment

From e5cfe1f8d0d27ce06d1d3dbe2e1cdeb133c71ebf Mon Sep 17 00:00:00 2001 From: Annie Tallund Date: Wed, 18 Mar 2026 14:08:55 -0700 Subject: [PATCH 7/7] Update repo URL --- .../device-connect-strands/setup.md | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-)

diff --git a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md index 9d67c3eabf..c27f932b7b 100644 --- a/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md +++ 
b/content/learning-paths/embedded-and-microcontrollers/device-connect-strands/setup.md @@ -15,11 +15,11 @@ python3.12 --version git --version ``` -These instructions are tested on Python 3.12. Earlier versions of Python 3 may work but are not validated against the `feat/device-connect-integration-draft` branch used in this Learning Path. +These instructions are tested on Python 3.12. Earlier versions of Python 3 may work but are not validated against the `dev` branch used in this Learning Path. ## Clone the repository -The code run in this Learning Path sits in a fork of the `robots` repository. It contains the robot runtime and the `robot_mesh` Strands tool. +The code run in this Learning Path sits in a branch of the `robots` repository. It contains the robot runtime and the `robot_mesh` Strands tool. ```bash mkdir ~/strands-device-connect @@ -29,7 +29,7 @@ git clone https://github.com/strands-labs/robots.git ``` ## Check out the integration branch -The Device Connect integration code for `robots` lives on the `feat/device-connect-integration-draft` branch. This branch adds the `RobotDeviceDriver` adapter and the updated `robot_mesh` tool that routes calls through the Device Connect SDK rather than the raw Zenoh mesh. +The Device Connect integration code for `robots` lives on the `dev` branch. This branch adds the `RobotDeviceDriver` adapter and the updated `robot_mesh` tool that routes calls through the Device Connect SDK rather than the raw Zenoh mesh. ```bash cd ~/strands-device-connect/robots @@ -62,7 +62,7 @@ This means discovery works as long as the device process and the agent process a At this point you've: -- Cloned `robots` with the `feat/device-connect-integration-draft` branch checked out. +- Cloned `robots` with the `dev` branch checked out. - Created a Python 3.12 virtual environment with the Device Connect SDK, agent tools, and robot simulation runtime all installed. 
The next section walks you through starting a simulated robot and invoking it from both the agent tools and the `robot_mesh` Strands tool. \ No newline at end of file
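

For orientation, the three-function contract described in background.md (`execute`, `getStatus`, `stop`) behaves roughly like this sketch when the mock policy provider is used. All names here are illustrative stand-ins, not the real `device_connect_sdk` API:

```python
class MockArmDriver:
    """Illustrative device driver exposing the RPC surface an agent invokes."""

    def __init__(self):
        self.current_task = None

    def execute(self, instruction: str, policy_provider: str = "mock") -> dict:
        # With the mock provider the task is accepted but no motion runs.
        self.current_task = instruction
        return {"status": "accepted", "policy_provider": policy_provider}

    def getStatus(self) -> dict:
        return {"busy": self.current_task is not None, "task": self.current_task}

    def stop(self) -> dict:
        self.current_task = None
        return {"status": "stopped"}

arm = MockArmDriver()
print(arm.execute("pick up the cube"))  # {'status': 'accepted', 'policy_provider': 'mock'}
print(arm.getStatus())                  # {'busy': True, 'task': 'pick up the cube'}
print(arm.stop())                       # {'status': 'stopped'}
```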