A hybrid robotics project that combines:
- ROS 2 (Python) perception + high-level control on a NavQ+ Linux companion computer.
- Bare-metal C motor/servo control on an NXP K64F microcontroller.
- UART bridge + Bluetooth manual commands for switching between manual and AI-assisted driving.
The current implementation performs object detection with TensorFlow Lite, selects a primary target bounding box, and streams either manual command bytes or `<BBOX,...>` AI messages to the K64F firmware.
- `src/navq_vision/` — ROS 2 package (ament_python) containing camera, detection, Bluetooth, and UART nodes.
- `launch/autobot_launch.py` — launch script intended to start the core ROS nodes.
- `startup.sh` — helper script that sources ROS 2 and launches the stack.
- `128 Project/Sources/main.c` — bare-metal firmware for the K64F (motor PWM, servo steering, UART parsing, AI/manual mode state machine).
- `*.tflite`, `model/correct` — model artifacts used by the vision pipeline.
- `bluetooth_node.py` reads incoming command integers from `/dev/rfcomm0`.
- It publishes each value to the ROS topic `bluetooth_commands` (`std_msgs/Int32`).
- `uart_node.py` forwards those bytes to `/dev/ttymxc2` for the K64F.
- K64F firmware interprets command bytes `0..10` and drives the motors/servo accordingly.
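The manual path reduces to validating a Bluetooth integer and packing it as the one raw byte written to the K64F. A minimal sketch of that step, assuming the `0..10` range from the command table below (the helper name and range check are illustrative, not taken from `uart_node.py`):

```python
# Illustrative helper mirroring the manual command path: a Bluetooth
# integer in 0..10 is validated and packed as the single raw byte that
# uart_node.py writes to /dev/ttymxc2. The function name is hypothetical.
VALID_COMMANDS = range(0, 11)  # 0..10, per the K64F command table

def encode_manual_command(cmd: int) -> bytes:
    """Pack one validated manual command as a single raw byte."""
    if cmd not in VALID_COMMANDS:
        raise ValueError(f"command {cmd} outside supported range 0..10")
    return bytes([cmd])
```

Validating before writing keeps malformed Bluetooth input from reaching the firmware's command switch.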
- `object_detection_node.py` captures frames from camera index `3` (640x480).
- It runs TFLite inference (attempting the `libvx_delegate.so` NPU delegate).
- The highest-confidence valid object box is published as `ai_bboxes` (`Int32MultiArray`: `[x, y, w, h]`).
- `uart_node.py` serializes that as `<BBOX,x,y,w,h>` and sends it over UART.
- K64F firmware parses BBOX messages and steers left/right/straight based on the bbox center, a deadband, and hysteresis.
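Since the BBOX framing is plain ASCII, the serialization step is tiny. A sketch of what `uart_node.py` has to produce (the function name is illustrative):

```python
def encode_bbox(x: int, y: int, w: int, h: int) -> bytes:
    """Frame a detection as the ASCII <BBOX,x,y,w,h> packet sent over UART."""
    return f"<BBOX,{x},{y},{w},{h}>".encode("ascii")
```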
AI mode is entered by sending command `1` (`CMD_AI`) and exited implicitly when non-AI manual commands are used.
- `object_detection_publisher` (`object_detection_node.py`)
  - Publishes: `camera_with_boxes` (`sensor_msgs/CompressedImage`) and `ai_bboxes` (`std_msgs/Int32MultiArray`).
  - Uses a model path hardcoded to `/home/user/ros2_ws/model/correct`.
- `video_publisher` (`video_publisher.py`)
  - A more general detector/overlay node with parameters for model type (`ssd`/`yolov8`), compression, thresholds, and optional depth estimation.
  - Publishes detections and overlay images.
- `bluetooth_publisher` (`bluetooth_node.py`)
  - Reads serial Bluetooth input from `/dev/rfcomm0` at 9600 and publishes `bluetooth_commands`.
- `uart_publisher` (`uart_node.py`)
  - Reads `bluetooth_commands` and `ai_bboxes`.
  - Sends control data over `/dev/ttymxc2` at 115200.
  - Tracks AI vs. manual state locally.
- `camera_publisher` (`camera_node.py`)
  - Basic camera publisher for raw/compressed image experimentation.
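The AI/manual bookkeeping that `uart_publisher` keeps locally amounts to a tiny state machine: `CMD_AI` (1) enters AI mode, any other manual byte leaves it, and `ai_bboxes` are only relayed while in AI mode. A sketch under those assumptions (class and method names are hypothetical; only the `CMD_AI == 1` value comes from the command table):

```python
class ModeTracker:
    """Hypothetical mirror of uart_publisher's local AI/manual state."""

    CMD_AI = 1  # from the command table: command 1 enters AI mode

    def __init__(self) -> None:
        self.ai_mode = False

    def on_command(self, cmd: int) -> None:
        # CMD_AI switches to AI mode; any other manual byte exits it.
        self.ai_mode = (cmd == self.CMD_AI)

    def should_forward_bbox(self) -> bool:
        # <BBOX,...> packets are only relayed to the K64F while in AI mode.
        return self.ai_mode
```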
| Name | Value |
|---|---|
| `CMD_OFF` | 0 |
| `CMD_AI` | 1 |
| `CMD_STOP` | 2 |
| `CMD_FORWARD_STRAIGHT` | 3 |
| `CMD_FORWARD_RIGHT` | 4 |
| `CMD_FORWARD_LEFT` | 5 |
| `CMD_BACKWARD_STRAIGHT` / `CMD_REVERSE_STRAIGHT` | 6 |
| `CMD_BACKWARD_RIGHT` / `CMD_REVERSE_RIGHT` | 7 |
| `CMD_BACKWARD_LEFT` / `CMD_REVERSE_LEFT` | 8 |
| `CMD_RIGHT` | 9 |
| `CMD_LEFT` | 10 |
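For scripting against the UART link from the NavQ+ side, the table above can be restated as a Python `IntEnum`. This is a convenience sketch, not part of the package; the backward/reverse aliases are collapsed into one name each:

```python
from enum import IntEnum

class Cmd(IntEnum):
    """Command bytes from the table above (reverse aliases collapsed)."""
    OFF = 0
    AI = 1
    STOP = 2
    FORWARD_STRAIGHT = 3
    FORWARD_RIGHT = 4
    FORWARD_LEFT = 5
    BACKWARD_STRAIGHT = 6
    BACKWARD_RIGHT = 7
    BACKWARD_LEFT = 8
    RIGHT = 9
    LEFT = 10
```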
- Right motor direction: `PTD0`/`PTD1`; speed PWM: `PTD2` (FTM3 CH2)
- Left motor direction: `PTD3`/`PTD4`; speed PWM: `PTD5` (FTM0 CH5)
- Steering servo: `PTC10` (FTM3 CH6)
- UART0: debug (`PTB16`/`PTB17`, 9600)
- UART1: NavQ+ link (`PTC3`/`PTC4`, 115200)
- Maintains mode state: `MANUAL` or `AI`.
- Manual mode: byte commands map directly to motion primitives.
- AI mode:
  - Accepts framed ASCII messages: `<BBOX,x,y,w,h>`.
  - Computes the bbox center against the frame midpoint (`640/2`) with a deadband.
  - Applies hysteresis before changing turning direction.
  - Commands steering + forward drive at a fixed AI speed.
- PIT safety interrupt stops the motors when no command was received in the last interval.
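The deadband-plus-hysteresis decision above can be sketched in a few lines. This is a Python mirror of the logic for illustration only; the real implementation and its constants live in `main.c`, and the `DEADBAND`/`HYSTERESIS` pixel values here are assumptions:

```python
FRAME_MID = 640 // 2   # frame midpoint the firmware compares against
DEADBAND = 40          # illustrative half-width in pixels (actual value in main.c)
HYSTERESIS = 15        # illustrative extra margin before a direction change

def steer_decision(x: int, w: int, prev: str = "STRAIGHT") -> str:
    """Return 'LEFT', 'RIGHT', or 'STRAIGHT' for a bbox, with hysteresis."""
    error = (x + w // 2) - FRAME_MID
    if abs(error) <= DEADBAND:
        return "STRAIGHT"
    new = "RIGHT" if error > 0 else "LEFT"
    # Hold the previous decision unless the error also clears the
    # hysteresis margin, so noise near the deadband edge doesn't
    # flip the servo back and forth.
    if new != prev and abs(error) <= DEADBAND + HYSTERESIS:
        return prev
    return new
```

The hysteresis band means a change of direction needs a larger error than staying the course, which is what keeps the steering from chattering on a target sitting near the deadband edge.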
- Single raw byte (manual commands): `0..10`
- ASCII framed packet (AI bounding box): `<BBOX,x,y,w,h>`, e.g. `<BBOX,210,120,140,190>`
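For testing the link from the host side, the framed packet can be parsed with a small helper. This is an illustrative Python mirror; the actual parsing happens in the K64F firmware (`main.c`):

```python
import re

# Matches a complete framed packet: <BBOX,x,y,w,h> with decimal fields.
BBOX_RE = re.compile(r"^<BBOX,(\d+),(\d+),(\d+),(\d+)>$")

def parse_bbox(packet: str):
    """Return (x, y, w, h) for a well-formed packet, or None otherwise."""
    m = BBOX_RE.match(packet)
    return tuple(int(g) for g in m.groups()) if m else None
```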
- Ubuntu + ROS 2 Humble
- Python deps used in the code: `rclpy`, `sensor_msgs`, `vision_msgs`, `cv_bridge`, `opencv-python`, `numpy`, `pyserial`, `tflite-runtime`
- Access to serial devices: `/dev/rfcomm0` (Bluetooth) and `/dev/ttymxc2` (K64F UART)
- TFLite model file at the path expected by your node (currently hardcoded in some files).
From the workspace root where `src/` lives:

```bash
colcon build --packages-select navq_vision
source install/setup.bash
```

Using the helper script:

```bash
bash startup.sh
```

Or launch directly:

```bash
ros2 launch navq_vision autobot_launch.py
```

For the K64F firmware:

- Open the `128 Project` MCU project in your NXP IDE/toolchain.
- Build and flash to the MK64F12 target.