Bare-Metal Object-Avoiding Robot

A hybrid robotics project that combines:

  • ROS 2 (Python) perception + high-level control on a NavQ+ Linux companion computer.
  • Bare-metal C motor/servo control on an NXP K64F microcontroller.
  • UART bridge + Bluetooth manual commands for switching between manual and AI-assisted driving.

The current implementation performs object detection with TensorFlow Lite, selects a primary target bounding box, and streams either manual command bytes or <BBOX,...> AI messages to the K64F firmware.

Repository layout

  • src/navq_vision/ — ROS 2 package (ament_python) containing camera, detection, Bluetooth, and UART nodes.
  • launch/autobot_launch.py — launch script intended to start core ROS nodes.
  • startup.sh — helper script that sources ROS 2 and launches the stack.
  • 128 Project/Sources/main.c — bare-metal firmware for the K64F (motor PWM, servo steering, UART parsing, AI/manual mode state machine).
  • *.tflite, model/correct — model artifacts used by the vision pipeline.

System architecture

1) Manual mode (Bluetooth)

  1. bluetooth_node.py reads incoming command integers from /dev/rfcomm0.
  2. It publishes each value to ROS topic bluetooth_commands (std_msgs/Int32).
  3. uart_node.py forwards those bytes to /dev/ttymxc2 for the K64F.
  4. K64F firmware interprets command bytes 0..10 and drives motors/servo accordingly.
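The manual path is essentially a byte pass-through from Bluetooth to UART. A minimal sketch of the encoding step (the helper name is illustrative, not taken from uart_node.py):

```python
def encode_manual(cmd: int) -> bytes:
    """Encode a manual command id (0..10) as the single raw byte the K64F expects."""
    if not 0 <= cmd <= 10:
        raise ValueError(f"unknown command id: {cmd}")
    return bytes([cmd])
```

Anything outside the 0..10 range is rejected rather than forwarded, so a garbled Bluetooth read never reaches the firmware.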

2) AI mode (vision)

  1. object_detection_node.py captures frames from camera index 3 at 640x480 resolution.
  2. It runs TFLite inference, attempting to load the libvx_delegate.so NPU delegate.
  3. The highest-confidence valid detection box is published as ai_bboxes (std_msgs/Int32MultiArray: [x, y, w, h]).
  4. uart_node.py serializes that as <BBOX,x,y,w,h> and sends it over UART.
  5. K64F firmware parses BBOX messages and steers left/right/straight based on bbox center, deadband, and hysteresis.

AI mode is entered by sending command 1 (CMD_AI) and exited implicitly when any manual (non-AI) command is received.
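Selecting the primary target from the detector output can be sketched in a few lines of pure Python. The detection tuple format and the confidence threshold here are assumptions for illustration, not taken from the node:

```python
def pick_primary(detections, min_conf=0.5):
    """Return the [x, y, w, h] box of the highest-confidence valid detection, or None.

    detections: iterable of (score, (x, y, w, h)) pairs.
    A box is considered valid only if it has positive width and height.
    """
    valid = [(score, box) for score, box in detections
             if score >= min_conf and box[2] > 0 and box[3] > 0]
    if not valid:
        return None
    return list(max(valid, key=lambda d: d[0])[1])
```

Returning None when nothing passes the threshold lets the publisher simply skip a frame instead of streaming a bogus box.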

ROS 2 package details (navq_vision)

Nodes

  • object_detection_publisher (object_detection_node.py)

    • Publishes:
      • camera_with_boxes (sensor_msgs/CompressedImage)
      • ai_bboxes (std_msgs/Int32MultiArray)
    • Uses a model path hardcoded to /home/user/ros2_ws/model/correct.
  • video_publisher (video_publisher.py)

    • A more general detector/overlay node with parameters for model type (ssd/yolov8), compression, thresholds, and optional depth estimation.
    • Publishes detections and overlay images.
  • bluetooth_publisher (bluetooth_node.py)

    • Reads serial Bluetooth input from /dev/rfcomm0 at 9600 and publishes bluetooth_commands.
  • uart_publisher (uart_node.py)

    • Reads bluetooth_commands and ai_bboxes.
    • Sends control data over /dev/ttymxc2 at 115200.
    • Tracks AI vs manual state locally.
  • camera_publisher (camera_node.py)

    • Basic camera publisher for raw/compressed image experimentation.

ROS command IDs

Name                                          Value
CMD_OFF                                       0
CMD_AI                                        1
CMD_STOP                                      2
CMD_FORWARD_STRAIGHT                          3
CMD_FORWARD_RIGHT                             4
CMD_FORWARD_LEFT                              5
CMD_BACKWARD_STRAIGHT / CMD_REVERSE_STRAIGHT  6
CMD_BACKWARD_RIGHT / CMD_REVERSE_RIGHT        7
CMD_BACKWARD_LEFT / CMD_REVERSE_LEFT          8
CMD_RIGHT                                     9
CMD_LEFT                                      10
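The same IDs can be captured as a Python IntEnum shared across the ROS nodes (a sketch; the repository may define these constants differently):

```python
from enum import IntEnum

class Cmd(IntEnum):
    """Command IDs sent as single bytes from the NavQ+ to the K64F."""
    OFF = 0
    AI = 1
    STOP = 2
    FORWARD_STRAIGHT = 3
    FORWARD_RIGHT = 4
    FORWARD_LEFT = 5
    REVERSE_STRAIGHT = 6   # a.k.a. BACKWARD_STRAIGHT
    REVERSE_RIGHT = 7      # a.k.a. BACKWARD_RIGHT
    REVERSE_LEFT = 8       # a.k.a. BACKWARD_LEFT
    RIGHT = 9
    LEFT = 10
```

Because IntEnum members compare equal to plain ints, they can be published directly in a std_msgs/Int32 message.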

K64F firmware behavior (128 Project/Sources/main.c)

Hardware mapping (from source comments/config)

  • Right motor direction: PTD0/PTD1, speed PWM: PTD2 (FTM3 CH2)
  • Left motor direction: PTD3/PTD4, speed PWM: PTD5 (FTM0 CH5)
  • Steering servo: PTC10 (FTM3 CH6)
  • UART0: debug (PTB16/PTB17, 9600)
  • UART1: NavQ+ link (PTC3/PTC4, 115200)

Runtime logic

  • Maintains mode state: MANUAL or AI.
  • Manual mode: byte commands directly map to motion primitives.
  • AI mode:
    • Accepts framed ASCII messages: <BBOX,x,y,w,h>.
    • Computes bbox center against frame midpoint (640/2) with deadband.
    • Applies hysteresis before turning direction changes.
    • Commands steering + forward drive at fixed AI speed.
  • A PIT safety interrupt stops the motors if no command has been received within the last interval.
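The steering decision (deadband plus hysteresis around the 640/2 frame midpoint) can be illustrated in Python; the firmware is C, and the threshold values and return labels here are illustrative, not the firmware's actual constants:

```python
FRAME_MID = 640 // 2  # camera frames are 640 px wide

def steer(prev, bbox, deadband=40, hysteresis=20):
    """Decide 'LEFT', 'RIGHT', or 'STRAIGHT' from an (x, y, w, h) box and the previous decision."""
    x, _y, w, _h = bbox
    err = (x + w // 2) - FRAME_MID   # signed offset of bbox centre from frame midpoint
    if abs(err) <= deadband:         # inside deadband: drive straight
        return "STRAIGHT"
    new = "RIGHT" if err > 0 else "LEFT"
    # hysteresis: keep the current turn until the error clearly crosses the wider band
    if prev in ("LEFT", "RIGHT") and new != prev and abs(err) <= deadband + hysteresis:
        return prev
    return new
```

The wider band for reversing an existing turn is what prevents the servo from chattering when the target hovers near the deadband edge.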

Serial protocol between NavQ+ and K64F

Manual command

Single raw byte:

0..10

AI bounding box command

ASCII framed packet:

<BBOX,x,y,w,h>

Example:

<BBOX,210,120,140,190>
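The frame format is simple enough to round-trip in a few lines of Python (helper names are illustrative; the actual encoding lives in uart_node.py and the parser in the C firmware):

```python
def encode_bbox(x, y, w, h):
    """Serialize a bounding box as the ASCII-framed <BBOX,x,y,w,h> packet."""
    return f"<BBOX,{x},{y},{w},{h}>".encode("ascii")

def parse_bbox(frame):
    """Parse a <BBOX,x,y,w,h> packet back into an (x, y, w, h) tuple of ints."""
    body = frame.decode("ascii")
    if not (body.startswith("<") and body.endswith(">")):
        raise ValueError("missing frame delimiters")
    tag, *nums = body[1:-1].split(",")
    if tag != "BBOX" or len(nums) != 4:
        raise ValueError("malformed BBOX frame")
    return tuple(int(n) for n in nums)
```

The angle-bracket framing lets the firmware resynchronize mid-stream: it can discard bytes until the next '<' without losing more than one packet.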

Setup and run

1) Prerequisites

  • Ubuntu + ROS 2 Humble
  • Python deps used in code:
    • rclpy, sensor_msgs, vision_msgs, cv_bridge
    • opencv-python, numpy, pyserial, tflite-runtime
  • Access to serial devices:
    • /dev/rfcomm0 (Bluetooth)
    • /dev/ttymxc2 (K64F UART)
  • TFLite model file at the path expected by your node (currently hardcoded in some files).

2) Build ROS 2 package

From workspace root where src/ lives:

colcon build --packages-select navq_vision
source install/setup.bash

3) Launch

Using script:

bash startup.sh

Or direct ROS launch:

ros2 launch navq_vision autobot_launch.py

4) Firmware

  • Open the 128 Project MCU project in your NXP IDE/toolchain.
  • Build and flash to MK64F12 target.
