Adaptive Human-Robot Collaboration VR Simulation

This Unity project creates a virtual reality environment for Human-Robot Collaboration (HRC), featuring a real-time, closed-loop feedback system designed to monitor a practitioner's ergonomic and cognitive state and adapt the robot's behavior accordingly.

Overview

The simulation places a user in a virtual workcell with a KUKA KR210 robot. The core of the project is an adaptive feedback system that enhances safety and well-being. The practitioner's movements and gaze are captured within Unity and sent via an MQTT broker to an external backend for analysis. This backend assesses risks such as poor posture or inattention and sends alerts back to the Unity application. The simulation then provides real-time feedback to the user (visual and audio cues, plus on-avatar joint highlighting) while simultaneously commanding the robot to slow down or halt, creating a safer and more intuitive collaborative workspace.
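
For orientation, the sketch below shows how the Unity side could establish this MQTT round trip. It assumes the M2Mqtt library, a common choice for Unity MQTT clients; the broker address, client id, and handler body are illustrative, not the project's actual configuration.

```csharp
using System.Text;
using UnityEngine;
using uPLibrary.Networking.M2Mqtt;
using uPLibrary.Networking.M2Mqtt.Messages;

// Minimal sketch of the MQTT link described above; MQTTHandler.cs plays
// this role in the project. Broker address and client id are placeholders.
public class MqttLinkSketch : MonoBehaviour
{
    MqttClient client;

    void Start()
    {
        client = new MqttClient("127.0.0.1"); // replace with the real broker host
        client.MqttMsgPublishReceived += OnMessage;
        client.Connect(System.Guid.NewGuid().ToString());

        // Listen for backend alerts on the feedback topic.
        client.Subscribe(new[] { "practitioner.feedback/unity" },
                         new[] { MqttMsgBase.QOS_LEVEL_AT_LEAST_ONCE });
    }

    void OnMessage(object sender, MqttMsgPublishEventArgs e)
    {
        string payload = Encoding.UTF8.GetString(e.Message);
        Debug.Log($"[{e.Topic}] {payload}"); // hand the payload to the feedback logic
    }
}
```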

Core Features

  • VR-Based Robot Workcell: Simulates a KUKA KR210 robot performing multi-step pick-and-place tasks.
  • Real-time Human Monitoring:
    • Captures and publishes detailed joint angles from a motion capture avatar via MQTT (see the publishing sketch after this list).
    • Tracks the practitioner's gaze direction and attention state using the HP Omnicept SDK.
  • Adaptive Robot Behavior:
    • Receives real-time alerts from an external analysis system via MQTT.
    • Dynamically controls the robot's speed or halts its operation based on the severity of posture or inattention alerts.
  • Rich Practitioner Feedback:
    • UI Alerts: Displays clear text messages for system status, warnings, and critical alerts.
    • 3D Visual Feedback: Highlights specific joints on the practitioner's avatar in real-time (Yellow for warning, Red for critical) to guide posture correction.
    • Contextual Gaze Guidance: When a posture alert occurs and the user is not looking at the robot, an on-screen UI element appears to guide their attention to the robot's tip.
  • Interactive VR Tasks:
    • The user performs tasks at destination points using VR controllers, with a progress bar UI showing completion.
    • Interaction is gated by proximity, requiring the user to be within a set distance of the task area, with color-change feedback on the target object.
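
As a rough illustration of the joint-angle publishing above, the sketch below samples one avatar joint and publishes it as JSON. The topic name and payload fields are assumptions; the real schema lives in MQTTConnection.cs and the Value...cs scripts.

```csharp
using System.Text;
using UnityEngine;
using uPLibrary.Networking.M2Mqtt;

// Illustrative joint-angle publisher; topic and payload shape are hypothetical.
public class JointAnglePublisherSketch : MonoBehaviour
{
    public MqttClient client;   // shared, already-connected client
    public Transform neckJoint; // avatar joint driven by motion capture

    void Update()
    {
        // A real implementation would throttle rather than publish every frame.
        float flexion = neckJoint.localEulerAngles.x; // simplified angle read
        string json = JsonUtility.ToJson(new JointSample { joint = "neck", angle = flexion });
        client.Publish("practitioner.motion/joints", Encoding.UTF8.GetBytes(json));
    }
}

[System.Serializable]
public class JointSample { public string joint; public float angle; }
```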

System Architecture

The project operates as part of a larger, three-part system:

  1. Unity VR Application (This Project): Acts as the primary user interface and data gateway. It visualizes the HRC environment, captures practitioner data (motion, gaze), publishes it to the broker, receives feedback alerts, renders feedback to the user, and relays commands to the robot.
  2. MQTT Broker (External): A central messaging server that facilitates communication between the Unity application and the backend analysis system. All data and commands are routed through it.
  3. Backend Analysis System (External): A separate service that performs real-time analysis to detect ergonomic or cognitive risks and publishes structured JSON alerts back to the broker for the Unity application to consume.
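
The exact alert schema is defined by the backend; as a hedged sketch, a posture alert could be deserialized in Unity along these lines (the field names are assumptions):

```csharp
using UnityEngine;

// Hypothetical shape of a backend posture alert; the real fields are
// whatever the backend's JSON schema defines.
[System.Serializable]
public class PostureAlert
{
    public string type;     // e.g. "slidingWindowPostureAlert"
    public string joint;    // e.g. "neck"
    public string severity; // e.g. "YELLOW" or "RED"
}

public static class AlertParser
{
    // Unity's built-in JsonUtility maps JSON onto serializable fields.
    public static PostureAlert Parse(string json) =>
        JsonUtility.FromJson<PostureAlert>(json);
}
```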

How the Feedback Loop Works

  1. The practitioner performs a task in the VR environment.
  2. The MQTTConnection.cs script reads joint angles from the various Value...cs scripts. GazeAttentionChecker.cs determines the user's attention state.
  3. This raw data is published to the MQTT Broker.
  4. The external backend service consumes the data and detects a posture risk, raising a slidingWindowPostureAlert (e.g., "RED" severity for the neck).
  5. The backend publishes a JSON alert to the practitioner.feedback/unity topic.
  6. The MQTTHandler.cs in Unity receives the message and passes it to FeedbackIntegrationManager.cs.
  7. FeedbackIntegrationManager.cs parses the alert and triggers a multi-faceted response (sketched in code after this list):
    • It publishes a halt command to the robot.feedback/unity topic, which Kr210TrajectoryPlanner.cs receives to stop the robot.
    • It calls CharacterFeedbackVisualizer.cs to highlight the practitioner's neck joint in RED.
    • It calls AlertDisplay.cs to show a "Posture CRITICAL" message in the UI.
    • It plays a critical audio cue via its AudioSource.
  8. The system remains in this halted state until a "clear" command is received.
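
As a sketch of how step 7's orchestration could look, reusing the hypothetical PostureAlert type from the architecture section: the component fields below are stand-ins for the roles that CharacterFeedbackVisualizer.cs, AlertDisplay.cs, and the MQTT client play in the project, and the real APIs may differ.

```csharp
using System.Text;
using UnityEngine;
using uPLibrary.Networking.M2Mqtt;

// Illustrative dispatch for a critical posture alert, mirroring step 7 above.
public class FeedbackDispatchSketch : MonoBehaviour
{
    public MqttClient client;             // already-connected MQTT client
    public Renderer neckProxy;            // proxy object on the avatar's neck
    public UnityEngine.UI.Text alertText; // status/alert UI text
    public AudioSource criticalCue;

    public void HandleAlert(PostureAlert alert)
    {
        if (alert.severity != "RED") return;

        // 1. Command the robot to halt via its feedback topic.
        client.Publish("robot.feedback/unity",
                       Encoding.UTF8.GetBytes("{\"command\":\"halt\"}"));
        // 2. Highlight the offending joint proxy in red.
        neckProxy.material.color = Color.red;
        // 3. Surface the UI message and play the critical audio cue.
        alertText.text = $"Posture CRITICAL: {alert.joint}";
        criticalCue.Play();
    }
}
```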

Key Scripts

  • Kr210TrajectoryPlanner.cs: The main controller for the robot's actions, trajectory execution, and the human-in-the-loop task sequence.
  • FeedbackIntegrationManager.cs: The brain of the feedback system. It receives all alerts, maintains the practitioner's current state (e.g., posture warnings, attention level), and orchestrates all feedback responses to the user and the robot.
  • MQTTHandler.cs: A robust, general-purpose script for managing the connection and communication with the MQTT broker.
  • CharacterFeedbackVisualizer.cs: Manages the real-time coloring of proxy objects on the character model to provide visual feedback for specific joint alerts.
  • GazeAttentionChecker.cs: Interfaces with the HP Omnicept SDK to track user gaze, publish attention data, and control the gaze guidance UI.
  • TaskCompletionUI.cs: A component placed on destination objects that manages the progress bar for interactive tasks (see the sketch after this list).
  • TaskUIInteractor.cs: A component on the VR controller that detects pointing and trigger presses to interact with TaskCompletionUI.
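
Finally, a minimal sketch of the proximity-gated task interaction described under Core Features. TaskCompletionUI.cs and TaskUIInteractor.cs implement the real version; the distances, timings, and colors here are invented for illustration.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Illustrative proximity-gated task progress; all tunables are assumptions.
public class TaskProgressSketch : MonoBehaviour
{
    public Transform user;          // e.g. the VR camera rig
    public Image progressFill;      // UI Image with fillAmount in [0, 1]
    public Renderer targetRenderer; // target object for color-change feedback
    public float maxDistance = 2f;  // interaction gating distance
    public float holdSeconds = 3f;  // trigger-hold time to complete the task

    float progress;

    // Called each frame by the controller script while the trigger is held.
    public void Hold()
    {
        bool inRange = Vector3.Distance(user.position, transform.position) <= maxDistance;
        targetRenderer.material.color = inRange ? Color.green : Color.gray;
        if (!inRange) return;

        progress = Mathf.Min(1f, progress + Time.deltaTime / holdSeconds);
        progressFill.fillAmount = progress;
        if (progress >= 1f) Debug.Log("Task complete"); // notify the task sequence here
    }
}
```

In the project, TaskUIInteractor.cs drives this kind of update from controller pointing and trigger presses.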
