This Unity project creates a virtual reality environment for Human-Robot Collaboration (HRC), featuring a real-time, closed-loop feedback system designed to monitor a practitioner's ergonomic and cognitive state and adapt the robot's behavior accordingly.
The simulation places a user in a virtual workcell with a KUKA KR210 robot. The core of the project is an adaptive feedback system that enhances safety and well-being. The practitioner's movements and gaze are captured within Unity and sent via an MQTT broker to an external backend for analysis. This backend assesses risks like poor posture or inattention and sends alerts back to the Unity application. The simulation then provides real-time feedback to the user (visual, audio, and character highlighting) while simultaneously commanding the robot to slow down or halt, creating a safer and more intuitive collaborative workspace.
- VR-Based Robot Workcell: Simulates a KUKA KR210 robot performing multi-step pick-and-place tasks.
- Real-time Human Monitoring:
- Captures and publishes detailed joint angles from a motion capture avatar via MQTT.
- Tracks the practitioner's gaze direction and attention state using the HP Omnicept SDK.
- Adaptive Robot Behavior:
- Receives real-time alerts from an external analysis system via MQTT.
- Dynamically controls the robot's speed or halts its operation based on the severity of posture or inattention alerts.
- Rich Practitioner Feedback:
- UI Alerts: Displays clear text messages for system status, warnings, and critical alerts.
- 3D Visual Feedback: Highlights specific joints on the practitioner's avatar in real-time (Yellow for warning, Red for critical) to guide posture correction.
- Contextual Gaze Guidance: An on-screen UI element guides the user's attention to the robot's tip when a posture alert occurs while the user is not looking at it.
- Interactive VR Tasks:
- The user performs tasks at destination points using VR controllers, featuring a progress bar UI for completion.
- Interaction is gated by proximity, requiring the user to be within a set distance of the task area, with color-change feedback on the target object.
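The adaptive-behavior feature above can be sketched as a mapping from the most severe active alert to a robot speed multiplier. This is a minimal, hypothetical illustration: the severity labels YELLOW/RED match the alert colors described in this README, but the speed factors and the `robot_speed_factor` helper are assumptions, not the project's actual implementation.

```python
# Hypothetical sketch: map active alert severities to a robot speed factor.
# Severity names (GREEN/YELLOW/RED) follow this README; the numeric speed
# factors are illustrative assumptions.

def robot_speed_factor(alerts):
    """Return a speed multiplier for the robot given active alerts.

    alerts: iterable of severity strings ("GREEN", "YELLOW", "RED").
    Returns 1.0 (full speed), 0.5 (slowed), or 0.0 (halted).
    """
    rank = {"GREEN": 0, "YELLOW": 1, "RED": 2}
    worst = max((rank.get(a, 0) for a in alerts), default=0)
    return {0: 1.0, 1: 0.5, 2: 0.0}[worst]
```

A controller using this mapping would scale trajectory execution speed by the returned factor each frame, halting entirely on a critical (RED) alert.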
The project operates as part of a larger, three-part system:
- Unity VR Application (This Project): Acts as the primary user interface and data gateway. It visualizes the HRC environment, captures practitioner data (motion, gaze), sends it to the broker, receives feedback alerts, and renders all feedback to the user and robot.
- MQTT Broker (External): A central messaging server that facilitates communication between the Unity application and the backend analysis system. All data and commands are routed through it.
- Backend Analysis System (External): A separate service that performs real-time analysis to detect ergonomic or cognitive risks and publishes structured JSON alerts back to the broker for the Unity application to consume.
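To make the "structured JSON alerts" concrete, here is a hedged sketch of what consuming one might look like. The exact schema is not documented in this README, so the field names (`type`, `joint`, `severity`) are assumptions chosen to match the alert examples described later.

```python
import json

# Illustrative alert payload; the real schema is defined by the external
# backend, so these field names are assumptions.
raw = '{"type": "posture", "joint": "neck", "severity": "RED"}'

alert = json.loads(raw)
if alert["type"] == "posture" and alert["severity"] == "RED":
    action = "halt"    # critical posture alert -> stop the robot
elif alert["severity"] == "YELLOW":
    action = "slow"    # warning -> reduce robot speed
else:
    action = "resume"  # no risk detected -> normal operation
```

In the actual project this parsing happens in Unity (C#) after the message arrives from the broker; the Python above only illustrates the decision logic.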
- The practitioner performs a task in the VR environment.
- The `MQTTConnection.cs` script reads joint angles from the various `Value...cs` scripts, while `GazeAttentionChecker.cs` determines the user's attention state.
- This raw data is published to the MQTT Broker.
- The external backend service consumes the data and detects a `slidingWindowPostureAlert` (e.g., "RED" severity for the neck).
- The backend publishes a JSON alert to the `practitioner.feedback/unity` topic.
- `MQTTHandler.cs` in Unity receives the message and passes it to `FeedbackIntegrationManager.cs`.
- `FeedbackIntegrationManager.cs` parses the alert and triggers a multi-faceted response:
  - It sends a `halt` command to the `robot.feedback/unity` topic, which is received by `Kr210TrajectoryPlanner.cs` to stop the robot.
  - It calls `CharacterFeedbackVisualizer.cs` to highlight the practitioner's neck joint in RED.
  - It calls `AlertDisplay.cs` to show a "Posture CRITICAL" message in the UI.
  - It plays a critical audio cue via its `AudioSource`.
- The system remains in this halted state until a `"clear"` command is received.
- `Kr210TrajectoryPlanner.cs`: The main controller for the robot's actions, trajectory execution, and the human-in-the-loop task sequence.
- `FeedbackIntegrationManager.cs`: The brain of the feedback system. It receives all alerts, maintains the practitioner's current state (e.g., posture warnings, attention level), and orchestrates all feedback responses to the user and the robot.
- `MQTTHandler.cs`: A robust, general-purpose script for managing the connection and communication with the MQTT broker.
- `CharacterFeedbackVisualizer.cs`: Manages the real-time coloring of proxy objects on the character model to provide visual feedback for specific joint alerts.
- `GazeAttentionChecker.cs`: Interfaces with the HP Omnicept SDK to track user gaze, publish attention data, and control the gaze guidance UI.
- `TaskCompletionUI.cs`: A component placed on destination objects that manages the progress bar for interactive tasks.
- `TaskUIInteractor.cs`: A component on the VR controller that detects pointing and trigger presses to interact with `TaskCompletionUI`.
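The per-joint highlight state that `CharacterFeedbackVisualizer.cs` manages can be sketched language-agnostically as below. The YELLOW/RED color mapping and the reset on `"clear"` come from this README; the dictionary-based API itself is an illustrative assumption, not the script's actual interface.

```python
# Hypothetical sketch of per-joint highlight bookkeeping, in the spirit of
# CharacterFeedbackVisualizer.cs. Colors follow the README (YELLOW = warning,
# RED = critical); the class and method names are assumptions.

SEVERITY_COLOR = {"YELLOW": "yellow", "RED": "red"}

class JointHighlights:
    def __init__(self):
        self.colors = {}  # joint name -> current highlight color

    def apply_alert(self, joint, severity):
        """Highlight a joint if the severity maps to a known color."""
        color = SEVERITY_COLOR.get(severity)
        if color:
            self.colors[joint] = color

    def clear(self):
        """A "clear" command removes all highlights."""
        self.colors = {}
```

In Unity, applying a highlight would amount to tinting the joint's proxy object's material rather than updating a dictionary, but the state transitions are the same.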