Names: Charles Moreno and Nick Weiss
This project uses EyeTrax, a Python eye-tracking library, to generate high-level robot commands, publishes them over MQTT, and visualizes both gaze and commands in a Tkinter GUI. An optional MQTT "robot game" shows a dot moving on a grid in response to the commands.
- Python 3.9+ installed
- Internet connection (for the public MQTT broker)
```
python -m venv venv

# Windows
venv\Scripts\activate

# macOS / Linux
source venv/bin/activate

pip install -r requirements.txt

python3 main.py
```

This will:
- Start the gaze-tracking thread (EyeTrax + OpenCV)
- Normalize gaze data and compute fixations
- Generate high-level robot commands
- Publish robot commands to the MQTT topic `gaze_bot/command`
- Open a fullscreen Tkinter GUI showing:
  - Red dot = live gaze
  - Text labels = interpreted command + gaze/fixation info
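How a fixation becomes a command isn't spelled out above; a minimal sketch of a region-based mapping, where the thresholds, function name, and dead-zone scheme are all assumptions rather than the project's actual logic in `command_generator.py`:

```python
# Hypothetical sketch: map a normalized fixation point (0..1 on each axis)
# to one of the five high-level commands. Region boundaries are assumptions.

def fixation_to_command(x: float, y: float) -> str:
    """Return FORWARD/BACKWARD/LEFT/RIGHT for edge regions, STOP for center."""
    if y < 0.25:
        return "FORWARD"   # fixating near the top of the screen
    if y > 0.75:
        return "BACKWARD"  # near the bottom
    if x < 0.25:
        return "LEFT"
    if x > 0.75:
        return "RIGHT"
    return "STOP"          # central dead zone: no movement
```

A dead zone in the center keeps the robot still while the user simply watches the display, so only deliberate glances toward a screen edge produce motion.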
In a second terminal, run:
```
python3 robot_game_client.py
```

This program:
- Connects to the public MQTT broker `test.mosquitto.org:1883`
- Listens for robot commands (FORWARD, BACKWARD, LEFT, RIGHT, STOP)
- Moves a dot around a grid
- Lets you "capture" an orange token to increase score

This demonstrates multi-program communication and robot-control logic without requiring ROS2.
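The grid-movement logic in the game client can be sketched as follows; the grid size, coordinate convention, and clamping behavior are assumptions, not the actual implementation in `robot_game_client.py`:

```python
# Hypothetical sketch of the robot-game movement logic: each received MQTT
# command nudges the dot one cell, clamped to the grid edges.

GRID_SIZE = 10  # assumed grid dimension

MOVES = {
    "FORWARD":  (0, -1),  # up one row
    "BACKWARD": (0,  1),  # down one row
    "LEFT":     (-1, 0),
    "RIGHT":    (1,  0),
    "STOP":     (0,  0),
}

def apply_command(pos, command):
    """Return the new (col, row) after one command, staying on the grid."""
    dx, dy = MOVES.get(command, (0, 0))  # unknown commands are ignored
    x = min(max(pos[0] + dx, 0), GRID_SIZE - 1)
    y = min(max(pos[1] + dy, 0), GRID_SIZE - 1)
    return (x, y)
```

In the real client this function would run inside the MQTT `on_message` callback, with the GUI redrawing the dot after each update.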
```
main.py                    # Starts gaze tracking + GUI + command pipeline
blackboard.py              # Shared state & observer event hub
gaze_source.py             # EyeTrax + webcam gaze reader (threaded)
gaze_interpreter.py        # Gaze window --> FixationEvent
command_generator.py       # FixationEvent --> RobotCommand
mqtt_command_publisher.py  # Publishes RobotCommand via MQTT
gaze_display.py            # Tkinter GUI visualizing gaze/fixation/command
robot_game_client.py       # Optional MQTT robot visualization mini-game
gaze_event.py              # GazeEvent dataclass
fixation_event.py          # FixationEvent dataclass
robot_command.py           # RobotCommand dataclass + enum
```
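`robot_command.py` is described as a dataclass plus an enum; a minimal sketch of what that might look like. The field names, enum values beyond the five listed commands, and the plain-text payload format are assumptions:

```python
from dataclasses import dataclass
from enum import Enum

class Direction(Enum):
    """The five commands listed above, as an enum."""
    FORWARD = "FORWARD"
    BACKWARD = "BACKWARD"
    LEFT = "LEFT"
    RIGHT = "RIGHT"
    STOP = "STOP"

@dataclass(frozen=True)
class RobotCommand:
    """Hypothetical dataclass: a direction plus a timestamp, serialized
    as the MQTT payload published to gaze_bot/command."""
    direction: Direction
    timestamp: float

    def payload(self) -> str:
        # Assumed wire format: just the command name as plain text.
        return self.direction.value
```

Making the dataclass `frozen=True` keeps commands immutable, which is convenient when the same event object is handed to both the MQTT publisher and the GUI.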