# Multi-User Interaction

This project provides a real-time multi-user interaction system that combines computer vision and Open Sound Control (OSC) messaging. It enables multiple participants to interact naturally in an installation, performance, or research setting by detecting body poses and gestures and transmitting them to other applications (such as AAASeed) for visualization.
## Features

- Pose detection using YOLOv11-Pose.
- Multi-user support for detecting and tracking multiple people in a camera feed.
- Gesture-based interactions (e.g., hand voting).
- OSC communication for sending keypoints and interaction data to other apps.
- User-friendly GUI built with Tkinter to:
  - Select the camera
  - Choose between modules (pose sender or hand voting)
  - View live annotated video
  - Monitor OSC messages and detection stats
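The core data path behind these features is: detect per-person keypoints, normalize and flatten them, then ship them over OSC. A minimal sketch of the flattening step (the function name and the [0, 1] normalization are illustrative assumptions, not the app's actual code):

```python
def flatten_keypoints(keypoints, frame_w, frame_h):
    """Normalize (x, y) pixel keypoints to [0, 1] and flatten them into
    the [x0, y0, x1, y1, ...] list an OSC message can carry.

    `keypoints` is a list of (x, y) pixel coordinates, e.g. the 17 COCO
    keypoints that YOLOv11-Pose predicts for each detected person.
    """
    flat = []
    for x, y in keypoints:
        flat.append(x / frame_w)
        flat.append(y / frame_h)
    return flat

# Two keypoints of one person seen by a 640x480 camera:
print(flatten_keypoints([(320, 240), (160, 120)], 640, 480))
# → [0.5, 0.5, 0.25, 0.25]
```

Normalizing to [0, 1] keeps the receiving application independent of the camera resolution.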
## Project Structure

```
multi-user-interaction-main/
│── run_app.py               # Main entry point for launching the GUI
│── requirements.txt         # Python dependencies
│── yolo11n-pose.pt          # YOLOv11-Pose model weights
│
└── gui_app/
    ├── multiuser_gui.py          # Tkinter GUI logic
    ├── osc_sender_yolov11.py     # Pose detection + OSC sender
    ├── hand_voting_yolov11.py    # Hand voting interaction module
    └── yolo11n-pose.pt           # Local copy of the model weights
```
## Installation

1. Clone this repository or download the ZIP and extract it:

   ```bash
   git clone https://github.com/johndim21/multi-user-interaction.git
   cd multi-user-interaction
   ```

2. Create and activate a virtual environment (recommended):

   ```bash
   python -m venv venv
   source venv/bin/activate    # On macOS/Linux
   venv\Scripts\activate       # On Windows
   ```

3. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```
## Usage

Make sure you have a working camera (webcam or IP camera), then start the application with:

```bash
python run_app.py
```
- **OSC Sender (Pose Detection):**
  Detects multiple users' body poses and sends keypoints over OSC.
- **Hand Voting Module:**
  Recognizes simple hand gestures for voting/interaction and sends the results via OSC.
- **Camera Selection:**
  Choose the camera source directly from the GUI.
The live video window will show detected users with keypoints and annotated gestures, while the right panel displays detection statistics and sent OSC messages.
## OSC Integration

- The modules send data via the OSC protocol.
- Default OSC settings can be configured in the code (`osc_sender_yolov11.py` or `hand_voting_yolov11.py`).
- Works seamlessly with AAASeed for real-time visualization.
## Requirements

- Python 3.9+
- Webcam or IP camera
- Dependencies listed in `requirements.txt` (YOLO, OpenCV, Tkinter, python-osc, etc.)
## Use Cases

- Interactive art installations
- Multi-user performances
- Live exhibitions and participatory experiences
- Research on natural multi-user interactions
## License

MIT License. See LICENSE for details.
## Acknowledgments

Developed as part of the Artcast4D Project (Work Package 2, Task 2.4, "Natural multi-user interaction").

This project has received funding from the European Union's Horizon Europe research and innovation programme under grant agreement No 101061163.

Powered by Ultralytics YOLO.