Home
The Musical Gestures Toolbox for Python is a collection of high-level modules targeted at researchers working with video recordings. It includes visualization techniques such as motion videos, motion history images, and motiongrams, which in different ways allow for looking at video recordings from various temporal and spatial perspectives. It also includes basic computer vision analysis, such as extracting the quantity and centroid of motion and using such features in further analysis.
The toolbox was initially developed to analyze music-related body motion (of musicians, dancers, and perceivers) but is equally helpful for other disciplines working with video recordings of humans, such as linguistics, pedagogy, psychology, and medicine.
We've recently added a complete set of modern examples and enhanced documentation:
- Advanced Video Processing - Complete motion analysis workflow with custom preprocessing
- Audio Analysis - Waveforms, spectrograms, and audio features
- Batch Processing - Process multiple videos efficiently
- Pose Estimation - Human pose tracking and analysis
The Musical Gestures Toolbox contains functions to analyze and visualize video, audio, and motion capture data. There are three categories of functions:
- Preprocessing (trimming, cropping, color adjustments, etc.)
- Visualization (video playback, image display, plotting)
- Processing (videograms, average images, motion images, etc.)
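A typical session combines the three categories in sequence: preprocess on load, then run a processing function. The sketch below reflects the toolbox's documented interface, but treat `MgVideo` and the parameter names as assumptions and check the wiki pages for the actual API:

```python
# Sketch of a preprocess -> process workflow with the toolbox.
# MgVideo and its parameters are assumptions based on the documented
# function categories; verify against the Loading Videos wiki page.
try:
    from musicalgestures import MgVideo
except ImportError:
    MgVideo = None  # toolbox not installed; this remains a sketch


def motion_workflow(path):
    """Trim a recording on load, then render a motion video."""
    if MgVideo is None:
        return None
    # Preprocessing: trim to the 5-15 s segment while loading.
    video = MgVideo(path, starttime=5, endtime=15)
    # Processing: motion video plus quantity/centroid-of-motion data.
    return video.motion()


# Example (requires the toolbox and a video file):
# result = motion_workflow("dance.avi")
```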
- For Quick Start: Check out our new examples - modern Python scripts you can run immediately
- For Learning: Use the traditional Jupyter Notebook or run it in Colab
- For Reference: Browse the detailed function documentation in this wiki
This wiki provides detailed documentation for individual MGT functions:
- Installation - Setup instructions for all platforms
- Video Basics - Understanding video processing concepts
- Loading Videos - How to load and display videos
- Preprocessing - Video preprocessing techniques
- Video Analysis - Motion analysis and visualization
- Audio Analysis - Audio processing functions
- Output Management - Working with results
- Filtering Effects - Understanding filter parameters
- Function Chaining - Combining multiple operations
- File Naming - Output file conventions
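The File Naming page documents how output files are named; the helper below illustrates the general idea of suffix-based output names. The suffix rule shown is an assumption for illustration, not the toolbox's own function, so consult that page for the actual convention:

```python
import os


def mg_output_name(path, suffix, ext=None):
    """Derive an output filename by appending a _suffix to the base name,
    keeping the original extension unless a new one is given. This is an
    illustrative helper, not part of the toolbox's API."""
    base, orig_ext = os.path.splitext(path)
    return f"{base}_{suffix}{ext or orig_ext}"


print(mg_output_name("dance.avi", "motion"))              # dance_motion.avi
print(mg_output_name("dance.avi", "motiongram", ".png"))  # dance_motiongram.png
```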
The speed and efficiency of the MGT are made possible by the excellent FFmpeg project. Many of the toolbox functions are Python wrappers around FFmpeg commands executed in a subprocess.
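The subprocess pattern can be sketched as follows. The FFmpeg flags are standard, but the helper itself is illustrative and not a toolbox function:

```python
import subprocess


def ffmpeg_trim_cmd(infile, outfile, start, duration):
    """Build an FFmpeg command for a stream-copy trim, the kind of call
    the toolbox issues in a subprocess. Illustrative helper only."""
    return [
        "ffmpeg", "-y",     # overwrite the output file without asking
        "-ss", str(start),  # seek to start time (seconds)
        "-t", str(duration),  # keep this many seconds
        "-i", infile,
        "-c", "copy",       # copy streams without re-encoding
        outfile,
    ]


cmd = ffmpeg_trim_cmd("dance.avi", "dance_trim.avi", 5, 10)
# subprocess.run(cmd, check=True)  # uncomment to execute (requires FFmpeg)
```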
Please help improve the toolbox by reporting bugs and submitting feature requests in the issues section.
A project from the fourMs Lab, RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion, Department of Musicology, University of Oslo.