The Musical Gestures Toolbox for Python is a collection of tools for visualization and analysis of audio and video.
The easiest way to get started is to take a look at the Jupyter notebook MusicalGesturesToolbox, which shows examples of how to use the toolbox.
The standard installation is via pip: paste and execute the following command in the Terminal (macOS, Linux) or PowerShell (Windows):

```
pip install musicalgestures
```
MGT is developed in Python 3 and relies on FFmpeg and OpenCV. See the wiki documentation for more details on the installation process.
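Once installed, a quick sanity check is to import the package. This is just a minimal check; it assumes FFmpeg is available on your PATH, since the toolbox calls it under the hood:

```python
# Minimal import check: succeeds only if the package and its
# Python dependencies (such as OpenCV) were installed correctly.
import musicalgestures

print("musicalgestures imported successfully")
```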
Watch a 10-minute introduction to the toolbox:
MGT can generate both dynamic and static visualizations of video files, including motion videos, history videos, average images, motiongrams, and videograms. It can also extract various features from video files, including the quantity, centroid, and area of motion. The toolbox also integrates well with other libraries, such as OpenPose for skeleton tracking and Librosa for audio analysis. All the features are described in the wiki documentation.
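As a minimal sketch of a typical workflow (assuming a short video file dance.mp4 in the working directory; the MgVideo entry point and the motion(), average(), and motiongrams() methods follow the example notebook, so check the wiki for the exact names in your installed version):

```python
import musicalgestures

# Load a source video; MgVideo is the toolbox's main entry point.
video = musicalgestures.MgVideo('dance.mp4')

# Dynamic visualization: render a motion video highlighting
# frame-to-frame differences.
video.motion()

# Static visualizations: an average image and motiongrams.
video.average()
video.motiongrams()
```

The toolbox typically renders results as new media files alongside the source video, so the output can be inspected directly or embedded in a Jupyter notebook.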
This toolbox builds on the Musical Gestures Toolbox for Matlab, which in turn builds on the Musical Gestures Toolbox for Max.
The software is currently maintained by the fourMs lab at RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo.
If you use this toolbox in your research, please cite this article:
- Laczkó, B., & Jensenius, A. R. (2021). Reflections on the Development of the Musical Gestures Toolbox for Python. Proceedings of the Nordic Sound and Music Computing Conference, Copenhagen.
Developers: Balint Laczko, Joachim Poutaraud, Frida Furmyr, Marcus Widmer, Alexander Refsum Jensenius
This toolbox is released under the GNU General Public License v3.0.