Musical Gestures Toolbox for Python

The Musical Gestures Toolbox for Python is a collection of tools for video visualization and video analysis.

Musical Gestures Toolbox for Python running in a Jupyter Notebook.

Video recordings can be turned into new visualisations for analysis. The aim of creating such alternative displays is to uncover features, structures, and similarities within the material itself, and in relation to, for example, score material.

About MGT for Python

The Musical Gestures Toolbox for Python includes video visualization techniques such as creating motion videos, motion history images, and motiongrams. These visualizations allow for studying video recordings from different temporal and spatial perspectives. The toolbox also includes basic computer vision methods, and it is designed to integrate well with audio analysis toolboxes.

Usage

It is possible to run the toolbox from the terminal:

Example of running MGT for Python in a terminal.
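For instance, a minimal session could look like the following. This is a sketch that assumes the toolbox is installed from PyPI as musicalgestures and that MgVideo is the entry-point class; dance.avi is a placeholder file name:

    # start an interactive session, e.g. with ipython
    import musicalgestures

    # load a video file, trimmed to an excerpt from 5 to 15 seconds
    mg = musicalgestures.MgVideo('dance.avi', starttime=5, endtime=15)

    # render a motion video of the excerpt
    mg.motion()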

Many people would probably prefer to run it in a Jupyter notebook:

Screenshots from the example Jupyter Notebook.
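In a notebook, the rendered results can also be displayed inline. A sketch under the same assumptions as above (the show() call for inline display is likewise an assumption):

    import musicalgestures

    # load a video and convert to grayscale to speed up analysis
    mg = musicalgestures.MgVideo('dance.avi', color=False)

    # create a motion history video and display it in the notebook
    history = mg.history()
    history.show()  # assumes the returned object offers a show() method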

The MGT was initially developed to analyze music-related body motion (of musicians, dancers, and perceivers) but is equally helpful for other disciplines working with video recordings of humans, such as linguistics, pedagogy, psychology, and medicine.

Features

MGT can generate both dynamic and static visualizations, as well as some quantitative data (see the sketch after this list):

  • dynamic visualisations (video files)
    • motion video
    • motion history video
  • static visualisations (images)
    • motion average image
    • motiongrams
    • videograms
  • motion data (csv files)
    • quantity of motion
    • centroid of motion
    • area of motion
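As a sketch of how these outputs could be produced (same assumptions about package and class names as above; the method names mirror the feature list but should be checked against the toolbox documentation):

    import musicalgestures

    mg = musicalgestures.MgVideo('dance.avi', color=False)

    # dynamic visualisations (video files)
    mg.motion()      # motion video, plus motion-data CSV files
    mg.history()     # motion history video

    # static visualisations (images)
    mg.average()     # average image
    mg.videograms()  # horizontal and vertical videograms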

History

This toolbox builds on the Musical Gestures Toolbox for Matlab, which in turn builds on the Musical Gestures Toolbox for Max. The latest version was primarily developed by Bálint Laczkó, Frida Furmyr, and Marcus Widmer.

The software is currently maintained by the fourMs lab at RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo.

Citation

If you use the toolbox in your research, please cite:
