Musical Gestures Toolbox

The Musical Gestures Toolbox (MGT) is a collection of tools for creating visualizations of video files. It was primarily developed for music researchers, but has also been used in research on sports, dance, healthcare, architecture, and interaction design.



If you just want a simple solution for creating video visualizations, check out VideoAnalysis, a standalone app for macOS and Windows. The three toolboxes below all require that you know some programming (Max, Matlab, or Python).

Musical Gestures Toolbox for Max

The original MGT was a collection of modules for Max/MSP/Jitter. The toolbox is currently distributed as part of the Jamoma package and supports both real-time and non-real-time processing.

Musical Gestures Toolbox for Matlab

The Matlab port of MGT contains a large collection of video visualization functions and supports batch and server processing. It also integrates with the MIR Toolbox and MoCap Toolbox.

Musical Gestures Toolbox for Python

The Python port of MGT is the newest edition and the one in most active development. Most of the functionality of the Matlab version has been ported, and the toolbox can be run from a Jupyter Notebook.
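As a rough illustration of what a typical session might look like, here is a minimal sketch assuming the `musicalgestures` package from PyPI and a placeholder video file name; method names are indicative and may differ between versions, so consult the toolbox documentation for the current API.

```python
# Sketch of a typical Musical Gestures Toolbox for Python session.
# Assumes: `pip install musicalgestures` and a local video file
# ("dance.mp4" is a placeholder, not a bundled example).
import musicalgestures

# Load the source video into an MGT video object.
mg = musicalgestures.MgVideo("dance.mp4")

# Render common visualizations to new video/image files
# alongside the source file.
mg.motion()       # motion video and quantity-of-motion data
mg.motiongrams()  # horizontal and vertical motiongrams
```

In a Jupyter Notebook the resulting videos and images can be displayed inline, which makes the toolbox convenient for exploratory analysis.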


The Musical Gestures Toolbox has been developed by Alexander Refsum Jensenius since 2004. It started out as a Max patch and was later ported to, and further developed in, both Matlab and Python. A number of students and colleagues have contributed to various parts of the code over the years.

The software is currently maintained by the fourMs lab at RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo.


This software is open source and is shared under the GNU General Public License v3.0.