The Musical Gestures Toolbox (MGT) is a collection of tools for creating visualizations of video files. It was primarily developed for music researchers, but is also used for sports, dance, healthcare, architecture, and interaction design.
The Musical Gestures Toolbox has been developed by Alexander Refsum Jensenius since 2004. It started out as a patcher for the graphical multimedia programming environment Max/MSP/Jitter, and was quickly merged into modules and components in the Jamoma project. The modules have been used in a number of music and dance performances over the years.
When the course MUS2006 Music and Body Movements started at the University of Oslo in 2009, it became necessary to provide non-programming students with an easy way of creating video visualizations. The result was the standalone application VideoAnalysis, which received a major overhaul in 2020.
For scientific usage, functions from the original toolbox were ported to MGT for Matlab in 2015 and MGT for Python in 2019. With the need to run code on servers, there is now also a more limited MGT for Terminal. These scripting-based toolboxes differ slightly in implementation, but they share the goal of providing powerful yet easy-to-use operations for video visualization.
The software is maintained by the fourMs lab at RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo.
- Jensenius, A. R. (2018). Methods for studying music-related body motion. In R. Bader (Ed.), Handbook of Systematic Musicology (pp. 567–580). Springer-Verlag.
- Jensenius, A. R. (2018). The Musical Gestures Toolbox for Matlab. Proceedings of the International Society for Music Information Retrieval, Late-Breaking Demos.
- Jensenius, A. R. (2014). From experimental music technology to clinical tool. In K. Stensæth (Ed.), Music, health, technology, and design. Norwegian Academy of Music.
- Jensenius, A. R. (2013). Some video abstraction techniques for displaying body movement in analysis and performance. Leonardo, 46(1), 53–60.
- Jensenius, A. R. (2013). Non-Realtime Sonification of Motiongrams. Proceedings of Sound and Music Computing, 500–505.
- Jensenius, A. R., & Godøy, R. I. (2013). Sonifying the shape of human body motion using motiongrams. Empirical Musicology Review, 8(2), 73–83.
- Jensenius, A. R. (2012). Evaluating How Different Video Features Influence the Visual Quality of Resultant Motiongrams. Proceedings of the Sound and Music Computing Conference, 467–472.
- Jensenius, A. R. (2007). Action–Sound: Developing Methods and Tools to Study Music-Related Body Movement [PhD thesis, University of Oslo].
- Jensenius, A. R. (2006). Using motiongrams in the study of musical gestures. Proceedings of the International Conference on New Interfaces for Musical Expression, 499–502.