Alexander Refsum Jensenius recently presented the paper “Motion-sound Interaction Using Sonification based on Motiongrams” at the ACHI 2012 conference in Valencia, Spain.
Abstract: The paper presents a method for sonification of human body motion based on motiongrams. Motiongrams show the spatiotemporal development of body motion by plotting average matrices of motion images over time. The resulting visual representation resembles a spectrogram, and is treated as such by the new sonifyer module for Jamoma for Max, which turns motiongrams into sound by reading a part of the matrix and passing it on to an oscillator bank. The method is surprisingly simple, and has proven useful in both analytical applications and interactive music systems.
See below for the full paper and video examples.
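The core idea from the abstract, treating each motiongram column as a spectrogram frame and driving an oscillator bank with it, can be sketched as follows. This is a simplified illustration of the general technique, not the actual Jamoma sonifyer code; all function names, the frequency range, and the row-to-frequency mapping are assumptions made for the example.

```python
import numpy as np

def sonify_column(column, duration=0.05, sr=44100,
                  f_min=100.0, f_max=8000.0):
    """Turn one motiongram column into a short audio frame.

    Each row of the column is treated like a spectrogram bin:
    its brightness sets the amplitude of one oscillator, whose
    frequency is mapped from the image row to the f_min..f_max
    range. (Illustrative sketch only, not the Jamoma module.)
    """
    n_bins = len(column)
    t = np.arange(int(duration * sr)) / sr
    # Map rows to frequencies (top image row -> highest frequency).
    freqs = np.linspace(f_max, f_min, n_bins)
    frame = np.zeros_like(t)
    for amp, f in zip(column, freqs):
        frame += amp * np.sin(2 * np.pi * f * t)
    # Normalize to avoid clipping.
    peak = np.max(np.abs(frame))
    return frame / peak if peak > 0 else frame

def sonify_motiongram(motiongram, **kw):
    """Concatenate one audio frame per motiongram column."""
    return np.concatenate(
        [sonify_column(motiongram[:, i], **kw)
         for i in range(motiongram.shape[1])])
```

Scanning the motiongram column by column in this way yields one short audio frame per time step, so the sound follows the temporal development of the motion directly.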
We are very happy to announce that the fourMs proposal to become a Norwegian Centre of Excellence has been selected as a finalist. The second round will be decided in the autumn.
fourMs researchers are presenting four papers at this year's NIME conference in Ann Arbor, Michigan. See below for details:
We believe in the importance of passing on an interest in music technology to coming generations. Last month, Kristian visited an elementary school to do just that.
The pupils formed a laptop orchestra and rehearsed the piece Clix by Ge Wang. Furthermore, they explored sound interaction with the Musical Gestures Toolbox and sensor interfaces such as the "music snake".