Forskningsveien 3A (map)
fourMs researchers will perform at the VERDIKT conference:
iPhone ensemble playing Bloom and Scrambler (for iPhone and small speakers). Featuring Alexander Refsum Jensenius, Kristian Nymoen, Anders Tveit, Arve Voldsund and Viet Phi Uy Hoang.
Dance Jockey by Yago de Quay and Ståle Skogstad (using Xsens inertial motion capture)
fourMs researchers will participate in the Department of Musicology's semester opening concert.
Kristian Nymoen, Anders Tveit, Alexander Refsum Jensenius: Bloom and Scrambler (for iPhone and small speakers)
Yago de Quay, Ståle Skogstad: Posture (with Xsens motion capture)
In preparation for next week's Motion Capture Workshop, we will give an introduction to using the NaturalPoint OptiTrack system currently set up in the fourMs Lab.
Recorded data and various analyses from this session can be downloaded by right-clicking on the links.
In the Dance Jockey project, we have used a full-body inertial motion capture system, the Xsens MVN suit, for musical interaction. The name Dance Jockey is a play on the well-known term Disc Jockey, or DJ: instead of using discs to perform music, we use dance, or full-body motion, as the basis for the performance.
Research fellow Kristian Nymoen will defend his dissertation on Friday 25 January 2013.
The MoCap Synthesiser is a set of generic tools for turning real-time motion tracking devices into musical instruments. It includes one feature extraction module and two sound modules.
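The idea of a feature extraction module feeding sound modules can be sketched roughly as follows. This is a minimal illustration only, assuming a hypothetical velocity-based "quantity of motion" feature mapped to a filter cutoff; the function and parameter names are illustrative and not the actual MoCap Synthesiser API.

```python
# Sketch: extract a motion feature from position frames and map it to a
# synthesis parameter. All names and ranges here are hypothetical.
import math

def quantity_of_motion(frames, dt=1 / 100):
    """Mean speed across successive 3-D position frames.

    Assumes a 100 Hz frame rate (dt = 10 ms between frames).
    """
    speeds = []
    for prev, curr in zip(frames, frames[1:]):
        speeds.append(math.dist(prev, curr) / dt)
    return sum(speeds) / len(speeds) if speeds else 0.0

def map_to_cutoff(qom, low=200.0, high=8000.0, max_qom=2.0):
    """Linearly map the motion feature (m/s) onto a filter cutoff range (Hz)."""
    clamped = min(qom, max_qom) / max_qom
    return low + clamped * (high - low)

# Three frames of a marker moving along the z-axis (metres):
frames = [(0.0, 0.0, 0.0), (0.0, 0.0, 0.01), (0.0, 0.0, 0.03)]
qom = quantity_of_motion(frames)   # mean speed: 1.5 m/s
cutoff = map_to_cutoff(qom)        # cutoff: 6050.0 Hz
```

In a real-time setting, such a feature would be computed over a sliding window of incoming frames and streamed to the sound modules each update.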
Postdoctoral researcher Kyrre Glette participated in (and won!) the 64kB intro competition at the Assembly computer festival in Helsinki. A 64kB intro is an executable program of at most 64kB that generates its graphics and music in real time.
The animation includes a dancing robot, where the motion is based on data recorded with our new Qualisys infrared motion capture system.
Graphics programming done by Kim Kalland, Thomas Kristensen and Kyrre Glette. Sound programming and music by Gergely Szelei-Kis.
fourMs researchers were strongly represented at the annual VERDIKT conference yesterday. In addition to a lecture and poster presentations, we also contributed a performance with the iPhone ensemble and a motion capture performance. More info below.
Abstract: In our own and other research on music-related actions, findings suggest that perceived action and sound are broken down into a series of chunks in people's minds when they perceive or imagine music. Chunks are here understood as holistically conceived and perceived fragments of action and sound, typically with durations in the 0.5 to 5 seconds range. There is also evidence suggesting the occurrence of coarticulation within these chunks, meaning the fusion of small-scale actions and sounds into more superordinate actions and sounds. Various aspects of chunking and coarticulation are discussed in view of their role in the production and perception of music, and it is suggested that coarticulation is an integral element of music and should be more extensively explored in the future.
Cynthia M. Grund, Network Coordinator for NNIMIPA, has posted a page with pictures and videos of the motion capture session done with American pianist William Westney during his visit to Oslo in February. There are also links to a video recording of a short discussion between Cynthia Grund, William Westney and Alexander Refsum Jensenius on some of the topics discussed during the NNIMIPA workshop in Oslo.
Our new motion capture system is presented in the Qualisys newsletter from May. We have been working with Qualisys to create an integrated solution for handling recording and streaming of music-related body movement data, and look forward to working with the new system in the coming years!