Webpages tagged with «max»

Published July 30, 2013 09:05 PM

Alexander Refsum Jensenius will present a poster of the paper Non-Realtime Sonification of Motiongrams at the Sound and Music Computing Conference (SMC) in Stockholm today.

Published Apr. 6, 2013 08:50 PM

Use any standard game controller (or other Human Interface Device) as a multipurpose music controller.
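The application maps raw controller readings onto musical control values. As a rough illustration of that kind of mapping (a hypothetical sketch, not the application's actual code), a joystick axis in the range [-1, 1] can be rescaled to a 7-bit MIDI controller value:

```python
def axis_to_cc(value, lo=-1.0, hi=1.0):
    """Map a HID axis reading in [lo, hi] to a MIDI CC value 0-127.

    Readings outside the range are clamped first, so a noisy or
    miscalibrated axis never produces an out-of-range CC value.
    """
    value = min(max(value, lo), hi)
    return round((value - lo) / (hi - lo) * 127)

# Example: centre stick -> mid-range CC, full deflection -> extremes
print(axis_to_cc(-1.0), axis_to_cc(0.0), axis_to_cc(1.0))
```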

Published Aug. 25, 2012 11:44 AM

Jamoma is an open-source project for structured programming in Max/MSP/Jitter, based on modular principles that allow functionality to be reused while all parameters remain customizable to specific needs. We are happy to announce that Jamoma 0.5 has now been released. Postdoc Alexander Refsum Jensenius and research fellow Kristian Nymoen have contributed to Jamoma over the past several years, and fourMs hosted a Jamoma developer workshop in the fall of 2008.

Published Aug. 25, 2012 11:44 AM

Master's student Boye Riis jr. defended his MA thesis project in musicology today. The project resulted in the development of several digital music instruments and a dissertation entitled "eBoy". [PDF]


Published Aug. 22, 2012 04:01 PM

The Jamoma GDIF tools are developed for recording and playing back GDIF files based on the SDIF format.

The tools work in Max 5 and require FTM (latest version), Jamoma, and OpenSoundControl to run.

Published Aug. 22, 2012 04:01 PM

This is an application for realtime analysis of audio and video. It draws a spectrogram from any connected microphone and motiongram/videogram from any connected camera.
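The spectrogram half of the analysis is a standard short-time Fourier transform: slide a window along the audio signal, take the FFT of each windowed frame, and stack the magnitude spectra over time. A minimal sketch of that idea in Python (an illustration only, not the application's implementation):

```python
import numpy as np

def spectrogram(signal, n_fft=256, hop=128):
    """Magnitude spectrogram via a sliding-window FFT.

    Each frame of n_fft samples is tapered with a Hann window to reduce
    spectral leakage; frames start every `hop` samples, and the real FFT
    of each frame gives one column of the spectrogram.
    """
    window = np.hanning(n_fft)
    frames = [signal[i:i + n_fft] * window
              for i in range(0, len(signal) - n_fft + 1, hop)]
    return np.abs(np.fft.rfft(frames, axis=1))

# Example: a 1 kHz sine at 8 kHz sample rate peaks in FFT bin 32,
# since bin k corresponds to k * 8000 / 256 = 31.25 k Hz.
sr = 8000
t = np.arange(sr) / sr
spec = spectrogram(np.sin(2 * np.pi * 1000 * t))
print(spec.shape, spec[0].argmax())
```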

Published Aug. 22, 2012 04:01 PM

The Musical Gestures Toolbox is a collection of modules and abstractions developed in and for the graphical programming environment Max 5. The toolbox is currently being developed within Jamoma, an open platform for interactive art-based research and performance. It is probably most useful for people who are already experienced Max programmers; people looking for similar functionality should check out some of the standalone applications we have built on top of the toolbox.

Jamoma download page

Published Aug. 22, 2012 04:01 PM

Non-realtime video analysis, exporting motiongrams and various quantitative features of movement in the video file.
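A motiongram is built by frame differencing: subtract consecutive frames to get motion images, then collapse each motion image to a single line by averaging over one spatial axis, and stack the lines over time. As a rough numerical sketch of that reduction (an illustration under assumed array conventions, not the application's code):

```python
import numpy as np

def motiongram(frames, axis=1):
    """Compute a motiongram from a sequence of grayscale frames.

    frames: array of shape (time, rows, cols).
    The motion image is the absolute difference between consecutive
    frames; averaging over one spatial axis (axis=1 -> average over
    rows of each motion image... here axis selects which spatial axis
    to KEEP collapsed: axis=1 averages over the second spatial axis)
    yields one line per frame pair, stacked into a 2-D image over time.
    """
    frames = np.asarray(frames, dtype=float)
    motion = np.abs(np.diff(frames, axis=0))   # frame differencing
    return motion.mean(axis=axis + 1)          # collapse one spatial axis

# Toy example: 10 frames of 8x8 "video" with a bright pixel moving down
frames = np.zeros((10, 8, 8))
for t in range(10):
    frames[t, t % 8, 3] = 255.0
mg = motiongram(frames, axis=1)   # shape (9, 8): time x rows
print(mg.shape)
```

Each row of `mg` summarizes where motion occurred in the corresponding frame pair, which is what makes the image readable as a trace of movement over time.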

Published Aug. 7, 2012 10:22 AM

On the Development of an Auditory Virtual Environment for Musical Applications

In the last few decades, the development of audio reproduction and spatialization techniques has benefited greatly from composers whose pioneering work still inspires researchers to refine spatial audio systems (e.g., Stockhausen, Chowning, Boulez). However, novel spatialization tools developed by engineers and researchers rarely find their way from the developers' labs into the composition studios. To make future developments more applicable, researchers have to understand the current disconnect between development and artistic use.

In this talk, results of a quantitative study are first presented, showing how composers use spatialization, which spatial aspects are essential, and which functionalities spatial audio systems should strive to include or improve. Secondly, ViMiC (Virtual Microphone Control), a novel spatial rendering software package, is presented. ViMiC provides a computer-generated virtual environment for creating spatial sound scenes. Apart from positioning sound sources, other spatial aspects, such as source width, distance, and room impression, can be controlled in real time, particularly for concert situations and site-specific immersive installations.

Published July 23, 2012 03:29 PM

We are happy to announce a public beta of our GDIF recording and playback modules for Max 5. They are developed as Jamoma modules, using the FTM library from IRCAM for writing and reading the files. The files can be checked out from the Jamoma UserLib, or downloaded here.


Published July 23, 2012 03:29 PM

Alexander Refsum Jensenius recently presented the paper “Motion-sound Interaction Using Sonification based on Motiongrams” at the ACHI 2012 conference in Valencia, Spain.

Abstract: The paper presents a method for sonification of human body motion based on motiongrams. Motiongrams show the spatiotemporal development of body motion by plotting average matrices of motion images over time. The resulting visual representation resembles a spectrogram, and is treated as such by the new sonifyer module for Jamoma for Max, which turns motiongrams into sound by reading part of the matrix and passing it on to an oscillator bank. The method is surprisingly simple, and has proven useful both for analytical applications and in interactive music systems.

See below for the full paper and video examples.
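The core idea of reading a motiongram like an inverse spectrogram can be sketched in a few lines: assign each row of a motiongram column its own oscillator, let the pixel value set that oscillator's amplitude, and sum the bank. This is a hypothetical illustration of the principle, not the Jamoma sonifyer module (which is a Max patch); the frequency range and duration are arbitrary choices:

```python
import numpy as np

def sonify_column(column, sr=44100, dur=0.05, fmin=100.0, fmax=5000.0):
    """Turn one motiongram column into a short audio buffer.

    Each row is given a sinusoidal oscillator with a log-spaced
    frequency (low rows -> low pitches); the pixel value sets the
    oscillator's amplitude, and the bank is summed and normalized.
    """
    column = np.asarray(column, dtype=float)
    n = len(column)
    freqs = np.geomspace(fmin, fmax, n)
    t = np.arange(int(sr * dur)) / sr
    peak = column.max()
    amps = column / peak if peak > 0 else column
    # Sum the oscillator bank: one sinusoid per motiongram row
    audio = sum(a * np.sin(2 * np.pi * f * t) for a, f in zip(amps, freqs))
    return audio / max(n, 1)

buf = sonify_column([0.0, 0.5, 1.0, 0.2])
print(buf.shape)
```

Sweeping a read position across the motiongram and concatenating (or cross-fading) the resulting buffers yields the continuous sonification described in the paper.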

Published July 13, 2012 12:51 PM

New fourMs publications at the Sound and Music Computing conference in Copenhagen:
