Webpages tagged with «publication»

Published July 30, 2013 9:05 PM

Alexander Refsum Jensenius will present a poster on the paper "Non-Realtime Sonification of Motiongrams" at the Sound and Music Computing Conference (SMC) in Stockholm today.

Published June 3, 2013 2:31 PM

The paper "Analyzing correspondence between sound objects and body motion" by Kristian Nymoen, Rolf Inge Godøy, Alexander Refsum Jensenius and Jim Tørresen has now been published in ACM Transactions on Applied Perception.

Published Jan. 14, 2013 8:27 PM

Senior research fellow Alexander Refsum Jensenius has just published a paper entitled “Some video abstraction techniques for displaying body movement in analysis and performance” in the MIT Press journal Leonardo.

Published Jan. 8, 2013 4:41 PM

Senior research fellow Alexander Refsum Jensenius has just published a paper in Computer Music Journal together with Victoria Johnson. The paper is based on the experience of developing the piece Transformation for electric violin and live electronics (see video of the piece below).

Published Aug. 25, 2012 11:44 AM

We are happy to announce the publication of our new book: 

Rolf Inge Godøy and Marc Leman, eds. (2010). Musical Gestures: Sound, Movement, and Meaning. New York: Routledge.

Summary: We experience and understand the world, including music, through body movement: when we hear something, we make sense of it by relating it to our own body movements, or by forming an image of body movements in our minds. Musical Gestures is a collection of essays that explore the relationship between sound and movement. It takes an interdisciplinary approach to the fundamental issues of the subject, drawing on ideas, theories and methods from disciplines such as musicology, music perception, human movement science, cognitive psychology, and computer science.

Published Aug. 25, 2012 11:44 AM

Postdoctoral researcher Alexander Refsum Jensenius has just published the book "Musikk og bevegelse" (Music and Movement). This is a textbook in Norwegian giving an overview of the theory and methods used in the study of music-related movement.

Published Aug. 25, 2012 11:44 AM

Professor Rolf Inge Godøy is at the 10th International Society for Music Information Retrieval Conference in Kobe, Japan, presenting the paper: 

Godøy, R. I. and Jensenius, A. R. (2009). Body movement in music information retrieval. In Proceedings of the 10th International Society for Music Information Retrieval Conference, Kobe, Japan. [PDF]

Published Aug. 21, 2012 3:05 PM

We are happy to announce the publication of a new journal article in collaboration with our colleagues in Trondheim:   

Adde L, Helbostad JL, Jensenius AR, Taraldsen G, Grunewaldt KH, Støen R. 2010. Early prediction of cerebral palsy by computer-based video analysis of general movements: a feasibility study. Developmental Medicine & Child Neurology.

Published Aug. 21, 2012 3:05 PM

We have a paper entitled "fourMs, University of Oslo – Lab Report" in the Proceedings of the 2010 International Computer Music Conference, New York, NY, 1–5 June 2010. [PDF]

Authors: Jensenius, A.R., Glette, K., Godøy, R.I., Høvin, M., Nymoen, K., Skogstad, S.A., Tørresen, J. (2010)

Abstract: The paper reports on the development and activities in the recently established fourMs lab (Music, Mind, Motion, Machines) at the University of Oslo, Norway. As a meeting place for researchers in music and informatics, the fourMs lab is centred around studies of basic issues in music cognition, machine learning and robotics.

Published Aug. 21, 2012 3:05 PM

We are happy to announce that fourMs researchers are involved in no fewer than five papers at the upcoming NIME 2010 conference in Sydney, Australia. Please see below for details.

Published Aug. 21, 2012 3:05 PM

Publication: Godøy, R. I., Jensenius, A. R., and Nymoen, K. (2010). Chunking in music by coarticulation. Acta Acustica united with Acustica, 96(4):690–700. [Fulltext]

Abstract: In our own and other research on music-related actions, findings suggest that perceived action and sound are broken down into a series of chunks in people's minds when they perceive or imagine music. Chunks are here understood as holistically conceived and perceived fragments of action and sound, typically with durations in the 0.5 to 5 seconds range. There is also evidence suggesting the occurrence of coarticulation within these chunks, meaning the fusion of small-scale actions and sounds into more superordinate actions and sounds. Various aspects of chunking and coarticulation are discussed in view of their role in the production and perception of music, and it is suggested that coarticulation is an integral element of music and should be more extensively explored in the future.

Published July 23, 2012 3:29 PM

Visiting researcher Yago de Quay, fourMs PhD student Ståle Skogstad, and postdoctoral fellow Alexander Refsum Jensenius have published a paper in Leonardo Music Journal:

ABSTRACT: The authors present an experimental musical performance called Dance Jockey, wherein sounds are controlled by sensors on the dancer's body. These sensors manipulate music in real time by acquiring data about body actions and transmitting the information to a control unit that makes decisions and gives instructions to audio software. The system triggers a broad range of music events and maps them to sound effects and musical parameters such as pitch, loudness and rhythm.
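The control logic sketched in the abstract (sensor data in, decisions and sound parameters out) can be illustrated with a small toy mapping. This is only a hypothetical sketch, not the paper's actual system: the field names, thresholds, and scaling are all invented for illustration.

```python
def map_sensors(sample):
    """Toy control-unit logic: map one frame of (hypothetical) sensor
    data to sound parameters, in the spirit of the system described."""
    # Overall acceleration magnitude from a body-worn sensor (m/s^2).
    energy = (sample["ax"] ** 2 + sample["ay"] ** 2 + sample["az"] ** 2) ** 0.5
    return {
        # Hand height (normalized 0..1) mapped to pitch as a MIDI note, 48..84.
        "pitch": 48 + 36 * sample["hand_height"],
        # Acceleration magnitude mapped to loudness (0..1, clipped).
        "loudness": min(energy / 20.0, 1.0),
        # A sufficiently strong jerk triggers a discrete music event.
        "trigger": energy > 15.0,
    }

# One frame of made-up sensor readings:
params = map_sensors({"ax": 3.0, "ay": 4.0, "az": 12.0, "hand_height": 0.5})
```

In a real-time setting a function like this would run per incoming sensor frame, with the resulting parameters sent on (e.g. via OSC) to the audio software.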

Published July 23, 2012 3:29 PM

Alexander Refsum Jensenius recently presented the paper “Motion-sound Interaction Using Sonification based on Motiongrams” at the ACHI 2012 conference in Valencia, Spain.

Abstract: The paper presents a method for sonification of human body motion based on motiongrams. Motiongrams show the spatiotemporal development of body motion by plotting average matrices of motion images over time. The resultant visual representation resembles spectrograms, and is treated as such by the new sonifyer module for Jamoma for Max, which turns motiongrams into sound by reading a part of the matrix and passing it on to an oscillator bank. The method is surprisingly simple, and has proven to be useful for analytical applications and in interactive music systems.

See below for the full paper and video examples.
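The pipeline the abstract describes (frame differences averaged into a motiongram, then read like a spectrogram by an oscillator bank) can be sketched in NumPy. This is a rough stand-in for illustration only, not the actual Jamoma sonifyer module; the array sizes, frequency range, and column duration are all assumptions.

```python
import numpy as np

def motiongram(frames):
    """Reduce a grayscale video (T, H, W) to a motiongram (H, T-1):
    frame-difference 'motion images' averaged over the horizontal axis."""
    motion = np.abs(np.diff(frames.astype(float), axis=0))  # (T-1, H, W)
    return motion.mean(axis=2).T                            # (H, T-1)

def sonify(mgram, sr=44100, col_dur=0.05, fmin=100.0, fmax=4000.0):
    """Treat the motiongram like a spectrogram: each row drives one
    oscillator (top of the image = highest frequency), each column
    lasts col_dur seconds, and pixel intensity sets amplitude."""
    n_rows, n_cols = mgram.shape
    freqs = np.geomspace(fmax, fmin, n_rows)     # one frequency per row
    n = int(sr * col_dur)                        # samples per column
    t = np.arange(n_cols * n) / sr
    amps = np.repeat(mgram, n, axis=1)           # hold each column's amplitudes
    audio = (amps * np.sin(2 * np.pi * freqs[:, None] * t)).sum(axis=0)
    peak = np.abs(audio).max()
    return audio / peak if peak > 0 else audio

# Tiny synthetic example: a bright horizontal line moving down the frame,
# which should produce a falling sweep when sonified.
rng = np.random.default_rng(0)
video = rng.random((20, 32, 32)) * 0.05
for i in range(20):
    video[i, i, :] = 1.0
mg = motiongram(video)
audio = sonify(mg)
```

The point of the reduction is that a motiongram, like a spectrogram, is a 2D intensity image over time, so any inverse-spectrogram technique (here, a plain sine bank) can turn it into sound.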

Published July 23, 2012 3:29 PM

Rolf Inge Godøy has published the chapter "Sound-action chunks in music" in the new Springer-Verlag book Musical Robots and Interactive Multimodal Systems, edited by Kia Ng and Jorge Solis.
