MICRO - Human Bodily Micromotion in Music Perception and Interaction
How and why do we move to music? The project investigates how music affects what we call micromotion, such as the tiny movements we make when standing still.
You actually cannot stand still!
Not moving to dance music is next to impossible, new research shows. Read the news story about the MICRO project.
The urge to dance is a well-known phenomenon, but only now have researchers shown that people actually start moving to music even when they try to stand still. Photo: Brooke Cagle/Unsplash.
About the project
How and why do we move to music? This has been a central question in musicology and music psychology in recent years. Most research has so far focused on relatively large body movements, such as dancing. This project investigates how music affects what we call micromotion, such as the tiny movements we make when standing still. Although such micromotion is barely visible, it can be observed in a motion capture laboratory. This makes it possible to run studies of the effect of music on micromotion.
NRK Dagsrevyen featured the MICRO project in July 2020:
The project will lead to:
knowledge about how music affects human movement at a micro level
a large, open database of micromotion recordings
software for testing the use of micromotion in interactive music systems
The project builds on the latest research in musicology, psychology, and neuroscience. Most of the research took place in the motion capture lab at the Department of Musicology.
Open research
MICRO aims to be a flagship project for open research. The goal is to make as much as possible openly available, and to keep things closed only where necessary. This includes open sharing of publications, data, source code, the project proposal, and other parts of the research process.
Jensenius, Alexander Refsum & Erdem, Cagri
(2022).
Gestures in ensemble performance.
In Timmers, Renee; Bailes, Freya & Daffern, Helena (Eds.),
Together in Music: Coordination, expression, participation.
Oxford University Press.
ISBN 9780198860761.
Recent years have seen a rise in interest, from a diversity of fields, in the musical ensemble as an exemplary form of creative group behavior. Musical ensembles can be understood and investigated as high functioning small group organizations that have coordinative structures in place to perform under pressure within strict temporal boundaries. Rehearsals and performances exemplify fruitful contexts for emergent creative behaviour, where novel musical interpretations are negotiated and discovered through improvisatory interaction. Furthermore, group music-making can be an emotionally and socially rewarding experience that enables positive outcomes for wellbeing and development.
This book brings together these different perspectives into one coherent volume, offering insight into the musical ensemble from different analytical levels. Part 1 starts from the meso-level, considering ensembles as creative teams and investigating how musical groups interact at a social and organizational level. Part 2 then zooms in to consider musical coordination and interaction at a micro-level, when considering group music-making as forms of joint action. Finally, a macro-level perspective is taken in Part 3, examining the health and wellbeing affordances associated with acoustical, expressive, and emotional joint behavior. Each part contains a balance of review chapters showcasing the most recent developments in each area of research, followed by demonstrative case studies featuring various ensemble practices and processes.
A rich and multidisciplinary reflection on ensemble music practice, this volume will be an insightful read for music students, teachers, academics, and professionals with an interest in the dynamics of group behavior within a musical context.
Jensenius, Alexander Refsum
(2021).
Best versus Good Enough Practices for Open Music Research.
Empirical Musicology Review.
ISSN 1559-5749.
16(1).
Music researchers work with increasingly large and complex data sets. There are few established data handling practices in the field and several conceptual, technological, and practical challenges. Furthermore, many music researchers are not equipped for (or interested in) the craft of data storage, curation, and archiving. This paper discusses some of the particular challenges that empirical music researchers face when working towards Open Research practices: handling (1) (multi)media files, (2) privacy, and (3) copyright issues. These are exemplified through MusicLab, an event series focused on fostering openness in music research. It is argued that the "best practice" suggested by the FAIR principles is too demanding in many cases, but "good enough practice" may be within reach for many. A four-layer data handling "recipe" is suggested as concrete advice for achieving "good enough practice" in empirical music research.
Masu, Raul; Melbye, Adam Pultz; Sullivan, John & Jensenius, Alexander Refsum
(2021).
NIME and the Environment: Toward a More Sustainable NIME Practice.
In Dannenberg, Roger & Xiao, Xiao (Eds.),
Proceedings of the International Conference on New Interfaces for Musical Expression.
The International Conference on New Interfaces for Musical Expression.
ISSN 2220-4792.
Laczkó, Bálint & Jensenius, Alexander Refsum
(2021).
Reflections on the Development of the Musical Gestures Toolbox for Python.
In Kantan, Prithvi Ravi; Paisa, Razvan & Willemsen, Silvin (Eds.),
Proceedings of the Nordic Sound and Music Computing Conference.
Aalborg University Copenhagen.
The 2nd Nordic Sound and Music Computing Conference will take place online on 11-12 November 2021, and will be organized by the Sound and Music Computing group at the Department of Architecture, Design and Media Technology at Aalborg University, Copenhagen. The conference is free of charge, both for submitting authors and other attendees.
NordicSMC 2021 welcomes both scientific and music-based contributions that lie within the scope of the Sound and Music Computing (SMC) field, especially those relevant to the featured topic of this year’s conference:
Interdisciplinary Perspectives on Research in Sound and Music Computing.
Through this topic, we aim to foster SMC research by bringing together the promising new generation of talented young researchers and students. Our goal is to provide them with the full online conference experience in the presence of some of the most renowned veterans of the field. As such, we strongly encourage submissions from PhD students, master students, advanced bachelor students and early stage researchers, although all within the SMC community are most welcome to submit and participate.
The conference will include a hands-on workshop on the JUCE framework with Silvin Willemsen, as well as a Keynote by Ludvig Elblaus and a panel discussion on Rigorous Empirical Evaluation of SMC Research moderated by Sofia Dahl.
Zelechowska, Agata; Gonzalez-Sanchez, Victor E.; Laeng, Bruno & Jensenius, Alexander Refsum
(2020).
Headphones or Speakers? An Exploratory Study of Their Effects on Spontaneous Body Movement to Rhythmic Music.
Frontiers in Psychology.
ISSN 1664-1078.
11(698).
doi: 10.3389/fpsyg.2020.00698.
Bishop, Laura & Jensenius, Alexander Refsum
(2020).
Reliability of two infrared motion capture systems in a music performance setting.
In Spagnol, Simone & Valle, Andrea (Eds.),
Proceedings of the 17th Sound and Music Computing Conference.
Axea sas/SMC Network.
ISBN 978-88-945415-0-2.
Erdem, Cagri; Jensenius, Alexander Refsum; Glette, Kyrre; Krzyzaniak, Michael Joseph & Veenstra, Frank
(2020).
Air-Guitar Control of Interactive Rhythmic Robots.
Proceedings of the International Conference on Live Interfaces (Proceedings of ICLI).
ISSN 2663-9041. pp. 208–210.
After producing ground-breaking computer-based tools to advance the study of human movement, such as the video-visualization techniques contained in the Musical Gestures Toolbox, Alexander Refsum Jensenius has continued to find more creative and analytical possibilities to intersect our understandings of music and dance. In the current context of technology-assisted misappropriation of traditional songs and dances, I interviewed the Deputy Director of the RITMO Centre on how we might revert the link between new technologies and intangible cultural heritage for the benefit of legitimate bearers.
Furthermore, in this interview, Alexander outlines the embodied and interdisciplinary approach towards music that has grounded the course of his career. Even more interestingly, he offers insights about the future of experiencing dance through technology and the possibility of dancing robots.
Xambó, Anna; Støckert, Robin; Jensenius, Alexander Refsum & Saue, Sigurd
(2020).
Learning to Code Through Web Audio: A Team-Based Learning Approach.
Journal of The Audio Engineering Society.
ISSN 1549-4950.
68(10),
s. 727–737.
doi: 10.17743/jaes.2020.0019.
In this article, we discuss the challenges and opportunities provided by teaching programming using web audio technologies and adopting a team-based learning (TBL) approach among a mix of colocated and remote students, mostly novices in programming. The course has been designed for cross-campus teaching and teamwork, in alignment with the two-city master's program in which it has been delivered. We present the results and findings from (1) students' feedback; (2) software complexity metrics; (3) students' blog posts; and (4) teacher's reflections. We found that the nature of web audio as a browser-based environment, coupled with the collaborative nature of the course, was suitable for improving the students' level of confidence about their abilities in programming. This approach promoted the creation of group course projects of a certain level of complexity, based on the students' interests and programming levels. We discuss the challenges of this approach, such as supporting smooth cross-campus interactions and assuring students' preknowledge in web technologies (HTML, CSS, and JavaScript) for an optimal experience. We conclude by envisioning the scalability of this course to other distributed and remote learning scenarios in academic and professional settings. This is in line with the foreseen future scenario of cross-site interaction mediated through code.
Zelechowska, Agata; Gonzalez Sanchez, Victor Evaristo; Laeng, Bruno; Vuoskoski, Jonna Katariina & Jensenius, Alexander Refsum
(2020).
Who Moves to Music? Empathic Concern Predicts Spontaneous Movement Responses to Rhythm and Music.
Music & Science.
ISSN 2059-2043.
3.
doi: 10.1177/2059204320974216.
Moving to music is a universal human phenomenon, and previous studies have shown that people move to music even when they try to stand still. However, are there individual differences when it comes to how much people spontaneously respond to music with body movement? This article reports on a motion capture study in which 34 participants were asked to stand in a neutral position while listening to short excerpts of rhythmic stimuli and electronic dance music. We explore whether personality and empathy measures, as well as different aspects of music-related behaviour and preferences, can predict the amount of spontaneous movement of the participants. Individual differences were measured using a set of questionnaires: Big Five Inventory, Interpersonal Reactivity Index, and Barcelona Music Reward Questionnaire. Liking ratings for the stimuli were also collected. The regression analyses show that Empathic Concern is a significant predictor of the observed spontaneous movement. We also found a relationship between empathy and the participants’ self-reported tendency to move to music.
Zelechowska, Agata; Gonzalez Sanchez, Victor Evaristo & Jensenius, Alexander Refsum
(2020).
Standstill to the ‘beat’: Differences in involuntary movement responses to simple and complex rhythms.
AM '20: Proceedings of the 15th International Conference on Audio Mostly.
Association for Computing Machinery (ACM).
ISBN 978-1-4503-7563-4. pp. 107–113.
doi: 10.1145/3411109.3411139.
The links between music and human movement have been shown to provide insight into crucial aspects of human perception, cognition, and sensorimotor systems. In this study, we examined the influence of music on movement during standstill, aiming at further characterizing the correspondences between movement, music, and perception, by analyzing head sway fractality. Eighty-seven participants were asked to stand as still as possible for 500 seconds while being presented with alternating silence and audio stimuli. The audio stimuli were all rhythmic in nature, ranging from a metronome track to complex electronic dance music. The head position of each participant was captured with an optical motion capture system. Long-range correlations of head movement were estimated by detrended fluctuation analysis (DFA). Results agree with previous work on the movement-inducing effect of music, showing significantly greater head sway and lower head sway fractality during the music stimuli. In addition, patterns across stimuli suggest a two-way adaptation process to the effects of music, with musical stimuli influencing head sway while, at the same time, fractality modulated movement responses. Results indicate that fluctuations in head movement in both conditions exhibit long-range correlations, suggesting that the effects of music on head movement depended not only on the value of the most recent measured intervals, but also on the values of those intervals at distant times.
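The detrended fluctuation analysis (DFA) mentioned above can be reproduced with standard numerical tools. Below is a minimal NumPy sketch of the textbook DFA procedure applied to a one-dimensional position signal; the window sizes and detrending order are illustrative assumptions, not the parameters used in the study.

```python
import numpy as np

def dfa(signal, scales=None, order=1):
    """Detrended fluctuation analysis of a 1-D signal.

    Returns the scaling exponent alpha (slope of log F(n) versus log n).
    The window sizes (`scales`) and detrending order are illustrative defaults,
    not the parameters used in the original study.
    """
    x = np.asarray(signal, dtype=float)
    y = np.cumsum(x - x.mean())                      # integrated profile
    if scales is None:
        scales = np.unique(np.logspace(2, np.log10(len(x) // 4), 20).astype(int))
    fluct = []
    for n in scales:
        n_windows = len(y) // n
        f2 = []
        for i in range(n_windows):
            seg = y[i * n:(i + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, order), t)   # local polynomial trend
            f2.append(np.mean((seg - trend) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    alpha = np.polyfit(np.log(scales), np.log(fluct), 1)[0]
    return alpha

# alpha is roughly 0.5 for white noise and roughly 1.5 for Brownian motion.
rng = np.random.default_rng(1)
print(dfa(rng.normal(size=30000)))
print(dfa(np.cumsum(rng.normal(size=30000))))
```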
Xambo Sedo, Anna; Saue, Sigurd; Jensenius, Alexander Refsum; Støckert, Robin & Brandtsegg, Øyvind
(2019).
NIME Prototyping in Teams: A Participatory Approach to Teaching Physical Computing.
In Queiroz, Marcelo & Xambo Sedo, Anna (Eds.),
Proceedings of the International Conference on New Interfaces for Musical Expression.
Universidade Federal do Rio Grande do Sul.
ISSN 2220-4792. pp. 216–221.
Erdem, Cagri; Schia, Katja Henriksen & Jensenius, Alexander Refsum
(2019).
Vrengt: A Shared Body–Machine Instrument for Music–Dance Performance.
In Visi, Federico (Ed.),
Music Proceedings of the International Conference on New Interfaces for Musical Expression.
Universidade Federal do Rio Grande do Sul.
ISSN 2220-4792.
Erdem, Cagri; Schia, Katja Henriksen & Jensenius, Alexander Refsum
(2019).
Vrengt: A Shared Body–Machine Instrument for Music–Dance Performance.
In Queiroz, Marcelo & Xambo Sedo, Anna (Eds.),
Proceedings of the International Conference on New Interfaces for Musical Expression.
Universidade Federal do Rio Grande do Sul.
ISSN 2220-4792. pp. 477–482.
Virtuosity in music performance is often associated with fast, precise, and efficient sound-producing movements. The generation of such highly skilled movements involves complex joint and muscle control by the central nervous system, and depends on the ability to anticipate, segment, and coarticulate motor elements, all within the biomechanical constraints of the human body. When successful, such motor skill should lead to what we characterize as fluency in musical performance. Detecting typical features of fluency could be very useful for technology-enhanced learning systems, assisting and supporting students during their individual practice sessions by giving feedback and helping them to adopt sustainable movement patterns.
In this study, we propose to assess fluency in musical performance as the ability to smoothly and efficiently coordinate while accurately performing slow, transitionary, and rapid movements. To this end, the movements of three cello players and three drummers at different levels of skill were recorded with an optical motion capture system, while a wireless electromyography (EMG) system recorded the corresponding muscle activity from relevant landmarks. We analyze the kinematic and coarticulation characteristics of these recordings separately and then propose a combined model of fluency in musical performance predicting music sophistication. Results suggest movements from expert performers are characterized by consistently smooth strokes and scaling of muscle phasic coactivation. The explored model of fluency as a function of movement smoothness and coarticulation patterns was shown to be limited by the sample size but serves as a proof of concept. Results from this study show the potential of a technology-enhanced objective measure of fluency in musical performance, which could lead to improved practices for aspiring musicians, instructors, and researchers.
Xambo Sedo, Anna; Støckert, Robin; Jensenius, Alexander Refsum & Saue, Sigurd
(2019).
Facilitating Team-Based Programming Learning with Web Audio.
In Xambo Sedo, Anna; Martin, Sara R. & Roma, Gerard (Eds.),
Proceedings of the International Web Audio Conference.
NTNU.
ISSN 2663-5844. pp. 2–7.
Alarcón Diaz, Ximena & Jensenius, Alexander Refsum
(2019).
“Ellos no están entendiendo nada” [“They are not understanding anything”]: embodied remembering as complex narrative in a Telematic Sonic Improvisation.
In Søndergaard, Morten & Beloff, Laura (Eds.),
Proceedings of RE:SOUND 2019.
British Computer Society (BCS).
ISSN 1477-9358.
doi: 10.14236/ewic/resound19.32.
The relationships between human body motion and music have been the focus of several studies characterizing the correspondence between voluntary motion and various sound features. The study of involuntary movement to music, however, is still scarce. Insight into crucial aspects of music cognition, as well as characterization of the vestibular and sensorimotor systems could be largely improved through a description of the underlying links between music and involuntary movement. This study presents an analysis aimed at quantifying involuntary body motion of a small magnitude (micromotion) during standstill, as well as assessing the correspondences between such micromotion and different sound features of the musical stimuli: pulse clarity, amplitude, and spectral centroid. A total of 71 participants were asked to stand as still as possible for 6 min while being presented with alternating silence and music stimuli: Electronic Dance Music (EDM), Classical Indian music, and Norwegian fiddle music (Telespringar). The motion of each participant's head was captured with a marker-based, infrared optical system. Differences in instantaneous position data were computed for each participant and the resulting time series were analyzed through cross-correlation to evaluate the delay between motion and musical features. The mean quantity of motion (QoM) was found to be highest across participants during the EDM condition. This musical genre is based on a clear pulse and rhythmic pattern, and it was also shown that pulse clarity was the metric that had the most significant effect in induced vertical motion across conditions. Correspondences were also found between motion and both brightness and loudness, providing some evidence of anticipation and reaction to the music. Overall, the proposed analysis techniques provide quantitative data and metrics on the correspondences between micromotion and music, with the EDM stimulus producing the clearest music-induced motion patterns. The analysis and results from this study are compatible with embodied music cognition and sensorimotor synchronization theories, and provide further evidence of the movement inducing effects of groove-related music features and human response to sound stimuli. Further work with larger data sets, and a wider range of stimuli, is necessary to produce conclusive findings on the subject.
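The cross-correlation procedure described above, estimating the delay between micromotion and a sound feature, can be sketched as follows. This is an illustrative NumPy example with synthetic signals, not the project's analysis code; the 100 Hz rate matches the motion capture data, while the signals themselves are invented.

```python
import numpy as np

def xcorr_lag(motion, feature, fs):
    """Lag (in seconds) at which `feature` best correlates with `motion`.

    Positive lag means the motion lags behind (reacts to) the sound feature;
    negative lag suggests anticipation. Both signals share sampling rate `fs`.
    """
    m = (motion - motion.mean()) / motion.std()
    f = (feature - feature.mean()) / feature.std()
    corr = np.correlate(m, f, mode="full")
    lags = np.arange(-(len(f) - 1), len(m))
    return lags[np.argmax(corr)] / fs

# Synthetic demo: "motion" is a delayed, slightly noisy copy of a slowly varying
# loudness-like feature, both sampled at the 100 Hz motion capture frame rate.
fs = 100
rng = np.random.default_rng(0)
n = 60 * fs
feature = np.convolve(rng.standard_normal(n), np.ones(50) / 50, mode="same")
motion = np.roll(feature, int(0.25 * fs)) + 0.05 * rng.standard_normal(n)
print(xcorr_lag(motion, feature, fs))      # about 0.25, the simulated 250 ms reaction time
```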
Gonzalez Sanchez, Victor Evaristo; Martin, Charles Patrick; Zelechowska, Agata; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria Kristine Å & Jensenius, Alexander Refsum
(2018).
Bela-based augmented acoustic guitars for sonic microinteraction.
In Dahl, Luke; Bowman, Doug & Martin, Tom (Eds.),
Proceedings of the International Conference On New Interfaces For Musical Expression.
Virginia Tech.
ISSN 2220-4792. pp. 324–327.
Martin, Charles Patrick; Jensenius, Alexander Refsum & Tørresen, Jim
(2018).
Composing an ensemble standstill work for Myo and Bela.
In Dahl, Luke; Bowman, Doug & Martin, Tom (Eds.),
Proceedings of the International Conference On New Interfaces For Musical Expression.
Virginia Tech.
ISSN 2220-4792. pp. 196–197.
Jensenius, Alexander Refsum
(2018).
The Musical Gestures Toolbox for Matlab.
In Gómez, Emilia; Hu, Xiao; Humphrey, Eric & Benetos, Emmanouil (Eds.),
Proceedings of the 19th International Society for Music Information Retrieval Conference.
Institut de Recherche et Coordination Acoustique/Musique.
ISBN 978-2-9540351-2-3.
The Handbook of Systematic Musicology reflects the interdisciplinary nature of Systematic Musicology in seven sections. These sections follow the main topics in the field (Musical Acoustics, Signal Processing, Music Psychology, Psychophysics/Psychoacoustics, and Music Ethnology) while also taking recent research trends into consideration, such as Embodied Music Cognition and Media Applications. Other topics, such as Music Theory or Philosophy of Music, are incorporated in the respective sections.
Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Jensenius, Alexander Refsum
(2018).
Muscle activity response of the audience during an experimental music performance.
In Cunningham, Stuart (Ed.),
Proceedings of the Audio Mostly 2018 on Sound in Immersion and Emotion.
Association for Computing Machinery (ACM).
ISBN 978-1-4503-5373-1.
doi: 10.1145/3243274.3243278.
What is a musical instrument? What are the musical instruments of the future? This anthology presents thirty papers selected from the fifteen year long history of the International Conference on New Interfaces for Musical Expression (NIME). NIME is a leading music technology conference, and an important venue for researchers and artists to present and discuss their explorations of musical instruments and technologies.
Each of the papers is followed by commentaries written by the original authors and by leading experts. The volume covers important developments in the field, including the earliest reports of instruments like the reacTable, Overtone Violin, Pebblebox, and Plank. There are also numerous papers presenting new development platforms and technologies, as well as critical reflections, theoretical analyses and artistic experiences.
The anthology is intended for newcomers who want to get an overview of recent advances in music technology. The historical traces, meta-discussions and reflections will also be of interest for longtime NIME participants. The book thus serves both as a survey of influential past work and as a starting point for new and exciting future developments.
Jensenius, Alexander Refsum; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Bjerkestrand, Kari Anne Vadstensvik
(2017).
Exploring the Myo controller for sonic microinteraction.
In Erkut, Cumhur (Ed.),
Proceedings of the International Conference on New Interfaces for Musical Expression.
Aalborg University Copenhagen.
ISSN 2220-4792. pp. 442–445.
Jensenius, Alexander Refsum
(2017).
Sonic Microinteraction in "the Air".
In Lesaffre, Micheline; Maes, Pieter-Jan & Leman, Marc (Eds.),
The Routledge Companion to Embodied Music Interaction.
Routledge.
ISBN 9781138657403.
doi: 10.4324/9781315621364-47.
Jensenius, Alexander Refsum
(2017).
Exploring music-related micromotion.
In Wöllner, Clemens (Ed.),
Body, Sound and Space in Music and Beyond: Multimodal Explorations.
Routledge.
ISBN 9781472485403.
doi: 10.4324/9781315569628-3.
Jensenius, Alexander Refsum; Zelechowska, Agata & Gonzalez Sanchez, Victor Evaristo
(2017).
The Musical Influence on People's Micromotion when Standing Still in Groups.
In Lokki, Tapio; Pätynen, Jukka & Välimäki, Vesa (Eds.),
Proceedings of the 14th Sound and Music Computing Conference 2017.
Aalto University.
ISBN 978-952-60-3729-5. pp. 195–200.
Between January 25th and 27th, FAIRsFAIR partners and stakeholders will meet for a series of concluding meetings to deep-dive into the results of FAIRsFAIR. We’ll analyse the impact that we managed to have on the European Research Community. We'll go once more through the tools, guidelines and best practices that we have produced and delivered to researchers, data stewards, decision makers and funders towards a better, more structured approach towards FAIR data management. We’ll take the recommendations we produced and the lessons we learnt and leave them as a legacy for future activities to come.
What gets us onto the dance floor? Researchers have studied what it takes to make us move to music. Very particular rhythms set our bodies in motion, whether we want it or not. A professor with a background in music, physics, and mathematics explains.
This paper addresses environmental issues around NIME research and practice. We discuss the formulation of an environmental statement for the conference as well as the initiation of a NIME Eco Wiki containing information on environmental concerns related to the creation of new musical instruments. We outline a number of these concerns and, by systematically reviewing the proceedings of all previous NIME conferences, identify a general lack of reflection on the environmental impact of the research undertaken. Finally, we propose a framework for addressing the making, testing, using, and disposal of NIMEs in the hope that sustainability may become a central concern to researchers.
In this workshop, hosted by the three NIME environmental officers, participants will be introduced to the NIME Eco Wiki, a repository for addressing environmental and sustainability issues within the NIME community. During the workshop, the participants will discuss how practices on the communal as well as the individual level may become more sustainable and they will create new additions and ideas for the Wiki.
The paper presents the Musical Gestures Toolbox (MGT) for Python, a collection of modules targeted at researchers working with video recordings. The toolbox includes video visualization techniques such as creating motion videos, motion history images, and motiongrams. These visualizations allow for studying video recordings from different temporal and spatial perspectives. The toolbox also includes basic computer vision methods, and it is designed to integrate well with audio analysis toolboxes. The MGT was initially developed to analyze music-related body motion (of musicians, dancers, and perceivers) but is equally helpful for other disciplines working with video recordings of humans, such as linguistics, pedagogy, psychology, and medicine.
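The Musical Gestures Toolbox for Python provides these visualizations out of the box. To show the underlying idea, here is a self-contained sketch of a horizontal motiongram (frame differencing followed by row-wise averaging) using OpenCV and NumPy; it is a simplified illustration rather than the toolbox's own implementation, and the input filename is a placeholder.

```python
import cv2
import numpy as np

def horizontal_motiongram(path, threshold=10):
    """Compute a simple horizontal motiongram from a video file.

    Each output column summarizes one frame's motion image by averaging each
    image row, so the vertical distribution of motion can be followed over
    time. Simplified illustration of the idea behind the Musical Gestures
    Toolbox, not its actual implementation.
    """
    cap = cv2.VideoCapture(path)
    prev = None
    columns = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.int16)
        if prev is not None:
            motion = np.abs(gray - prev)              # frame difference ("motion image")
            motion[motion < threshold] = 0            # suppress sensor noise
            columns.append(motion.mean(axis=1))       # one value per image row
        prev = gray
    cap.release()
    return np.array(columns, dtype=np.float32).T      # rows = image rows, columns = time

if __name__ == "__main__":
    mgram = horizontal_motiongram("dancer.mp4")       # placeholder filename
    out = cv2.normalize(mgram, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    cv2.imwrite("motiongram.png", out)
```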
This dataset comprises head motion observations collected as part of an experiment in which a group of people were asked to stand still for 6 minutes, with the first 3 minutes in silence, followed by 3 minutes with music. Head motion was captured in units of mm at 100Hz using a Qualisys infra-red optical system. The experiment was carried out at the University of Oslo on March 8th 2012 from a total of 113 participants. Code to read and process these files is available. The paper corresponding to the work is Jensenius et al., "The Musical Influence on People's Micromotion when Standing Still in Groups", Proceedings of the 14th Sound and Music Computing Conference (2017).
The potential of Citizen Science is high on the agenda in the discussion on the future of academic research. The European Commission’s Communication “A new ERA for Research and Innovation”, published in September 2020, states that “[...] the engagement of citizens, local communities and civil society will [help] achieve greater social impact and increased trust in science.” Citizens can contribute in diverse ways, ranging from data collection over data analysis to co-designing projects, and thereby bring academic research and its outcomes closer to society.
However, Citizen Science also accentuates ethical and legal questions about ownership of the research process and outcomes, and poses challenges in terms of safeguarding research quality. Addressing these challenges and using the opportunities of Citizen Science will require universities to take the lead and consider the place of Citizen Science within their institutional strategies, as well as the support they offer to research staff.
As inclusive and transparent ways of doing science, Citizen Science and Open Science are becoming increasingly intertwined. Currently, Citizen Science is described by the European Commission as “both an aim and enabler of Open Science”.
This joint workshop will discuss themes around institutional support for Citizen Science and offers an opportunity to transfer and share knowledge. The aim is to exchange experiences, lessons learnt, and explore common challenges. To support Citizen Science, the online workshop will discuss tools, guidelines and good practices from Open Science experiences as well. Participating universities will have the opportunity to share expertise, coordinate efforts and exchange advice on services, tools and legal and ethical issues.
This project contains head movement data recorded from groups of participants asked to stand as still as possible and presented with a series of auditory stimuli. Data was collected in units of mm with a Qualisys motion capture system at 100Hz. Data was collected at the University of Oslo on March 12th 2015 from a total of 108 participants. Code to read and process these files has been made publicly available.
Jensenius, Alexander Refsum & Ingebrethsen, Christian
(2020).
Kan en datamaskin lage den neste megahiten?
[Radio].
NRK P2.
Jensenius, Alexander Refsum
(2020).
Video Visualization Strategies at RITMO.
Not moving to dance music is near impossible, according to new research.
Zelechowska, Agata
(2020).
Irresistible Movement: The Role of Musical Sound, Individual Differences and Listening Context in Movement Responses to Music.
Jensenius, Alexander Refsum & Svendsen, Njord Vegard
(2020).
Spelelystene.
[Newspaper].
Khrono.
For some, a working day without music is unthinkable.
An interactive art installation presented during the International Conference on New Interfaces for Musical Expression (NIME) 2020.
Zelechowska, Agata; Gonzalez Sanchez, Victor Evaristo & Jensenius, Alexander Refsum
(2020).
Standstill to the ‘beat’: Differences in involuntary movement responses to simple and complex rhythms.
Previous studies have shown that movement-inducing properties of music largely depend on the rhythmic complexity of the stimuli. However, little is known about how simple isochronous beat patterns differ from more complex rhythmic structures in their effect on body movement. In this paper we study spontaneous movement of 98 participants instructed to stand as still as possible for 7 minutes while listening to silence and randomised sound excerpts: isochronous drumbeats and complex drum patterns, each at three different tempi (90, 120, 140 BPM). The participants’ head movement was recorded with an optical motion capture system. We found that on average participants moved more during the sound stimuli than in silence, which confirms the results from our previous studies. Moreover, the stimulus with complex drum patterns elicited more movement when compared to the isochronous drum beats. Across different tempi, the participants moved most at 120 BPM for the average of both types of stimuli. For the isochronous drumbeats, however, their movement was highest at 140 BPM. These results can contribute to our understanding of the interplay between rhythmic complexity, tempo and music-induced movement.
Malec, Monika & Zelechowska, Agata
(2020).
Pocztówka dźwiękowa.
[Radio].
Polskie Radio Lublin.
Now it has been proven: there is an inner dancer in all of us. New research also shows that the music that fails to get your foot tapping is Norwegian.
Jensenius, Alexander Refsum
(2019).
Sound actions: An embodied approach to a digital organology.
What is an instrument in our increasingly electrified world? In this talk I will present a set of theoretical building blocks from my forthcoming book on "musicking in an electronic world". At the core of the argument is the observation that the introduction of new music technologies has led to an increased separation between action and sound in musical performance. This has happened gradually, with pianos and organs being important early examples of instruments that introduced mechanical components between the performer and resonating objects. Today's network-based instruments represent an extreme case of a spatiotemporal dislocation between action and sound. They challenge our ideas of what an instrument can be, who can perform on them, and how they should be analyzed. In the lecture I will explain how we can use the concepts of action-sound couplings and mappings to structure our thinking about such instruments. This will be used at the heart of a new organology that embraces the qualities of both acoustic and electroacoustic instruments.
Gonzalez Sanchez, Victor Evaristo
(2019).
MICRO: Human Bodily Micromotion in Music Perception and Interaction.
This talk will highlight links between music and human movement, aiming at providing insight into crucial aspects of human perception, cognition, and sensorimotor systems. It will analyze responses to a wide range of music and sound features, exploiting concepts such as the groove, embodied music cognition, and entrainment. Victor will be glad to discuss potential implications of movement-analysis research for embodiment perspectives on technologically enabled conceptual learning.
The influence of acoustic stimuli and feedback in sport has been explored as a means of optimizing technique, in particular during training. Interactive and adaptive acoustic systems have been evaluated for rowing, with results showing a significant increase in boat velocity. However, assessment of the effects of acoustic feedback and pacing on the technical aspects of rowing is still scarce. Previous studies on the smoothness of the stroke force profile have shown that smoothness metrics can qualitatively reflect movement coordination. In this study, we quantify and compare hand movement smoothness from rowers performing under three acoustic conditions: silence, verbal instructions, and acoustic pacing.
Alarcón Diaz, Ximena & Jensenius, Alexander Refsum
(2019).
"Ellos no están entendiendo nada" ("They are not understanding anything"): Listening to Embodied Memories of Colombian Migrant Women Reflecting on Conflict and Migration.
Exploring the role of the body as an interface that keeps memory of place, the INTIMAL physical-virtual “embodied” system integrates body movement, voice, and an oral archive as an artistic platform for relational listening (Alarcón, 2019), using networking technologies for telematic sonic improvisatory performances in the context of geographical migration. INTIMAL has been informed by a case study with nine Colombian migrant women in Europe, listening to their migrations and to an oral archive with testimonies of conflict and migration by other Colombian migrant women. The first two interfaces created for the system, MEMENTO (a spoken word retrieval system) and RESPIRO (for transmission and sonification of breathing data), have been tested by the research participants in a telematic sonic improvisatory public performance between the cities of Oslo, Barcelona, and London. In the performance, proposed as a shared dream, a “complex narrative” (Grishakova & Poulaki, 2019) emerged for both the improvisers and the audiences. In this paper, we describe the conditions of the narrative environment and the embodied expressions that emerged, including body movement, voice, spoken word, and breathing, establishing connections between gendered migration and Colombian conflict. We reflect on how distributed improvisatory embodied expression and relational listening through technological mediations aid the process of collective remembering (Wertsch, 2001), in a complex context of conflict and migration.
In this episode, we talk about Music Research, and how it is to practice open research within this field.
Our guest is Alexander Jensenius, Associate Professor at the Department of Musicology (IMV) and the RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion at the University of Oslo. He is also behind MusicLab, an event-based project where data is collected during a musical performance and analyzed on the fly.
Becker, Artur; Herrebrøden, Henrik; Gonzalez Sanchez, Victor Evaristo; Nymoen, Kristian; Dal Sasso Freitas, Carla Maria; Tørresen, Jim et al.
(2019).
Functional Data Analysis of Rowing Technique Using Motion Capture Data.
We present an approach to analyzing the motion capture data of rowers using bivariate functional principal component analysis (bfPCA). The method has been applied on data from six elite rowers rowing on an ergometer. The analyses of the upper and lower body coordination during the rowing cycle revealed significant differences between the rowers, even though the data was normalized to account for differences in body dimensions. We make an argument for the use of bfPCA and other functional data analysis methods for the quantitative evaluation and description of technique in sports.
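The bfPCA analysis summarized above operates on time-normalized rowing cycles. As a rough stand-in for illustration only, the same idea can be sketched with ordinary PCA on resampled cycle trajectories (not the bivariate functional treatment the authors used); all data shapes and values below are invented.

```python
import numpy as np
from sklearn.decomposition import PCA

def normalize_cycle(trajectory, n_points=101):
    """Resample one rowing cycle (frames x coordinates) to a fixed number of time points."""
    frames = np.linspace(0, 1, len(trajectory))
    grid = np.linspace(0, 1, n_points)
    return np.column_stack([np.interp(grid, frames, trajectory[:, d])
                            for d in range(trajectory.shape[1])])

# Toy data: 60 cycles of a single marker's horizontal and vertical position (invented shapes).
rng = np.random.default_rng(0)
cycles = []
for _ in range(60):
    n = rng.integers(180, 220)                      # cycles differ slightly in duration
    t = np.linspace(0, 2 * np.pi, n)
    xy = np.column_stack([np.cos(t), np.sin(t)]) + 0.05 * rng.standard_normal((n, 2))
    cycles.append(normalize_cycle(xy))

# Flatten each time-normalized cycle into one observation and run PCA.
X = np.array([c.ravel() for c in cycles])           # shape: (cycles, n_points * 2)
pca = PCA(n_components=3).fit(X)
print(pca.explained_variance_ratio_)                # cycle-to-cycle variation captured per component
```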
Are you the type who simply has to dance when you hear a certain kind of music? Paul Arvid Jørgensen has met a researcher who studies whether we humans are born with an urge to dance, and whether one type of music leads to more movement than another.
I will present the spatiotemporal matrix, a system for categorizing human actions into different spatial and temporal levels: micro, meso, macro. Most regular human actions would be categorized as meso-meso, that is, medium-sized motion within a timespan that fits our short-term memory. Exploring combinations of micro and macro levels in both space and time is challenging, but is also conceptually, practically and artistically interesting. I will show an example of this from the music-dance performance Sverm, and explain how the matrix was informed by my research into the effect of music on the micromotion observed when people try to stand still.
The AAAI project holds a final workshop showcasing instruments developed and techniques explored. The workshop also consists of a performance with new pieces for augmented guitars, violins, double basses, ukuleles, as well as six self-playing guitars.
Jensenius, Alexander Refsum; Erdem, Cagri; Zelechowska, Agata; Lan, Qichao; Fuhrer, Julian Peter & Gonzalez Sanchez, Victor Evaristo
(2019).
Entraining Guitars.
Is it possible to do experimental music research completely openly? And what can we gain by opening up the research process from beginning to end? In the talk I will present MusicLab, an open research project at the University of Oslo. The aim is to explore new methods for conducting research, research communication, and education. Each MusicLab event is organized around a public music performance, during which we collect data from both musicians and audience members. Here we explore different types of sensing systems that work in real-world contexts, such as breathing, heartbeat, muscle tension, or motion. The events also contain an edutainment element through panel discussions as well as "data jockeying" in the form of live data analysis. The collected data is made publicly available, and forms the basis for further analysis and publications after the event. Opening up the research process is conceptually, practically, and technologically challenging for everyone involved. The benefit is that it has helped us solve a number of issues when it comes to GDPR and copyright. It has also pushed our research in directions that we previously had never thought about, and helped us communicate this to new users.
In this installation we explore how six self-playing guitars can entrain to each other. When they are left alone they will revert to playing a common pulse. As soon as they sense people in their surroundings they will start entraining to other pulses. The result is a fascinating exploration of a basic physical and cognitive concept, and the musically interesting patterns that emerge on the border between order and chaos.
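The installation text does not specify how the guitars entrain to each other, but mutual entrainment is commonly modeled with coupled phase oscillators. The sketch below is a minimal Kuramoto-style illustration of six oscillators with slightly different preferred tempi locking onto a common pulse; all parameters are assumptions, and this is not the installation's actual code.

```python
import numpy as np

# Six "guitars" as coupled phase oscillators (Kuramoto model), an illustrative
# model of mutual entrainment rather than the installation's actual algorithm.
rng = np.random.default_rng(42)
n_guitars, dt, coupling = 6, 0.01, 2.0
freqs = 2 * np.pi * rng.normal(2.0, 0.1, n_guitars)   # preferred pulse around 2 Hz (120 BPM)
phases = rng.uniform(0, 2 * np.pi, n_guitars)

for _ in range(3000):                                  # simulate 30 seconds
    diffs = np.sin(phases[None, :] - phases[:, None])  # diffs[i, j] = sin(phase_j - phase_i)
    phases += dt * (freqs + coupling * diffs.mean(axis=1))

# Order parameter near 1 means the six pulses have locked onto a common tempo.
order = np.abs(np.mean(np.exp(1j * phases)))
print(round(order, 3))
```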
In this paper, we present a course of audio programming using web audio technologies addressed to an interdisciplinary group of master students who are mostly beginners in programming. This course is held in two connected university campuses through a portal space and the students are expected to work in cross-campus teams. The workshop promotes both individual and group work and is based on ideas from science, technology, engineering, arts and mathematics (STEAM), team-based learning and project-based learning. We show the outcomes of this course, discuss the students’ feedback and reflect on the results. We found that it is important to provide individual vs. group work, to use the same code editor for consistent follow-up and to be able to share the screen to solve individual questions. Other aspects inherent to the master (intensity of the courses, coding in a research-oriented program) and to prior knowledge (web technologies) should be reconsidered. We conclude with a wider reflection on the challenges and potentials of using web audio as a programming environment for beginners in STEAM and distance-learning courses.
MusicLab is a collaboration between the University Library (UB) and RITMO, and a pilot for open research at UiO. The concept combines live music, live research, and science communication. We have achieved a lot with MusicLab, but pioneering work like this also raises new kinds of challenges. How far can openness be taken?
Jensenius, Alexander Refsum; Duch, Michael Francis; Langdalen, Jørgen; Åse, Tone; Larsen, Edvine & Østern, Tone Pernille
(2018).
Kunsten å forske.
What is artistic research? Why is it called artistic development work and not artistic research, when that is what it is called in other countries? Does artistic research differ from all other kinds of research, and if so, why?
This article describes the design and construction of a collection of digitally-controlled augmented acoustic guitars, and the use of these guitars in the installation Sverm-Resonans. The installation was built around the idea of exploring 'inverse' sonic microinteraction, that is, controlling sounds by the micromotion observed when attempting to stand still. It consisted of six acoustic guitars, each equipped with a Bela embedded computer for sound processing (in Pure Data), an infrared distance sensor to detect the presence of users, and an actuator attached to the guitar body to produce sound. With an attached battery pack, the result was a set of completely autonomous instruments that were easy to hang in a gallery space. The installation encouraged explorations on the boundary between the tactile and the kinesthetic, the body and the mind, and between motion and sound. The use of guitars, albeit with an untraditional 'performance' technique, made the experience both familiar and unfamiliar at the same time. Many users reported heightened sensations of stillness, sound, and vibration, and that the 'inverse' control of the instrument was both challenging and pleasant.
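The 'inverse' mapping described above (more stillness gives more sound) can be illustrated in a few lines. The sketch below is a hypothetical Python stand-in for the Pure Data patch running on the Bela boards: the simulated sensor, the 5 cm threshold, and the one-second window are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def read_distance_sensor():
    """Simulated stand-in for the installation's infrared distance sensor (metres).

    A real deployment would read the Bela's analog input instead; this simply
    returns a visitor standing about 0.5 m away with a few millimetres of sway.
    """
    return 0.5 + rng.normal(0, 0.002)

def inverse_micromotion_amplitude(history, full_level=1.0):
    """Map recent motion to amplitude: the stiller the visitor, the louder the sound."""
    motion = max(history) - min(history)        # crude motion estimate over the window (metres)
    stillness = max(0.0, 1.0 - motion / 0.05)   # 5 cm of motion or more silences the sound
    return full_level * stillness

history = []
for _ in range(500):                            # roughly 5 seconds at 100 readings per second
    history.append(read_distance_sensor())
    history = history[-100:]                    # keep about one second of readings
    amplitude = inverse_micromotion_amplitude(history)
print(round(amplitude, 2))                      # high when the visitor stands nearly still
```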
This paper describes the process of developing a standstill performance work using the Myo gesture control armband and the Bela embedded computing platform. The combination of Myo and Bela allows a portable and extensible version of the standstill performance concept while introducing muscle tension as an additional control parameter. We describe the technical details of our setup and introduce Myo-to-Bela and Myo-to-OSC software bridges that assist with prototyping compositions using the Myo controller.
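A Myo-to-OSC bridge forwards armband data as OSC messages over the network. Below is a hedged sketch of such a sender using the python-osc package; the address patterns, IP, and port are assumptions for illustration and not the published bridge's specification.

```python
# pip install python-osc
from pythonosc.udp_client import SimpleUDPClient

# Illustrative sender: forward (dummy) Myo EMG and orientation data as OSC messages
# to a sound engine, for instance one running on a Bela board. The IP address,
# port, and address patterns are assumptions, not the published bridge's settings.
client = SimpleUDPClient("192.168.1.50", 9000)

def send_myo_frame(emg, quaternion):
    """Send one frame of Myo data: eight EMG channels and an orientation quaternion."""
    client.send_message("/myo/emg", [float(v) for v in emg])
    client.send_message("/myo/orientation", [float(v) for v in quaternion])

# Example frame with dummy values.
send_myo_frame(emg=[0.1] * 8, quaternion=[1.0, 0.0, 0.0, 0.0])
```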
Jensenius, Alexander Refsum; Martin, Charles Patrick; Bjerkestrand, Kari Anne Vadstensvik & Johnson, Victoria
(2018).
Stillness under Tension.
Martin, Charles Patrick; Xambó, Anna; Visi, Federico; Morreale, Fabio & Jensenius, Alexander Refsum
(2018).
Stillness under Tension.
Stillness Under Tension is an ensemble standstill work for Myo gesture control armband and Bela embedded music platform. Humans are incapable of standing completely still due to breathing and other involuntary micromotions. This work explores the expressive space of standing still through an inverse action-sound mapping: less movement leads to more sound. Four performers stand as still as possible on stage, each wearing a Myo armband connected to a Bela embedded sound processing platform. The Myo is used to measure the performers' movement, and the muscle activity in their forearm, which they can use, both voluntarily and involuntarily, to control a synthesised sound world. Each performer uses one Myo and Bela in a musical space defined by their physical position and posture while standing still.
Why do we feel like moving when we hear music? The winners of UiO's innovation prize, Anne Danielsen and Alexander Refsum Jensenius, will hopefully find the answer as they delve into the mysteries of humans and rhythm.
Why do we experience rhythm and time the way we do? "RITMO Senter for tverrfaglig forskning på rytme, tid og bevegelse" (RITMO Centre for Interdisciplinary Studies in Rhythm, Time and Motion) aims to give us the answer.
Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Jensenius, Francesca R.
(2018).
Sverm-Pluck.
Sverm-Resonans is an interactive installation developed for the Ultima Contemporary Music Festival in Oslo, featuring six suspended guitars, each fitted with an actuator, a distance sensor, and a Bela.
It is impossible to stand still. The body is alive and moves all the time. Even when you try to stand still, you never quite manage. So how still do we actually stand? And how does music affect us when we stand still?
A music researcher has created a computer program to measure the movements of dancers. Now medical researchers are using his tool to detect whether infants have cerebral palsy.
Fuhrer, Julian Peter; Glette, Kyrre & Jensenius, Alexander Refsum
(2018).
Interactive Animation of the RITMO Logo.
In this project, the RITMO logo is presented as an interactive animation. It moves in accordance with the frequency bands of an audio input stream; in other words, the RITMO logo interacts with the rhythmic flow of the music.
How can technology and music become something really exciting? And what is unique about the way we move? Today's podcast guest is Alexander Jensenius, Associate Professor of music technology.
In episode #89 of the podcast series 'De som bygger det nye Norge med Silvija Seres', Jensenius talks about how interdisciplinarity across unusual subjects and disciplines opens up new ways of thinking.
He also talks about how he uses motion capture to study people in motion, what rhythm means to humans, and his best advice for young researchers.
How much do people move when they try to stand still? Does listening to music influence your micromotion? Can we use micromotion in human-computer interaction? In this presentation, music technologist Alexander Refsum Jensenius (RITMO, UiO) will share some results from his research on human cognition on the boundaries between the conscious and the unconscious, the voluntary and the involuntary.
Jensenius, Alexander Refsum; Adde, Lars & Flydal, Lars O.
(2018).
Forskningsmøte mellom musikk og medisin.
[Newspaper].
Vårt Land.
Interdisciplinary benefits: research on the body's rhythms and movements has produced new diagnostic tools that allow cerebral palsy to be detected earlier.
Martin, Charles Patrick; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata; Erdem, Cagri & Jensenius, Alexander Refsum
(2018).
Stillness under Tension.
Martin, Charles Patrick; Lesteberg, Mari; Jawad, Karolina; Aandahl, Eigil; Xambó, Anna & Jensenius, Alexander Refsum
(2018).
Stillness under Tension.
Jensenius, Alexander Refsum
(2018).
The Musical Gestures Toolbox for Matlab.
The Musical Gestures Toolbox for Matlab (MGT) aims at assisting music researchers with importing, preprocessing, analyzing, and visualizing video, audio, and motion capture data in a coherent manner within Matlab.
This exploratory study investigates muscular activity characteristics of a group of audience members during an experimental music performance. The study was designed to be as ecologically valid as possible, collecting data in a concert venue and making use of low-invasive measurement techniques. Muscle activity (EMG) from the forearms of 8 participants revealed that sitting in a group could be an indication of a level of group engagement, while comparatively greater muscular activity from a participant sitting at close distance to the stage suggests performance-induced bodily responses. The self-reported measures rendered little evidence supporting the links between muscular activity and live music exposure, although a larger sample size and a wider range of music styles need to be included in future studies to provide conclusive results.
Bøhn, Einar Duenger; Smajdor, Anna Colette & Jensenius, Alexander Refsum
(2018).
Humaniora og teknologi.
Danielsen, Anne & Jensenius, Alexander Refsum
(2018).
Rytme i tid og rom.
Zelechowska, Agata; Gonzalez Sanchez, Victor Evaristo & Jensenius, Alexander Refsum
(2018).
How music moves us? Studying human body micromotion in music perception.
Music has the power to influence not only our thoughts and emotions, but also various physiological processes in our bodies. Furthermore, it often encourages physical movement of the listener. While there are numerous studies describing spontaneous psychophysiological responses to music that are linked with emotions, spontaneous body movement to music has become a topic of exploration relatively recently. Mostly, it has been studied in the context of free dance (Burger et al., 2013) or synchronization to musical rhythm while performing repetitive movements such as walking (Styns et al., 2007). But what can we observe if the participants are just standing still? In our project “MICRO - Human Bodily Micromotion in Music Perception and Interaction” we focus on movements so small that they can go unnoticed by both observer and performer, and that can happen involuntarily. This is what we call “micromotion” of the human body. To see how these small movements are affected by music, we develop different experiments using mainly motion capture technology, but also physiological measures such as electromyography (EMG). In this presentation I would like to describe some of our research methods, findings and plans.
In one of the experimental paradigms, disguised as the “Norwegian Championship of Standstill”, we invite groups of participants to the laboratory and ask them to stand as still as possible while we present them with segments of selected music or silence. The head motion of each participant is captured using an infrared optical system. In 2012, 91 subjects stood on the floor for 3 minutes in silence and 3 minutes listening to music of increasing level of rhythmicality and energy (Jensenius et al., 2017). In 2017, 71 participants listened to 6 minutes consisting of segments of silence alternating with electronic dance music (EDM), classical Indian music or Norwegian fiddle music. In both studies we observed a higher mean quantity of motion (QoM) of the participants in the music condition compared to the silence condition, and the effect was driven mostly by EDM. We also observed correlations between QoM and participants’ age, height and standing strategy (locked knees), although these results are mixed between the two studies. The future goal is to look more closely into specific features in music that correspond with observed movement, to search for signs of rhythmic entrainment, and to see what demographic and psychological factors might contribute to interpersonal differences in music-induced body micromotion.
References:
Burger, B., Thompson, M. R., Luck, G., Saarikallio, S., & Toiviainen, P. (2013). Influences of rhythm-and timbre-related musical features on characteristics of music-induced movement. Frontiers in psychology, 4, 183.
Jensenius, A. R., Zelechowska, A., & Gonzalez Sanchez, V. E. (2017). The Musical Influence on People's Micromotion when Standing Still in Groups. In Proceedings of the SMC Conferences (pp. 195-200). Aalto University.
Styns, F., van Noorden, L., Moelants, D., & Leman, M. (2007). Walking on music. Human movement science, 26(5), 769-785.
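The quantity of motion (QoM) measure used in these standstill studies can be computed directly from the head-marker position data (millimetres at 100 Hz). A minimal NumPy sketch follows; the toy input is invented, and real analyses would load the released data files instead.

```python
import numpy as np

def quantity_of_motion(positions, fs=100):
    """Mean quantity of motion: average speed of the head marker in mm/s.

    `positions` is an (n_frames, 3) array of x, y, z coordinates in millimetres,
    sampled at `fs` Hz (100 Hz Qualisys data in the standstill studies).
    """
    step_lengths = np.linalg.norm(np.diff(positions, axis=0), axis=1)  # mm moved per frame
    return step_lengths.mean() * fs                                    # mm per second

# Toy example: a participant swaying a few millimetres at a slow rate for one minute.
fs = 100
t = np.arange(0, 60, 1 / fs)
head = np.column_stack([3 * np.sin(2 * np.pi * 0.3 * t),
                        2 * np.sin(2 * np.pi * 0.25 * t),
                        np.full_like(t, 1700.0)])      # roughly constant height (mm)
print(round(quantity_of_motion(head, fs), 1))          # a few mm/s, typical of standstill
```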
Can we better understand the complex process of human music perception through a standardisation of the quantification of involuntary correspondences between motion and music? Exploring the frequency-domain and time-domain links between sound and motion signals in a systematic manner might encourage international and interdisciplinary collaboration, boosting developments in the field and knowledge transferability.
Introduction
Postural stability has been the focus of a number of studies on fall prevention and sports, with an emphasis on walking dynamics [1]. Fewer studies have aimed at understanding the influence of sound stimuli on standing postural sway [2].
Although the vestibular system plays a fundamental role in the control of postural stability, it has also been shown to be key in embodied cognition processes [3]. It is in part through the vestibular system that music activates motor areas in the brain to induce movement, while body movement enhances the cognitive processing of sound and music [3].
This study explored the influence of music on postural control by measuring synchronization between body center of mass (COM) sway and the music.
Methods
7 women (32 ± 4.39 years, 1.73 ± 0.04 m, mean ± SD), and 5 men (29.67 ± 4.63 years, 1.81 ± 0.04 m) participated in the study. Participants were asked to stand still for 6 minutes as they were presented with alternating segments of silence and music. COM movements were measured from the position of a passive marker placed in the midline of the sacrum, recorded using an infrared motion capture system. Radial and vertical COM movements were cross-correlated with the pulse clarity, RMS, and spectral centroid of the stimuli.
Results
A paired-samples t-test revealed that the differences in COM radial and vertical sway between the silence and music conditions were significant at the 0.05 level.
A repeated-measures ANOVA showed a significant effect of the stimuli on COM sway (p < 0.05).
The effect of the stimuli on the lag of maximum cross-correlation (delay) between COM radial sway and RMS was shown to be significant (p < 0.05). Differences in delay between pulse clarity and COM vertical sway were significant between stimuli (p < 0.05).
Discussion
Results suggest that the effect of RMS on music-induced postural sway might be predominant in the radial plane, with anticipatory behavior observed for stimuli with low RMS.
Vertical sway correspondence patterns suggest anticipatory vertical motion to the spectral centroid of the music.
A more robust understanding of a range of music features and their links with induced movement could lead to insight into the role of the vestibular and sensory systems in balance control.
References
[1] Cimolin, V., & Galli, M. (2014). Summary measures for clinical gait analysis: A literature review. Gait & Posture, 39, 1005-1010.
[2] Ross, J. M., Warlaumont, A. S., Abney, D. H., Rigoli, L. M., & Balasubramaniam, R. (2016). Influence of musical groove on postural sway. Journal of Experimental Psychology: Human Perception and Performance. Advance online publication.
[3] Todd, N. P. (1999). Motion in music: A neurobiological perspective. Music Perception: An Interdisciplinary Journal, 17, 115-126.
Jensenius, Alexander Refsum
(2018).
Ny musikkforskning ved RITMO.
Jensenius, Alexander Refsum
(2018).
Fremtiden er analog - perspektiver på humaniora og teknologi.
The Musical Gestures Toolbox (MGT) is a Matlab toolbox for analysing music-related body motion, using sets of audio, video and motion capture data as source material.
Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Donnarumma, Marco; Brean, Are & Bruusgaard, Jo C.
(2017).
Panel: Biophysical Music.
Vis sammendrag
"Biophysical Music" is volume 1 of the new concept "MusicLab", a series of events exploring the science of music from different perspectives. The idea is to mix research and edutainment through hands-on workshops, intellectual warm-ups, performances and data jockeying.
Jensenius, Alexander Refsum; Martin, Charles Patrick; Bjerkestrand, Kari Anne Vadstensvik & Johnson, Victoria
(2017).
Sverm-Muscle.
Jensenius, Alexander Refsum; Martin, Charles Patrick; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Johnson, Victoria
(2017).
Sverm-Resonans.
Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick
(2017).
Sverm-Puls.
Vis sammendrag
An installation that gives you access to heightened sensations of stillness, sound and vibration.
Approach one of the guitars. Place yourself in front of it and stand still. Feel free to put your hands on the body of the instrument. Listen to the sounds appearing from the instrument. As opposed to a traditional instrument, these guitars are “played” by (you) trying to stand still. The living body interacts with an electronic sound system played through the acoustic instrument. In this way, Sverm-Puls explores the meeting points between the tactile and the kinesthetic, the body and the mind, and between motion and sound.
Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick
(2017).
Sverm-Resonans.
Vis sammendrag
An installation that gives you access to stillness, sound and vibration.
Stand still. Listen. Find the sound. Move. Stand still. Listen. Hear the tension. Notice your own movements. Relax. Stand even more still. Listen more deeply. Sense the boundary between the known and the unknown, the controllable and the uncontrollable. How does the body meet the sound? How does the sound meet the body? What do you hear?
Walk up to one of the guitars. Place yourself in front of it and sense your own stillness. If you like, place your hands on the instrument. Try closing your eyes. Open your senses to the sound vibrations you feel and hear. Stand for as long as you like and notice how the sound develops, along with your inner experiences, images and associations. Unlike a traditional instrument, these guitars are "played" by you standing still. The living body interacts with an electronic sound system played through an acoustic instrument. Sverm-Resonans explores the meeting points between the tactile and the kinesthetic, the body and the mind, and between motion and sound.
Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick
(2017).
Sverm-Resonans.
How still can you actually stand while listening to music? That is what Alexander Refsum Jensenius is trying to find out.
Jensenius, Alexander Refsum
(2017).
Musikk og bevegelses-laben ved Institutt for musikkvitenskap.
Jensenius, Alexander Refsum
(2017).
Status musikkteknologi ved UiO: forskning og undervisning.
Jensenius, Alexander Refsum
(2017).
Humanities and technology - with a musicological twist.
Jensenius, Alexander Refsum
(2017).
The importance of "nothing": studying human music-related micromotion.
Vis sammendrag
This presentation will focus on my research on human micromotion in musical contexts. My scientific research has focused on understanding more about the phenomenon of human standstill and how music influences our micromotion when standing still. My artistic research has focused on the exploration of micromotion in music and dance performance, and particularly how it is possible to set up systems for sonic microinteraction. My two separate "tracks" of research, the scientific and artistic, have positively reinforced each other, shedding light on a level of musical expressivity on the boundary between the conscious and the unconscious, the voluntary and the involuntary.
Jensenius, Alexander Refsum
(2017).
Micro, Meso, Macro: Music-related body motion at different spatiotemporal levels.
Vis sammendrag
Performance of acoustic instruments often happens at a spatiotemporal micro-level. Violin performance, for example, is based on extreme control of the spatial placement of the left-hand fingering and the right-hand bow strokes. Even though there are exceptions, many digital musical instruments (DMIs) are based on meso- or macro-level control, that is, fairly large and slow control actions compared to acoustic instruments. In this talk I will present a theoretical framework for sound-producing actions and a related organological model. This will be exemplified with some of my empirical results on music-induced dancing, "air instrument" performance and sonic microinteraction.
This paper explores sonic microinteraction using muscle sensing through the Myo armband. The first part presents results from a small series of experiments aimed at finding baseline micromotion and muscle activation data of people at rest or performing short/small actions. The second part presents the prototype instrument MicroMyo, built around the concept of making sound with little motion. The instrument plays with the convention that inputting more energy into an instrument results in more sound. MicroMyo, on the other hand, is built so that the less you move, the more it sounds. Our user study shows that while such an "inverse instrument" may seem puzzling at first, it also opens a space for interesting musical interactions.
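A minimal sketch of the underlying "inverse" mapping idea is shown below; it is not the actual MicroMyo implementation, and the smoothing window, normalisation, and variable names are assumptions made for illustration. The resulting amplitude value could then drive any synthesis engine.

    % Hypothetical sketch: the quieter the motion, the louder the sound.
    motion = abs(randn(1000, 1));                  % placeholder: motion magnitude stream
    motion_smoothed = movmean(motion, 50);         % smooth to avoid jitter
    ref = max(motion_smoothed);                    % reference level for normalisation
    amp = 1 - min(motion_smoothed ./ ref, 1);      % less motion -> higher amplitude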
Music making has moved into the cloud. In this lecture-demonstration, Alexander Refsum Jensenius will show various tools for online music making, ranging from simple sound makers to advanced music programming. He will talk about the possibilities and limitations of various technologies, and propose a framework for understanding how online music making will shape the future of music.
Jensenius, Alexander Refsum & Flydal, Lars O.
(2017).
Leter etter magien i musikken.
[Avis].
Vårt Land.
Vis sammendrag
Alexander Refsum Jensenius is searching for the magic in music. His dream is to draw out the body's own music.
Jensenius will talk about the unlikely story of how his basic music research has led to medical innovation. In 2005 he developed a method for visualizing the movements of dancers – motiongrams – with a set of accompanying software tools. Now this method is at the core of CIMA – Computer-based Infant Movement Assessment – a clinical system currently being tested in hospitals around the world, with the aim of detecting prematurely born infants' risk of developing cerebral palsy.
Jensenius, Alexander Refsum & Kelkar, Tejaswinee
(2017).
Improvisation for Linnstrument, voice and Mogees.
In order to understand how we perceive music, we should always consider that this process starts with our bodies being exposed to sound. Thus, the cognitive processing of sound cannot be treated as separate from the functioning of the body. During the last decades there has been a growing focus on embodied music cognition, and the role of the human body in both perception and production of music has been widely studied. However, not much research has been devoted to what we might call "micromotion" of the human body in a musical context – the tiniest, often involuntary and unconscious movements that occur during music listening. The human body is never completely still, as there are many physiological processes that induce small-scale movement. The question, then, is whether such micromotion is altered when we are exposed to music. Here I would like to present my design of a set of small experiments that will shed some light on how human micromotion is influenced by musical sound. Methodologically, I will use a combination of motion capture and portable eye-tracking glasses. Each experiment will be designed to address a different question, such as the differences between listening to music via headphones and via loudspeakers, or the importance of the low-frequency region (bass sound) in inducing body movement. The aim of these small and exploratory studies is to understand more about the effects of musical sound on human micromotion in particular and on music cognition in general.
Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick
(2017).
Sverm-resonans.
Vis sammendrag
An installation that gives you access to heightened sensations of stillness, sound and vibration. Unlike traditional instruments these guitars are “played” by (you) trying to stand still. The living body interacts with an electronic sound system played through the acoustic instrument. In this way, Sverm-Resonans explores the meeting points between the tactile and the kinesthetic, the body and the mind, and between motion and sound.
The presentation will reflect on the installation piece "Sverm-resonans".
As opposed to a traditional instrument, these guitars are “played” by trying to stand still. The living body interacts with an electronic sound system played through the acoustic instrument. In this way, Sverm-Resonans explores the meeting points between the tactile and the kinesthetic, the body and the mind, and between motion and sound.
Jensenius, Alexander Refsum; Bjerkestrand, Kari Anne Vadstensvik; Johnson, Victoria; Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Martin, Charles Patrick
(2017).
Sverm-Resonans.
We live in exciting times, with ever new varieties of both acoustic and electronic instruments. These rarely fit into the traditional organological frameworks, which means there is a need for a more systematic discussion of how to classify both instruments (in an extended sense) and their playing techniques. In this presentation I will explain the main elements of a new organology that I am developing, based on what I call "action–sound couplings".
Gonzalez Sanchez, Victor Evaristo; Zelechowska, Agata & Jensenius, Alexander Refsum
(2017).
The influence of music in people's standstill.
Why do you move to music? Why do you quietly tap the beat at a jazz concert, but dance wildly and uninhibitedly in a dark and sweaty club? And how still do you actually sit at a classical concert? Music researcher Alexander Refsum Jensenius will answer these questions and talk about his research on air guitarists, disco dancers, pianists and conductors.
"People who call themselves tone-deaf and unmusical play excellent air piano," says music researcher Alexander Refsum Jensenius. More than 4,000 people will learn what he has to say about music and movement.
Jensenius, Alexander Refsum
(2016).
Musikk, dansefot og gåsehud.
[Radio].
NRK P1 Kveldsåpent.
This paper reports on a study of how music influences people standing still, with the aim of investigating: (a) how (much) people move when standing still in silence, and (b) how (much) musical sound influences people's standstill.
A total of 103 participants (mean age 25 years, equal gender balance) were recruited to the study, which was presented as a "championship" of standstill. The participants each wore a reflective marker on their head, and its position was recorded using a state-of-the-art motion capture system. The task was to stand still on the floor for 6 minutes: 3 minutes in silence and 3 minutes with music. The musical stimuli were 7 excerpts of 20-40 seconds' duration, ranging from slow, non-rhythmic music for the first excerpts to dance music at the end. After omitting subjects with incomplete data sets, 91 participants were included in the study.
The analysis shows that the average quantity of motion (QoM), calculated as the cumulative distance travelled by each head marker, was 6.5 mm/s (std 1.6 mm/s) for the entire data set. For the 3-minute parts without music we found an average QoM of 6.3 mm/s (std 1.4 mm/s), as opposed to 6.6 mm/s (std 2.2 mm/s) for the part with music. The results are even clearer when looking at individual stimuli, with a QoM of 6.8 mm/s (std 2.3 mm/s) for the last dance music example. Contrary to our expectations, no particular spatiotemporal differences were found in the motion patterns for the different musical excerpts.
The study confirmed the level of micromotion in human standstill (6.5 mm/s QoM) found in our previous longitudinal studies. The study also confirmed our expectation that people spontaneously move to music, even when specifically trying to stand still. The study did not, however, reveal any spatiotemporal differences for the different musical material. Future studies will look systematically at how musical features influence both the quantity and quality of standstill.
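To make the comparison described above concrete, here is a minimal sketch of how per-participant QoM values for the silence and music parts might be compared with a paired t-test. The variable names and placeholder values (drawn from the means and standard deviations reported above) are assumptions; the actual analysis may well have used different tooling.

    % Hypothetical sketch: paired comparison of QoM between conditions.
    qom_silence = 6.3 + 1.4*randn(91, 1);     % placeholder: per-participant QoM, silence (mm/s)
    qom_music   = 6.6 + 2.2*randn(91, 1);     % placeholder: per-participant QoM, music (mm/s)
    [~, p] = ttest(qom_music, qom_silence);   % paired t-test (Statistics and Machine Learning Toolbox)
    fprintf('Mean QoM: silence %.2f, music %.2f mm/s (p = %.3f)\n', ...
            mean(qom_silence), mean(qom_music), p);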
Why do you move to music? What is it about dance music that makes you dance? How still do you actually sit at a classical concert? Alexander Refsum Jensenius will answer these questions and talk about his research on air guitarists, disco dancers, pianists and conductors. The starting point for this research is the recognition that the experience of music is fundamentally embodied. We do not experience music with our ears alone; all the senses are involved when we "listen". By using motion tracking as a method, it is possible to study systematically how people respond to music. The result is new knowledge about musical meaning from an embodied perspective.
Jensenius, Alexander Refsum
(2016).
Exploring Music-related Micromotion in the Artistic-Scientific Research Project Sverm.
Vis sammendrag
This presentation will focus on my work on human micromotion in musical contexts. My scientific research has focused on understanding more about the phenomenon of human standstill and how music can influence our micromotion when standing still. My artistic research has focused on the exploration of micromotion in music and dance performance, and particularly how it is possible to set up systems for artistic microinteraction. Most importantly, even though it makes sense to talk about these as two separate "tracks" of research, the scientific and artistic, they have in fact been highly intertwined. It would not have been possible to achieve neither the scientific nor artistic outcomes without a true artistic-scientific research process.
Jensenius, Alexander Refsum
(2016).
Åpen forskning - et humanistisk-teknologisk perspektiv.
Jensenius, Alexander Refsum & Duch, Michael Francis
(2016).
Edges.
The goal of this thesis was to develop and experiment with a set of sonification tools to explore participant data from standstill competitions. Using data from the 2012 Norwegian Championship of Standstill, three sonification models were developed in the Max/MSP programming environment. The first section of the thesis introduces sonification as a method for data exploration and discusses different sonification strategies. Momentary displacement was derived from the position data, and parameter-mapping methods were used to map the data features to sound parameters. The displacement of position in the XY plane, or the position changes along the Z-axis, can be mapped either to white noise or to a sine tone. The data variables control the amplitude and a filter cut-off frequency of the white noise, or the amplitude and frequency of the sine tone. Moreover, the use of sound spatialization together with sonification was explored by mapping position coordinates to spatial parameters of a sine tone. A "falling" effect of the standing posture was identified through the sonification. Also audible were the participants' breathing patterns and postural adjustments. All in all, the implemented sonification methods can be used effectively to get an overview of the standstill dataset.
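The thesis implemented these mappings in Max/MSP; purely as an illustration of the parameter-mapping idea, the following Matlab sketch maps a displacement trace to the frequency and amplitude of a sine tone. The frequency and amplitude ranges, variable names, and placeholder data are assumptions chosen for the example, not the thesis's actual mapping values.

    % Hypothetical sketch of parameter-mapping sonification: momentary
    % displacement drives the frequency and amplitude of a sine tone.
    fs_mocap = 100;  fs_audio = 44100;
    displacement = cumsum(randn(30*fs_mocap, 1));             % placeholder: displacement trace
    dur   = numel(displacement) / fs_mocap;                    % duration in seconds
    dnorm = (displacement - min(displacement)) ...
            / (max(displacement) - min(displacement));         % normalise to 0..1
    d_up  = interp1(linspace(0, dur, numel(dnorm)), dnorm, ...
                    linspace(0, dur, round(fs_audio*dur)));    % upsample to audio rate
    freq  = 220 + 660 * d_up;                                  % more displacement -> higher pitch
    amp   = 0.1 + 0.8 * d_up;                                  % ...and louder
    tone  = amp .* sin(2*pi*cumsum(freq)/fs_audio);            % phase-accumulating sine oscillator
    soundsc(tone, fs_audio);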
Zelechowska, Agata; Jensenius, Alexander Refsum; Laeng, Bruno & Vuoskoski, Jonna Katariina
(2020).
Irresistible Movement: The Role of Musical Sound, Individual Differences and Listening Context in Movement Responses to Music.
Universitetet i Oslo.
Fulltekst i vitenarkiv
Today, there are several toolboxes that can work on audio, motion, or other sensor data. These toolboxes are very useful for characteristic analysis of audio and motion. Unfortunately, the analyses are done separately by different toolboxes, which is inconvenient when we want to work on these data simultaneously. Developing a toolbox that integrates the existing toolboxes is therefore necessary. The main goal of the project is to integrate these toolboxes in Matlab and provide video analysis combined with audio and motion capture data. This is important both for our interdisciplinary research on music and motion through fourMs and for external work, e.g. analysing video recordings for early diagnosis of cerebral palsy in children.

This project presents the development of a toolbox for Matlab entitled "Musical Gestures (MG) Toolbox". The toolbox is aimed at solving pressing needs for video analysis of music-related body motion, since video recorded with a regular camera is a very good option for studying motion. The term music-related body motion refers to all sorts of body motion found in music performance and perception. It has received growing interest in music research and behavioural science over the last decades. In particular, with the rapid development of modern technology, various motion capture systems make it possible to study music-related body motion further.

Matlab has been chosen as the platform since it is readily available, and there are already several pre-existing toolboxes to build on. This includes the "Motion Capture (MoCap) Toolbox" [1], developed for the analysis and visualization of motion capture data and aimed specifically at the analysis of music-related body motion. The "Music Information Retrieval (MIR) Toolbox" [2] is another relevant toolbox, developed for the extraction of musical features from audio data and the investigation of relationships between sound and music features. While the two above-mentioned toolboxes are useful for studying motion capture data and audio, respectively, they are designed very differently, and it is not possible to make combined analyses of audio and motion capture data. Furthermore, there is no integration with video analysis. The MG Max toolbox [3] has been developed for music-related video analysis in the graphical programming environment Max/MSP/Jitter, with a number of novel visualization techniques (motiongrams, motion history images, etc.). These techniques are commonly used in music research, but are not currently available in Matlab.

The main contributions of this project are twofold. One is to integrate the MoCap toolbox and the MIR toolbox, and to provide simple preprocessing of different input data. The other is to provide several video analysis techniques for studying music-related body motion in the toolbox. These video analysis techniques include motiongrams, optical flow, and Eulerian video magnification. With these techniques, the developed MG toolbox for Matlab can provide reliable and quantitative analysis of music-related body motion based on video.
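As an illustration of the motiongram idea mentioned above, here is a minimal Matlab sketch that collapses per-frame difference images into vertical motion profiles and stacks them over time. The file name is a placeholder, and the actual MG Toolbox functions may differ in names, parameters, and output.

    % Hypothetical sketch of a motiongram: absolute frame differences are
    % averaged over image columns and stacked over time.
    v    = VideoReader('dance.mp4');            % placeholder input video file
    prev = rgb2gray(readFrame(v));
    mg   = [];                                  % motiongram matrix (preallocation omitted for brevity)
    while hasFrame(v)
        curr = rgb2gray(readFrame(v));
        md   = abs(double(curr) - double(prev));   % motion image (frame difference)
        mg   = [mg, mean(md, 2)];                  % collapse columns -> vertical motion profile
        prev = curr;
    end
    imagesc(mg); colormap(gray); axis xy;       % time on x-axis, image rows on y-axis
    xlabel('Frame'); ylabel('Vertical position');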