Jason Snell

Using Biosensors to Amplify Music Embodiment

When

Thematic Session 5: Mapping and Control (Tuesday, 15:30)

Abstract

Jason Snell will briefly discuss his work with biosensors and biofeedback loops, including his software system that sonifies brainwaves in real time.
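As a rough illustration of what real-time brainwave sonification can involve (a minimal sketch, not Snell's actual system), one approach is to reduce a window of EEG samples to alpha-band power and map that power onto the pitch of a synthesized tone. The sampling rates, band limits, and mapping below are illustrative assumptions.

```python
# Hypothetical sketch of brainwave sonification: EEG window -> alpha-band
# power -> pitch of a short sine tone. Not Snell's implementation.

import numpy as np

SAMPLE_RATE = 256          # assumed EEG sampling rate (Hz)
AUDIO_RATE = 44100         # audio output sampling rate (Hz)
ALPHA_BAND = (8.0, 12.0)   # alpha rhythm frequency range (Hz)

def band_power(eeg_window, band, fs=SAMPLE_RATE):
    """Mean spectral power of `eeg_window` within `band` (Hz)."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].mean()

def power_to_pitch(power, low_hz=110.0, high_hz=880.0, ref=1e4):
    """Map band power onto an audible pitch range (logarithmic scale)."""
    norm = np.clip(np.log1p(power) / np.log1p(ref), 0.0, 1.0)
    return low_hz * (high_hz / low_hz) ** norm

def sonify(eeg_window, duration=0.5):
    """Render a short sine tone whose pitch tracks alpha-band power."""
    pitch = power_to_pitch(band_power(eeg_window, ALPHA_BAND))
    t = np.linspace(0.0, duration, int(AUDIO_RATE * duration), endpoint=False)
    return 0.2 * np.sin(2.0 * np.pi * pitch * t)

if __name__ == "__main__":
    # Simulated one-second EEG window: noise plus a 10 Hz alpha component.
    t = np.arange(SAMPLE_RATE) / SAMPLE_RATE
    fake_eeg = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(SAMPLE_RATE)
    tone = sonify(fake_eeg)
    print(f"Rendered {len(tone)} audio samples from the simulated EEG window.")
```

In a live biofeedback loop, such a mapping would run continuously on incoming sensor windows, with the audio output in turn influencing the performer's state.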

His critique of AI music systems, including one he built in 2014, is that they mimic the output of the human musician without understanding the musician's process. Without understanding that process, the AI can only make imitative, cerebral music rather than embodied music with human passion.

One potential solution is to have AI systems study the biodata of musicians as they create music, and to have the AI "body" replicate and incorporate similar data as it generates new music.

Bio

Jason Snell is an interactive technology artist who performs electronic music with biosensors and generative music systems. His performances use biofeedback loops, body movement, and environmental sensors to compose music and visual sequences in real time. The resulting compositions are biomorphic, taking the shape of life patterns, and reveal a natural intelligence inherent in the world and in ourselves. Jason has presented his work at MIT, NYU, Sundance, CPH:DOX, the Berlin Independent, SF Independent, and Slamdance festivals. After decades of professional experience, he is now at NYU pursuing a graduate degree in Interactive Media Arts.