Musical human-computer interaction
The project investigates aspects of rhythm and motion through the design and construction of interfaces for musical human-computer interaction.
PhD fellows Qichao Lan, Tejaswinee Kelkar and Cagri Erdem rehearse with various new interfaces for musical expression in preparation for a MusicLab performance.
- What types of sensors can be used in musical human-computer interaction?
- What types of mappings between actions and sounds work well?
- What are the differences between acoustic and electroacoustic sound generation?
The core activity of the project is the investigation of aspects of rhythm and motion through the design and construction of interfaces for musical human-computer interaction. This includes the study and design of both acoustic instruments and completely digital systems. We are particularly interested in various types of electroacoustic instruments, through which we explore the complexity of human motion in musical experience and practice.
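One recurring design question in such interfaces is how to map a sensor's action space onto a sound synthesis parameter. A minimal sketch of one common approach, a clamped linear mapping, is shown below in Python; the accelerometer input range, frequency range, and function names are illustrative assumptions, not part of any specific instrument from the project.

```python
import math

def map_range(value, in_min, in_max, out_min, out_max):
    """Linearly map a sensor reading onto a synthesis parameter range."""
    value = min(max(value, in_min), in_max)  # clamp to the sensor's range
    norm = (value - in_min) / (in_max - in_min)
    return out_min + norm * (out_max - out_min)

def accel_to_pitch(accel_g):
    # Hypothetical mapping: 0-2 g of acceleration magnitude -> 220-880 Hz.
    return map_range(accel_g, 0.0, 2.0, 220.0, 880.0)

def sine_burst(freq_hz, duration_s=0.01, sample_rate=44100):
    """Render a short sine burst at the mapped frequency."""
    n = int(duration_s * sample_rate)
    return [math.sin(2 * math.pi * freq_hz * i / sample_rate) for i in range(n)]

# A reading of 1 g lands at the midpoint of the pitch range, 550 Hz.
samples = sine_burst(accel_to_pitch(1.0))
```

In practice such one-to-one mappings are only a starting point; research on musical interfaces often explores many-to-many mappings, where several sensor streams jointly shape several sound parameters.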
We employ a multitude of methods, including theoretical modelling, empirical studies using motion capture and physiological measurements, rapid prototyping, as well as iterative and creative design processes.
This analysis-by-synthesis approach leads to a new understanding of rhythmic phenomena in general, and also to various types of artistic and creative results.