Food & Paper: Data Driven Analysis of Tiny Touchscreen Performance with MicroJam (Martin)

Former Postdoctoral Fellow at RITMO, Charles Martin, will give a talk on "Data Driven Analysis of Tiny Touchscreen Performance with MicroJam".



The widespread adoption of mobile devices, such as smartphones and tablets, has made touchscreens a common interface for musical performance. While new mobile music instruments have been investigated from design and user experience perspectives, there has been little examination of performers' musical outputs. In this work, we introduce a constrained touchscreen performance app, MicroJam, designed to enable collaboration between performers, and engage in a data-driven analysis of more than 1600 performances made with the app. MicroJam constrains performances to five seconds, and emphasises frequent and casual music making through a social media-inspired interface. Performers collaborate by replying to performances, adding new musical layers that are played back at the same time. Our analysis shows that users tend to focus on the centre and diagonals of the touchscreen area, and tend to swirl or swipe rather than tap. We also observe that while long swipes dominate the visual appearance of performances, the majority of interactions are short, with limited expressive possibilities. Our findings enhance our understanding of how users perform in touchscreen apps and could be applied in future app designs for social musical interaction.
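The tap-versus-swipe distinction mentioned in the abstract can be illustrated with a small sketch. The record format, function name, and thresholds below are illustrative assumptions for this post, not details from the MicroJam study: a stroke is treated as a tap if it is both brief and nearly stationary, and as a swipe otherwise.

```python
import math

# Assumed record format (not from the paper): one touch stroke is a list
# of (time_s, x, y) samples, with x and y normalised to [0, 1].
def classify_stroke(samples, tap_max_duration=0.1, tap_max_length=0.05):
    """Label a single touch stroke as 'tap' or 'swipe'.

    The thresholds are illustrative assumptions, not values
    reported in the MicroJam study.
    """
    duration = samples[-1][0] - samples[0][0]
    # Path length: sum of distances between consecutive samples.
    length = sum(
        math.hypot(x2 - x1, y2 - y1)
        for (_, x1, y1), (_, x2, y2) in zip(samples, samples[1:])
    )
    if duration <= tap_max_duration and length <= tap_max_length:
        return "tap"
    return "swipe"

# Example: a brief stationary touch vs. a longer drag across the screen.
tap_stroke = [(0.00, 0.5, 0.5), (0.05, 0.5, 0.5)]
swipe_stroke = [(0.0, 0.1, 0.1), (0.2, 0.4, 0.4), (0.5, 0.9, 0.9)]
print(classify_stroke(tap_stroke))    # → tap
print(classify_stroke(swipe_stroke))  # → swipe
```

A per-stroke labelling of this kind, aggregated over a corpus of performances, is one simple way to arrive at observations like "most interactions are short" in the abstract.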



Charles Martin is a specialist in percussion, computer music, and interactive media from Canberra, Australia. He links percussion with electroacoustic music and other media through new technologies. In 2016, Charles joined the Engineering Predictability with Embodiment (EPEC) project at the University of Oslo, where he developed new ways to predict musical intentions and performances in smartphone apps.

Published June 22, 2020 1:25 PM - Last modified June 22, 2020 1:25 PM