Food & Paper: Multifractal Analysis: Quantifying the dynamics of adaptation in technology use (Daniel Bennett, University of Bristol)
This week's Food & Paper will be given by Daniel Bennett from the University of Bristol, on multifractal analysis.
When a user comes to interact with technology, do they arrive with a goal and some kind of plan, and then use the technology to execute that plan more or less efficiently? Or could interactions be more "interactive" than this: involving constant adaptation and adjustment, with users' intentions emerging from the process of interaction as often as they pre-exist it?
This latter, 'adaptive' view has been influential in HCI for around 30 years. It seems particularly well suited to interactions where we are aware of a continuous, moment-to-moment back-and-forth engagement, and even synchronisation, with the technology - for example physically skillful interaction, creativity, gaming, and training. But if this kind of adaptation really is key to interaction, can we observe it, even measure and quantify it? To date, researchers studying technology use have had few methodologies for doing so.
Multifractal analysis of behaviour offers one way forward. The technique grew out of statistical physics, and has since been applied widely in cognitive science and human movement science, where it is used to understand the role of adaptation and dynamics in high-level behaviour. My PhD research applies these techniques to understand interaction with technology. I find that multifractal patterns in movement signals during mouse and keyboard use can predict high-level adaptive features of behaviour, such as engagement with the task, behavioural coupling to the task and technology, and even phenomenological features such as "readiness to hand".
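To give a concrete sense of the kind of analysis involved, the sketch below implements a minimal multifractal detrended fluctuation analysis (MFDFA) in Python. This is an illustrative toy, not Bennett's own pipeline: the function name, scale choices, and q-values are all assumptions for the example. The idea is that a signal's fluctuations are measured at many scales and many moment orders q; if the resulting generalized Hurst exponents h(q) vary with q, the signal is multifractal.

```python
import numpy as np

def mfdfa(signal, scales, q_values):
    """Minimal multifractal detrended fluctuation analysis (sketch).

    Returns the generalized Hurst exponent h(q) for each q. A spread of
    h(q) across q indicates multifractality; a flat h(q) (as for white
    noise) indicates a monofractal signal.
    """
    # Step 1: cumulative-sum "profile" of the mean-centred signal.
    profile = np.cumsum(signal - np.mean(signal))
    fq = np.zeros((len(q_values), len(scales)))
    for si, s in enumerate(scales):
        # Step 2: split the profile into non-overlapping windows of length s.
        n_windows = len(profile) // s
        segments = profile[:n_windows * s].reshape(n_windows, s)
        # Step 3: remove a linear trend from each window, keep residual variance.
        x = np.arange(s)
        coeffs = np.polyfit(x, segments.T, 1)          # one linear fit per window
        trends = np.outer(coeffs[0], x) + coeffs[1][:, None]
        variances = np.mean((segments - trends) ** 2, axis=1)
        # Step 4: q-th order fluctuation function F_q(s).
        for qi, q in enumerate(q_values):
            if q == 0:
                # q = 0 uses the logarithmic limit of the generalized mean.
                fq[qi, si] = np.exp(0.5 * np.mean(np.log(variances)))
            else:
                fq[qi, si] = np.mean(variances ** (q / 2.0)) ** (1.0 / q)
    # Step 5: h(q) is the log-log slope of F_q(s) against scale s.
    log_s = np.log(scales)
    return np.array([np.polyfit(log_s, np.log(fq[qi]), 1)[0]
                     for qi in range(len(q_values))])

# Usage on synthetic data: for white noise, h(q) should come out roughly
# flat around 0.5 for all q (no multifractal structure).
rng = np.random.default_rng(0)
noise = rng.standard_normal(4096)
h = mfdfa(noise, np.array([16, 32, 64, 128, 256]), [-2, 0, 2])
```

In real analyses one would work with recorded movement signals (e.g. cursor velocity series) rather than synthetic noise, and compare the width of the h(q) spectrum against surrogate data before interpreting it as evidence of adaptive dynamics.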
I am a PhD student in Human-Computer Interaction at the University of Bristol. My PhD focuses on quantitative approaches to embodied interaction drawn from recent cognitive science.
My current research builds on recent work in 4E cognitive science, developing methods to infer high-level features of user experience and behaviour from signatures in user movement.
I believe a rigorous approach to cognitive embodiment can help advance our understanding of interaction. Rather than treating interaction as simply the execution of the user's action plan, it can help us consider the influence of context, other users, and the feedback relationship between user and technology.
More broadly, I am interested in Complexity Science as a source of interaction-focused methodologies for HCI. I am also a musician, interested in tools for musical creativity. I have performed at venues around Europe, composed music for theatre and art installations, and I develop and share software for music making, including my Max/MSP library for creativity with rhythmically adaptive Central Pattern Generator neural networks.