The goal of the innovation project SoundTracer is to develop an app for searching large music libraries by moving a mobile phone in the air.
Photo: Alexander Refsum Jensenius
About the project
Previous research has shown that humans are good at sound tracing, that is, moving the body in response to sonic features while listening to music. These can be rhythmic movements that follow rhythmic patterns in the music, or vertical movements that follow a melody line. The aim of this project is the opposite: to use hand movements to search a large sound database at the National Library of Norway.
Today, in order to find a particular piece of music in a sound database, we need to enter search terms characterising the music, play an excerpt of the piece we are looking for, sing the melody, and so on. In this project, the idea is to search the sound database based on musical gestures drawn by the user. We want to develop tools to extract features from the sound files and make these features available for searching. The concept is that the user will «draw» a sonic quality in the air with her mobile phone. This can be, for example, a rapid, ascending movement followed by shaking. The features of such a «drawing» will be used to retrieve sound files with similar features. The sound file that most closely matches the search will be played immediately, and the user will be able to navigate through other sound files with similar features by moving the mobile phone in space.
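The retrieval idea described above can be sketched in a few lines: reduce a gesture trajectory to a small feature vector and find the sound file whose precomputed features lie closest. The feature set, file names, and distance measure below are illustrative assumptions, not the project's actual implementation.

```python
import math

def gesture_features(samples):
    """Reduce a time series of vertical hand positions (metres) to a
    hypothetical feature vector: net ascent, mean speed, and 'shakiness'
    (mean absolute change of velocity)."""
    velocities = [b - a for a, b in zip(samples, samples[1:])]
    ascent = samples[-1] - samples[0]
    mean_speed = sum(abs(v) for v in velocities) / len(velocities)
    shakiness = sum(abs(b - a) for a, b in zip(velocities, velocities[1:])) \
        / max(len(velocities) - 1, 1)
    return (ascent, mean_speed, shakiness)

def nearest(query, library):
    """Return the (name, features) pair closest to the query (Euclidean)."""
    def dist(feats):
        return math.sqrt(sum((q - x) ** 2 for q, x in zip(query, feats)))
    return min(library, key=lambda item: dist(item[1]))

# A rapid ascending movement followed by shaking:
trace = [0.0, 0.2, 0.5, 0.9, 1.0, 0.8, 1.0, 0.8, 1.0]

# Hypothetical precomputed features for two sound files:
library = [
    ("calm_drone.wav", (0.0, 0.01, 0.0)),
    ("rising_tremolo.wav", (1.0, 0.2, 0.15)),
]

best = nearest(gesture_features(trace), library)
```

In a real system the library features would of course be extracted from the audio itself, and a fast index (rather than a linear scan) would be needed for a collection of the National Library's size.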
This innovation project is supported by the University of Oslo and is developed in collaboration with the National Library of Norway. The aim is to present a functional prototype by the end of the project.
- How is it possible to search in the audio content of a large music library?
- How is it possible to extract features from the movements of a mobile phone?
- How is it possible to «map» movement features to sound features?
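The third question, mapping movement features to sound features, could for instance mean comparing the shape of a gesture's height contour with the shape of a melody's pitch contour. The sketch below is one assumed approach: resample both contours to the same length, normalise them to a common range, and take their mean absolute difference; none of these function names come from the project itself.

```python
def resample(seq, n):
    """Linearly resample a sequence to n points."""
    if len(seq) == 1:
        return list(seq) * n
    out = []
    for i in range(n):
        t = i * (len(seq) - 1) / (n - 1)
        lo = int(t)
        hi = min(lo + 1, len(seq) - 1)
        frac = t - lo
        out.append(seq[lo] * (1 - frac) + seq[hi] * frac)
    return out

def normalise(seq):
    """Scale a contour to the 0..1 range so that height (metres) and
    pitch (Hz) become comparable shapes."""
    lo, hi = min(seq), max(seq)
    span = (hi - lo) or 1.0
    return [(x - lo) / span for x in seq]

def contour_distance(gesture, pitch, n=50):
    """Mean absolute difference between two normalised, resampled contours;
    0 means identical shapes, larger means more dissimilar."""
    g = normalise(resample(gesture, n))
    p = normalise(resample(pitch, n))
    return sum(abs(a - b) for a, b in zip(g, p)) / n

# A rising gesture matches a rising melody much better than a falling one:
rising = contour_distance([0.0, 0.5, 1.0], [220.0, 330.0, 440.0])
falling = contour_distance([0.0, 0.5, 1.0], [440.0, 330.0, 220.0])
```

A production system would likely use something more robust, such as dynamic time warping, to allow gestures and melodies to unfold at different speeds.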
Recorded during the workshop at the National Library of Norway on 4 May 2018.