
Playing Sound Textures

This project focuses on developing a generic system for continuous motion sonification with sound textures. We initially created the system for an interactive installation we presented at SIGGRAPH’14 Studio.

The system implements a Mapping-by-Demonstration approach that lets novice users craft gesture control strategies intuitively. The process is simple: listen and move, and the system learns the relationship between your hand gestures and the timbral variations of the sound. Then you just have to move to explore a sonic environment.


The system uses Gaussian Mixture Regression (GMR) to map hand movements, captured with a Leap Motion, to descriptor-driven corpus-based concatenative sound synthesis (CataRT, implemented with MuBu). During demonstration, the gesture is performed with a single hand: we synchronously record the streams of motion features and sound descriptors and train a GMR that encodes the mapping. In performance, the mapping can be assigned to either or both hands; given the incoming movement, the GMR generates the associated trajectories of sound descriptors, letting the performer explore the sound textures according to the learned relationship. Several gestures can be combined, superimposed, or recognized to create a complete sonic environment controlled with the hands.
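To make this pipeline concrete, here is a minimal numpy sketch of the GMR step, not the actual MaD implementation (which runs in real time in MuBu): a Gaussian mixture model is fit by EM on joint motion/descriptor frames from a demonstration, and at performance time the conditional expectation of the descriptors given a new motion input drives the synthesis. The synthetic data, dimensions, and function names below are all illustrative assumptions.

```python
import numpy as np

def gauss(Z, m, S):
    """Multivariate normal density of each row of Z under N(m, S)."""
    D = Z.shape[1]
    diff = Z - m
    expo = np.einsum('ni,ij,nj->n', diff, np.linalg.inv(S), diff)
    return np.exp(-0.5 * expo) / np.sqrt((2 * np.pi) ** D * np.linalg.det(S))

def fit_gmm(Z, K, iters=100, seed=1):
    """Fit a K-component full-covariance GMM to joint data Z with EM."""
    rng = np.random.default_rng(seed)
    N, D = Z.shape
    mu = Z[rng.choice(N, K, replace=False)].copy()
    cov = np.stack([np.cov(Z.T) + 1e-6 * np.eye(D)] * K)
    w = np.full(K, 1.0 / K)
    for _ in range(iters):
        # E-step: responsibility of each component for each joint frame
        R = np.stack([w[k] * gauss(Z, mu[k], cov[k]) for k in range(K)], axis=1)
        R /= R.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, covariances
        Nk = R.sum(axis=0)
        w = Nk / N
        mu = (R.T @ Z) / Nk[:, None]
        for k in range(K):
            d = Z - mu[k]
            cov[k] = (R[:, k, None] * d).T @ d / Nk[k] + 1e-6 * np.eye(D)
    return w, mu, cov

def gmr_predict(X, w, mu, cov, dx):
    """E[y | x]: map motion features X (first dx dims) to sound descriptors."""
    K = len(w)
    # component responsibilities computed from the motion dimensions only
    h = np.stack([w[k] * gauss(X, mu[k, :dx], cov[k, :dx, :dx])
                  for k in range(K)], axis=1)
    h /= h.sum(axis=1, keepdims=True)
    Y = np.zeros((len(X), mu.shape[1] - dx))
    for k in range(K):
        A = cov[k, dx:, :dx] @ np.linalg.inv(cov[k, :dx, :dx])
        Y += h[:, k, None] * (mu[k, dx:] + (X - mu[k, :dx]) @ A.T)
    return Y

# Synthetic "demonstration": 1-D hand position x, 1-D sound descriptor y,
# recorded synchronously as in the listen-move phase.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)[:, None]
y = 0.5 + 0.4 * np.sin(2 * np.pi * x) + 0.02 * rng.standard_normal((200, 1))
Z = np.hstack([x, y])

w, mu, cov = fit_gmm(Z, K=4)               # training on the demonstration
y_hat = gmr_predict(x, w, mu, cov, dx=1)   # performance: motion -> descriptors
err = float(np.mean(np.abs(y_hat - y)))
```

In performance, `gmr_predict` would be called on each incoming Leap Motion frame, and the predicted descriptor trajectory would select grains in the concatenative synthesis corpus.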

The system was demonstrated at SIGGRAPH’14 Studio with Frédéric Bevilacqua, Riccardo Borghesi, and Norbert Schnell. Many thanks to Omid Alemi and Maria Fedorova for their essential help during the conference. Sound corpora were designed by Roland Cahen for DIRTI — Dirty Tangible Interface by Matthieu Savary, Florence Massin, and Denis Pellerin (User Studio); Diemo Schwarz (Ircam); and Roland Cahen (ENSCI–Les Ateliers).


  • “MaD: Mapping by Demonstration for Continuous Sonification,” ACM SIGGRAPH 2014 Emerging Technologies (SIGGRAPH ’14), Vancouver, BC, Canada, ACM, pp. 16:1–16:1. DOI: 10.1145/2614066.2614099.
  • “MaD,” interactions, vol. 22, no. 3, pp. 14–15. DOI: 10.1145/2754894.