Strategies of mapping between gesture data and synthesis model parameters using perceptual spaces

  • D. Arfib, J. M. Couturier, L. Kessous and V. Verfaille


This paper is about mapping strategies between gesture data and synthesis model parameters by means of perceptual spaces. We define three layers in the mapping chain: from gesture data to a gesture perceptual space, from a sound perceptual space to synthesis model parameters, and between the two perceptual spaces. This approach makes the implementation highly modular. Both perceptual spaces are developed and described with their features. To obtain a simple mapping between the gesture perceptual subspace and the sound perceptual subspace, we must focus our attention on the two other mappings. We explain the mapping types: explicit/implicit and static/dynamic. We also present the technical and aesthetic limits introduced by mapping. Practical examples are given of the use of perceptual spaces in experiments carried out at LMA in a musical context. Finally, we discuss several implications of the mapping strategies: the influence of the chosen mapping's limits on performers' virtuosity, and the effect of mapping on the learning process with virtual instruments and on improvisation possibilities.
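The three-layer chain described in the abstract can be sketched as three composable functions, which also illustrates the modularity claim (each layer can be swapped independently). This is purely an illustration: the signal names, weights, and parameter ranges below are assumptions for the example, not values from the paper.

```python
from typing import Dict

def gesture_to_perceptual(gesture: Dict[str, float]) -> Dict[str, float]:
    """Layer 1: raw gesture data -> gesture perceptual space.
    Here an assumed pressure/tilt pair is reduced to one 'energy' axis."""
    return {"energy": 0.7 * gesture["pressure"] + 0.3 * abs(gesture["tilt"])}

def perceptual_to_perceptual(g: Dict[str, float]) -> Dict[str, float]:
    """Layer 2: gesture perceptual subspace -> sound perceptual subspace.
    Kept deliberately simple (a one-to-one link), as the abstract suggests."""
    return {"brightness": g["energy"]}

def perceptual_to_synthesis(s: Dict[str, float]) -> Dict[str, float]:
    """Layer 3: sound perceptual space -> synthesis model parameters.
    Brightness steers a filter cutoff (an assumed synthesis parameter)."""
    return {"cutoff_hz": 200.0 + 8000.0 * s["brightness"]}

def map_chain(gesture: Dict[str, float]) -> Dict[str, float]:
    """Full mapping chain: compose the three layers."""
    return perceptual_to_synthesis(
        perceptual_to_perceptual(gesture_to_perceptual(gesture)))

params = map_chain({"pressure": 0.5, "tilt": 0.0})
print(params)  # energy 0.35 -> brightness 0.35 -> cutoff 3000.0 Hz
```

Because only the middle layer relates the two perceptual subspaces, replacing the gesture device (layer 1) or the synthesis model (layer 3) leaves the other layers untouched.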


Organised Sound
  • ISSN: 1355-7718
  • EISSN: 1469-8153
  • URL: /core/journals/organised-sound