Modeling, recognition of finger gestures and upper-body movements for musical interaction design

Abstract: This thesis presents a novel musical instrument, named the Embodied Musical Instrument (EMI), which has been designed to address two problems: how can we capture and model musical gestures, and how can we use this model to control sound synthesis parameters expressively? The EMI is articulated around an explicit mapping strategy, which draws inspiration from piano-playing techniques and the affordances of other objects. The system we propose makes use of 3D cameras and computer vision algorithms in order to free the gesture from intrusive devices and ease the process of capture and performance, while enabling precise and reactive tracking of the fingertips and upper body. Drawing on several 3D camera tracking solutions, we fully exploit their potential by adding a transparent sheet, which serves as a detection threshold for fingerings and provides simple but essential haptic feedback. We examined finger movements while tapping on the surface of the EMI and decomposed their trajectories into essential phases, which enabled us to model and analyse piano-like gestures. A preliminary study of generic musical gestures directed our interest not only to the effective gestures operated by the fingers (in the case of keyboard instruments) but also to the accompanying and figurative gestures, which are mostly characterised by arm and head movements. Consequently, we distinguish two levels of interaction, delimited by two bounding volumes: the micro bounding volume encloses the micro-gestures performed with the fingers, while the macro bounding volume encloses larger movements of the upper body. Building on this, we extend our piano-like model to a 3D interaction paradigm in which higher-level musical parameters, such as sound effects, can be controlled continuously by free upper-body movements. We explored a set of real-world scenarios for this instrument, namely practice, composition and performance. The EMI introduces a framework for the capture and analysis of specific musical gestures: an off-line analysis of gesture features can reveal trends, faults and musical specificities of a performer. Several musical works have been created and performed live, either solo or accompanied by a string quartet, revealing body-gesture specificities through the sounds the instrument synthesises. User experience feedback shows that the instrument can be easily taught, if not self-taught, thanks to intuitive gesture paradigms drawn from piano-like gestures and other metaphorical gestures.
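To make the two-level gesture model concrete, here is a minimal sketch of how one tracked fingertip sample could be classified against the two bounding volumes, and how a tap onset could be detected at the transparent-sheet threshold. It assumes axis-aligned volumes centred on the instrument and models the sheet as the plane z = 0; all names, dimensions and coordinates are illustrative assumptions, not values taken from the thesis.

import numpy as np

# Assumed geometry (illustrative, not from the thesis): the transparent
# sheet is modelled as the plane z = 0 in instrument coordinates.
SHEET_Z = 0.0                                      # detection threshold (m)
MICRO_HALF_EXTENTS = np.array([0.25, 0.15, 0.10])  # finger-gesture volume (m)
MACRO_HALF_EXTENTS = np.array([0.80, 0.60, 0.70])  # upper-body volume (m)

def in_volume(point, center, half_extents):
    # Axis-aligned bounding-volume test for one tracked 3D point.
    return bool(np.all(np.abs(point - center) <= half_extents))

def detect_tap(prev_z, curr_z, plane_z=SHEET_Z):
    # A tap onset fires when the fingertip crosses the sheet downward.
    return prev_z > plane_z >= curr_z

# One fingertip sample moving from frame t-1 to frame t.
center = np.zeros(3)
p_prev = np.array([0.02, 0.01, 0.012])
p_curr = np.array([0.02, 0.01, -0.003])

if in_volume(p_curr, center, MICRO_HALF_EXTENTS):
    if detect_tap(p_prev[2], p_curr[2]):
        print("micro gesture: tap onset -> trigger a note")
elif in_volume(p_curr, center, MACRO_HALF_EXTENTS):
    print("macro gesture: map position to a continuous effect parameter")

In a full pipeline, the phase decomposition described above would segment each tap trajectory (approach, contact, release) rather than relying on a single plane crossing; the threshold test here is only the simplest instance of that idea.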
Document type: Thesis

Cited literature: 278 references

https://pastel.archives-ouvertes.fr/tel-02119395
Contributor: Abes Star
Submitted on: Friday, May 3, 2019 - 5:15:09 PM
Last modification on: Saturday, May 4, 2019 - 1:34:25 AM
Long-term archiving on: Wednesday, October 9, 2019 - 1:05:50 PM

File

2017PSLEM075_archivage.pdf
Version validated by the jury (STAR)

Identifiers

  • HAL Id: tel-02119395, version 1

Citation

Edgar Hemery. Modeling, recognition of finger gestures and upper-body movements for musical interaction design. Automatic Control. PSL Research University, 2017. English. ⟨NNT: 2017PSLEM075⟩. ⟨tel-02119395⟩
