Consumed Endurance: A metric to quantify arm fatigue of mid-air interactions, Proceedings of the 32nd annual ACM conference on Human factors in computing systems (CHI '14), p.62, 2014.

J. K. Aggarwal and L. Xia, Human activity recognition from 3d data: A review, Pattern Recognition Letters, vol.48, p.53, 2014.
DOI : 10.1016/j.patrec.2014.04.011

A. Akl and S. Valaee, Accelerometer-based gesture recognition via dynamic-time warping, affinity propagation, & compressive sensing, 2010 IEEE International Conference on Acoustics, Speech and Signal Processing, p.56, 2010.
DOI : 10.1109/icassp.2010.5495895

S. F. Alaoui, B. Caramiaux, M. Serrano, and F. Bevilacqua, Movement qualities as interaction modality, Proceedings of the Designing Interactive Systems Conference, p.53, 2012.
DOI : 10.1145/2317956.2318071
URL : https://hal.archives-ouvertes.fr/hal-01161433

N. Alessandro, J. Tilmanne, A. Moreau, and A. Puleo, AirPiano: A Multi-Touch Keyboard with Hovering Control, Proceedings of the International Conference on New Interfaces for Musical Expression, p.43, 2015.

D. S. Alexiadis, P. Kelly, P. Daras, N. E. O'Connor, T. Boubekeur et al., Evaluating a dancer's performance using kinect-based skeleton tracking, Proceedings of the 19th ACM international conference on Multimedia, p.53, 2011.
DOI : 10.1145/2072298.2072412

A. Altavilla, B. Caramiaux, and A. Tanaka, Towards gestural sonic affordances, Proceedings of the International Conference on New Interfaces for Musical Expression, 2013.

L. Angelino, C. Angelini, and A. Boissière, Quand le geste fait sens. Éditions Mimésis, p.20, 2015.

V. Argyriou, M. Petrou, and S. Barsky, Photometric stereo with an arbitrary number of illuminants, Computer Vision and Image Understanding, vol.114, issue.8, p.53, 2010.

R. Aylward and J. A. Paradiso, Sensemble: a wireless, compact, multi-user sensor system for interactive dance, Proceedings of the 2006 conference on New interfaces for musical expression, p.50, 2006.

J. Ballester and C. Pheatt, Using the Xbox Kinect sensor for positional data acquisition, American Journal of Physics, vol.81, issue.1, p.60, 2013.

J. Barbosa, F. Calegario, F. Magalhães, V. Teichrieb, G. Ramalho et al., Towards an evaluation methodology for digital music instruments considering performer's view: a case study, Proceedings of 13th Brazilian Symposium on Computer Music, vol.104, 2011.

N. Barrett, Creating tangible spatial-musical images from physical performance gestures, Proceedings of the International Conference on New Interfaces for Musical Expression, p.26, 2015.

S. Bauer, A. Seitel, H. Hofmann, T. Blum, J. Wasza et al., Real-time range imaging in health care: a survey, Time-of-Flight and Depth Imaging. Sensors, Algorithms, and Applications, p.53, 2013.

S. Baumann, S. Koeneke, C. F. Schmidt, M. Meyer, K. Lutz et al., A network for audio-motor coordination in skilled pianists and non-musicians, Brain research, vol.1161, issue.13, pp.65-78, 2007.

V. Bellotti, M. Back, W. K. Edwards, R. E. Grinter, A. Henderson et al., Making sense of sensing systems: five questions for designers and researchers, Proceedings of the SIGCHI conference on Human factors in computing systems, vol.104, pp.415-422, 2002.

H. Benko and A. Wilson, DepthTouch: Using depth-sensing camera to enable freehand interactions on and above the interactive surface, Proceedings of the IEEE workshop on tabletops and interactive surfaces, vol.8, p.42, 2009.

A. Berthoz, Le sens du mouvement, 1997.

F. Bettens and T. Todoroff, Real-time DTW-based gesture recognition external object for Max/MSP and Pure Data, Proc. SMC, vol.9, p.56, 2009.

F. Bevilacqua, R. Müller, and N. Schnell, MnM: a Max/MSP mapping toolbox, Proceedings of the 2005 conference on New interfaces for musical expression, p.85, 2005.
URL : https://hal.archives-ouvertes.fr/hal-01161330

F. Bevilacqua, N. Rasamimanana, E. Fléty, S. Lemouton, and F. Baschet, The augmented violin project: research, composition and performance report, Proceedings of the 2006 conference on New interfaces for musical expression, p.83, 2006.
URL : https://hal.archives-ouvertes.fr/hal-01161349

F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, and N. Leroy, Wireless sensor interface and gesture-follower for music pedagogy, Proceedings of the 7th international conference on New interfaces for musical expression, vol.57, p.81, 2007.
URL : https://hal.archives-ouvertes.fr/hal-01161378

F. Bevilacqua, B. Zamborlin, A. Sypniewski, N. Schnell, F. Guédy et al., Continuous realtime gesture following and recognition, International Gesture Workshop, vol.57, p.81, 2009.
URL : https://hal.archives-ouvertes.fr/hal-01106955

F. Bevilacqua, N. Schnell, N. Rasamimanana, B. Zamborlin, and F. Guédy, Online gesture analysis and control of audio processing, Musical Robots and Interactive Multimodal Systems, p.80, 2011.
URL : https://hal.archives-ouvertes.fr/hal-01106956

F. Bevilacqua, F. Baschet, and S. Lemouton, The augmented string quartet: experiments and gesture following, Journal of New Music Research, vol.41, issue.1, p.83, 2012.
URL : https://hal.archives-ouvertes.fr/hal-01161437

T. Bianco, V. Freour, N. Rasamimanana, F. Bevilacqua, and R. Caussé, On gestural variation and coarticulation effects in sound control, International Gesture Workshop, p.67, 2009.
URL : https://hal.archives-ouvertes.fr/hal-01161270

J. Blacking, R. Byron, and B. Nettl, Music, culture, and experience: Selected papers of John Blacking, 1995.

A. F. Bobick and J. W. Davis, The recognition of human movement using temporal templates, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.23, issue.3, p.52, 2001.

A. F. Bobick and A. D. Wilson, A state-based technique for the summarization and recognition of gesture, Computer Vision, 1995. Proceedings., Fifth International Conference on, p.57, 1995.

J. Bowers and P. Archer, Not hyper, not meta, not cyber but infra-instruments, Proceedings of the 2005 conference on New interfaces for musical expression, p.39, 2005.

G. Brelet, L'interprétation Créatrice Essai Sur l'Exécution Musicale. Presses Universitaires de France, p.20, 1951.

D. Brown, C. Nash, and T. Mitchell, Gesturechords: Transparency in gesturally controlled digital musical instruments through iconicity and conceptual metaphor, Proceedings SMC, p.43, 2016.

S. Brown, M. J. Martinez, and L. M. Parsons, The neural basis of human dance, Cerebral cortex, vol.16, issue.8, pp.1157-1167, 2006.

C. Cadoz and M. M. Wanderley, Gesture-music, vol.22, p.60, 2000.
URL : https://hal.archives-ouvertes.fr/hal-01105543

J. Cage, For more new sounds

A. Camurri, G. Volpe, G. D. Poli, and M. Leman, Communicating expressiveness and affect in multimodal interactive systems, IEEE Multimedia, vol.12, issue.1, p.80, 2005.
DOI : 10.1109/mmul.2005.2

B. Caramiaux, J. Françoise, N. Schnell, and F. Bevilacqua, Mapping through listening, Computer Music Journal, vol.38, issue.3, p.84, 2014.
DOI : 10.1162/comj_a_00255
URL : https://hal.archives-ouvertes.fr/hal-01106965

B. Caramiaux, N. Montecchio, A. Tanaka, and F. Bevilacqua, Adaptive gesture recognition with variation estimation for interactive systems, ACM Transactions on Interactive Intelligent Systems (TiiS), vol.4, issue.4, p.57, 2015.
DOI : 10.1145/2643204
URL : https://hal.archives-ouvertes.fr/hal-01266046

S. K. Card, A. Newell, and T. P. Moran, The psychology of human-computer interaction, p.84, 1983.

D. Casasanto and S. Lozano, The meaning of metaphorical gestures. Metaphor and Gesture. Gesture studies, p.24, 2007.

K. Cascone, The aesthetics of failure: "post-digital" tendencies in contemporary computer music, Computer Music Journal, vol.24, issue.4, p.34, 2000.

K. Cascone, Laptop music: counterfeiting aura in the age of infinite reproduction, Parachute, vol.8, p.9, 2002.

G. Castellano, R. Bresin, A. Camurri, and G. Volpe, Expressive control of music and visual media by full-body movement, Proceedings of the 7th international conference on New interfaces for musical expression, p.80, 2007.
DOI : 10.1145/1279740.1279829

G. Castellano, M. Mortillaro, A. Camurri, G. Volpe, and K. Scherer, Automated analysis of body movement in emotionally expressive piano performances, Music Perception: An Interdisciplinary Journal, vol.26, issue.2, p.24, 2008.

C. Chen, R. Jafari, and N. Kehtarnavaz, A survey of depth and inertial sensor fusion for human action recognition, Multimedia Tools and Applications, p.52, 2015.

H. H. Clark, Space, time, semantics, and the child. Cognitive development and the acquisition of language, vol.27, p.19, 1973.

A. Colgan, How does the Leap Motion Controller work? Leap Motion Blog, p.56, 2014.

N. Collins, Generative music and laptop performance, Contemporary Music Review, vol.22, issue.4, pp.67-79, 2003.
DOI : 10.1080/0749446032000156919

comScore, The past, present and future of online video, 2013.

URL : /Insights/Presentations-and-Whitepapers/2013/The-Past-Present-and-Future-of-Online-Video?

P. Cook, Principles for designing computer music controllers, Proceedings of the 2001 conference on New interfaces for musical expression, vol.81, p.104, 2001.

P. R. Cook, A meta-wind-instrument physical model, and a meta-controller for real-time performance control. International Computer Music Association, p.91, 1992.

P. R. Cook and G. Scavone, The Synthesis ToolKit (STK), Proceedings of the International Computer Music Conference, p.91, 1999.

H. Cowell, K. H. Wörner, E. Helm, P. Gradenwitz, and C. Sartori, Current chronicle. The Musical Quarterly, vol.38, p.36, 1952.

R. Cowie, E. Douglas-Cowie, N. Tsapatsoulis, G. Votsis, S. Kollias et al., Emotion recognition in human-computer interaction, IEEE Signal Processing Magazine, vol.18, issue.1, p.20, 2001.

M. Csikszentmihalyi, Flow and the psychology of discovery and invention, vol.46, p.133, 1996.

L. A. Custodero, Seeking challenge, finding skill: Flow experience and music education, Arts Education Policy Review, vol.103, issue.3, p.106, 2002.

S. Dahl, F. Bevilacqua, R. Bresin, M. Clayton, L. Leante et al., Gestures in performance. Musical gestures: Sound, movement, and meaning, vol.36, p.20, 2009.

S. Dalla Bella and C. Palmer, Rate effects on timing, key velocity, and finger kinematics in piano performance, PloS one, vol.6, issue.6, p.24, 2011.

Y. A. de Kort and W. A. IJsselsteijn, People, places, and play: player experience in a sociospatial context, Computers in Entertainment (CIE), vol.6, issue.2, p.105, 2008.

F. Delalande, La gestique de Gould: éléments pour une sémiologie du geste musical, vol.87, p.88, 1988.

A. P. Demos, R. Chaffin, K. T. Begosh, J. R. Daniels, and K. L. Marsh, Rocking to the beat: Effects of music and partner's movements on spontaneous interpersonal coordination, Journal of Experimental Psychology: General, vol.141, issue.1, p.20, 2012.

M. Demoucron, A. Askenfelt, and R. E. Caussé, Observations on bow changes in violin performance, The Journal of the Acoustical Society of America, vol.123, issue.5, p.51, 2008.

A. Deweppe, M. Leman, and M. Lesaffre, Establishing usability for interactive music applications that use embodied mediation technology, ESCOM 2009 : 7th Triennial Conference of European Society for the Cognitive Sciences of Music, p.105, 2009.

C. Dobrian and D. Koppelman, The 'E' in NIME: musical expression with new computer interfaces, Proceedings of the 2006 conference on New interfaces for musical expression, vol.48, p.104, 2006.

P. Doornbusch, A brief survey of mapping in algorithmic composition, Proceedings of the International Computer Music Conference, vol.87, 2002.

P. Dourish, Where the action is: the foundations of embodied interaction, vol.14, p.135, 2004.

Z. Eitan and R. Y. Granot, Musical parameters and images of motion, Proceedings of the conference on interdisciplinary musicology (CIM04), p.27, 2004.

Z. Eitan and R. Y. Granot, How music moves: Musical parameters and listeners' images of motion, Music Perception: An Interdisciplinary Journal, vol.23, issue.3, pp.221-248, 2006.

L. Elblaus, K. Hansen, and R. Bresin, NIME design and contemporary music practice: Benefits and challenges, vol.48, p.105, 2014.

S. Emmerson, Computers and Live Electronic Music: Some Solutions, Many Problems, International Computer Music Conference Proceedings, 1991.

J. A. Fails and D. R. Olsen, Interactive machine learning, Proceedings of the 8th international conference on Intelligent user interfaces, vol.58, pp.39-45, 2003.

R. S. Feldman and B. Rimé, Fundamentals of nonverbal behavior, vol.71, p.97, 1991.

R. Fiebrink, D. Trueman, and P. R. Cook, A meta-instrument for interactive, on-the-fly machine learning, NIME, p.82, 2009.

R. Fiebrink, P. R. Cook, and D. Trueman, Human model evaluation in interactive supervised learning, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, p.58, 2011.

J. Françoise, Motion-Sound Mapping by Demonstration, UPMC, vol.57, p.67, 2015.

J. Françoise, B. Caramiaux, and F. Bevilacqua, A hierarchical approach for the design of gesture-to-sound mappings, 9th Sound and Music Computing Conference, p.67, p.82, 2012.

J. Françoise, N. Schnell, R. Borghesi, and F. Bevilacqua, Probabilistic models for designing motion and sound relationships, Proceedings of the 2014 International Conference on New Interfaces for Musical Expression, p.57, 2014.

J. Françoise, F. Bevilacqua, and T. Schiphorst, Gaussbox: Prototyping movement interaction with interactive visualizations of machine learning, Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, vol.58, p.59, 2016.

K. Franinović and S. Serafin, Sonic interaction design, 2013.

A. Freed, A. Chaudhary, and B. Davila, Operating systems latency measurement and analysis for sound synthesis and processing applications, Proceedings of the 1997 International Computer Music Conference, vol.116, p.119, 1997.

P. Fuchs and G. Moreau, Le traité de la réalité virtuelle, Presses des MINES, vol.2, p.84, 2006.

S. Furuya and H. Kinoshita, Organization of the upper limb movement for piano key-depression differs between expert pianists and novice players, Experimental Brain Research, vol.185, issue.4, p.24, 2008.

L. Gallo, A. P. Placitelli, and M. Ciampi, Controller-free exploration of medical image data: Experiencing the Kinect, 2011 24th International Symposium on Computer-Based Medical Systems (CBMS), p.53, 2011.

G. Garnett and C. Goudeseune, Performance factors in control of high-dimensional spaces, Proceedings of the 1999 International Computer Music Conference, p.81, 1999.

W. W. Gaver, Technology affordances, Proceedings of the SIGCHI conference on Human factors in computing systems, p.25, 1991.
URL : https://hal.archives-ouvertes.fr/hal-00692032

W. W. Gaver, What in the world do we hear?: An ecological approach to auditory event perception, Ecological psychology, vol.5, issue.1, p.25, 1993.

S. Gelineck and S. Serafin, Longitudinal evaluation of the integration of digital musical instruments into existing compositional work processes, Journal of new music research, vol.41, issue.3, pp.259-276, 2012.

R. W. Gibbs, The poetics of mind, p.19, 1994.

S. Gibet, J. Kamp, and F. Poirier, Gesture analysis: Invariant laws in movement, International Gesture Workshop, p.22, 2003.
DOI : 10.1007/978-3-540-24598-8_1

URL : https://hal.archives-ouvertes.fr/hal-01367902

J. J. Gibson, The ecological approach to visual perception: classic edition, vol.25, p.105, 2014.

J. J. Gibson and R. Shaw, The theory of affordances, Perceiving, acting, and knowing: Toward an ecological psychology, vol.84, p.135, 1977.

N. Gillian, B. Knapp, and S. O'Modhrain, Recognition of multivariate temporal musical gestures using n-dimensional dynamic time warping, NIME, p.56, 2011.

R. I. Godøy, Motor-mimetic music cognition, vol.36, pp.317-319, 2003.

R. I. Godøy, Gestural-sonorous objects: embodied extensions of Schaeffer's conceptual apparatus, Organised Sound, vol.11, issue.02, p.26, 2006.

R. I. Godøy, Gestural affordances of musical sound. Musical gestures: Sound, movement, and meaning, vol.25, p.26, 2010.

R. I. Godøy, A. R. Jensenius, and K. Nymoen, Chunking by coarticulation in music-related gestures, 8th International Gesture Workshop, pp.25-27, 2009.

R. I. Godøy, A. R. Jensenius, and K. Nymoen, Chunking in music by coarticulation, Acta Acustica united with Acustica, vol.96, issue.4, pp.690-700, p.88, 2010.

W. Goebl and C. Palmer, Tactile feedback and timing accuracy in piano performance, Experimental Brain Research, vol.186, issue.3, p.24, 2008.
DOI : 10.1007/s00221-007-1252-1

W. Goebl, R. Bresin, and I. Fujinaga, Perception of touch quality in piano tones, The Journal of the Acoustical Society of America, vol.136, issue.5, p.24, 2014.

C. Goudeseune, Interpolated mappings for musical instruments, Organised Sound, vol.7, issue.02, p.85, 2002.
DOI : 10.1017/s1355771802002029

URL : http://zx81.isl.uiuc.edu/camilleg/os02.pdf

R. Graham and B. Bridges, Managing musical complexity with embodied metaphors, Proceedings of the International Conference on New Interfaces for Musical Expression, vol.27, p.88, 2015.

M. Gurevich, JamSpace: a networked real-time collaborative music environment, CHI '06 extended abstracts on Human factors in computing systems, p.26, 2006.

M. Gurevich and A. C. Fyans, Digital musical interactions: Performer-system relationships and their perception by spectators, Organised Sound, vol.16, issue.02, pp.166-175, 2011.
DOI : 10.1017/s1355771811000112

L. J. Hadjileontiadis, Conceptual Blending in Biomusic Composition Space: The "Brainswarm" Paradigm, p.44, 2014.

E. Haga, Correspondences between music and body movement, p.22, 2008.

E. T. Hall, The silent language, 1952.

A. Heloir, N. Courty, S. Gibet, and F. Multon, Temporal alignment of communicative gesture sequences, Computer animation and virtual worlds, vol.17, issue.3-4, p.56, 2006.
URL : https://hal.archives-ouvertes.fr/hal-00493405

C. Henry, Éléments d'Une Théorie Générale de la Dynamogénie, Avec Applications Spéciales aux Sensations Visuelle Et Auditive, vol.17, p.19

O. Hilliges, Interactions in the Air: Adding Further Depth to Interactive Tabletops, p.41, 2009.

D. A. Hodges, Bodily responses to music. The Oxford handbook of music psychology, vol.6, p.13, 2009.

P. Hodgins, Relationships between score and choreography in twentieth-century dance: music, movement, and metaphor, p.22, 1992.

M. Hoffman and P. R. Cook, Feature-based synthesis: Mapping acoustic and perceptual features onto synthesis parameters, Proceedings of the 2006 International Computer Music Conference (ICMC), vol.33, p.80, 2006.

T. B. Holmes and T. Holmes, Electronic and experimental music: pioneers in technology and composition, 2002.

A. Hunt and M. M. Wanderley, Mapping performer parameters to synthesis engines, Organised sound, vol.7, issue.02, p.83, 2002.

A. Hunt, M. Wanderley, and R. Kirk, Towards a model for instrumental mapping in expert musical interaction, Proceedings of the 2000 International Computer Music Conference, p.83, 2000.
URL : https://hal.archives-ouvertes.fr/hal-01105532

A. Hunt, M. M. Wanderley, and M. Paradis, The importance of parameter mapping in electronic instrument design, Journal of New Music Research, vol.32, issue.4, pp.429-440, 2003.

D. A. Jaffe and J. O. Smith, Extensions of the Karplus-Strong plucked-string algorithm, Computer Music Journal, vol.7, issue.2, p.92, 1983.

A. R. Jensenius, Microinteraction in music / dance performance, Proceedings of the International Conference on New Interfaces for Musical Expression, vol.22, p.64, 2015.

A. R. Jensenius, M. M. Wanderley, R. I. Godøy, and M. Leman, Musical gestures: Concepts and methods in research, Musical gestures: Sound, movement, and meaning, vol.21, p.22, 2010.

A. R. Jensenius, M. M. Wanderley, R. I. Godøy, and M. Leman, Musical gestures, Musical gestures: Sound, movement, and meaning, vol.12, p.80, 2009.

M. Jiu, C. Wolf, G. Taylor, and A. Baskurt, Human body part estimation from depth images via spatially-constrained deep learning, Pattern Recognition Letters, vol.50, p.55, 2014.
URL : https://hal.archives-ouvertes.fr/hal-01269994

S. Jordà, Digital instruments and players: part I, efficiency and apprenticeship, Proceedings of the 2004 conference on New interfaces for musical expression, p.48, 2004.

S. Jordà, Digital lutherie: crafting musical computers for new musics' performance and improvisation, Department of Information and Communication Technologies, vol.104, 2005.

S. Jordà and S. Mealla, A methodological framework for teaching, evaluating and informing NIME design with a focus on expressiveness and mapping, NIME, vol.14, p.124, 2014.

S. Jordà, G. Geiger, M. Alonso, and M. Kaltenbrunner, The reacTable: Exploring the Synergy between Live Music Performance and Tabletop Tangible Interfaces, Conference on tangible and embedded interaction, p.86, 2007.

M. Kaltenbrunner, T. Bovermann, R. Bencina, and E. Costanza, TUIO: A protocol for tabletop tangible user interfaces, Proc. of the 6th Int'l Workshop on Gesture in Human-Computer Interaction and Simulation, p.41, 2005.

M. Karam, A taxonomy of gestures in human computer interactions, p.21, 2005.

M. Karjalainen and U. K. Laine, A model for real-time sound synthesis of guitar on a floating-point signal processor, Acoustics, Speech, and Signal Processing, p.91, 1991.

K. Karplus and A. Strong, Digital synthesis of plucked-string and drum timbres, Computer Music Journal, vol.7, issue.2, p.92, 1983.

M. Keller, D. Lefloch, M. Lambers, S. Izadi, T. Weyrich et al., Real-time 3d reconstruction in dynamic scenes using point-based fusion, 2013 International Conference on 3D Vision-3DV 2013, p.53, 2013.

A. Kendon, Current issues in the study of gesture. The biological foundations of gestures: Motor and semiotic aspects, vol.1, p.67, 1986.

A. Kendon, Gesture: Visible action as utterance, p.22, 2004.

C. Keskin, F. Kıraç, Y. E. Kara, and L. Akarun, Hand pose estimation and hand shape classification using multi-layered randomized decision forests, European Conference on Computer Vision, p.55, 2012.

C. Kiefer, Multiparametric interfaces for fine-grained control of digital music, vol.104, 2012.

H. Kinoshita, S. Furuya, T. Aoki, and E. Altenmüller, Loudness control in pianists as exemplified in keystroke force measurements on different touches, The Journal of the Acoustical Society of America, vol.121, issue.5, p.24, 2007.

A. Kläser, M. Marszałek, and C. Schmid, A spatio-temporal descriptor based on 3d-gradients, BMVC 2008-19th British Machine Vision Conference, p.52, 2008.
URL : https://hal.archives-ouvertes.fr/inria-00514853

H. Koepchen, R. Droh, R. Spintge, H. Abel, D. Kluessendorf et al., Physiological rhythmicity and music in medicine, MusicMedicine, vol.1, issue.13, pp.39-70, 1992.

V. Krefeld and M. Waisvisz, The hand in the web: An interview with Michel Waisvisz, Computer Music Journal, vol.14, issue.2, p.44, 1990.

T. Kulesza, M. Burnett, W. Wong, and S. Stumpf, Principles of explanatory debugging to personalize interactive machine learning, Proceedings of the 20th International Conference on Intelligent User Interfaces, p.58, 2015.

R. Laban, Modern educational dance, revised by L. Ullmann.

G. Lakoff and M. Johnson, Conceptual metaphor in everyday language, The journal of Philosophy, vol.77, issue.8, p.31, 1980.

R. Lamb and A. Robertson, Seaboard: a new piano keyboard-related interface combining discrete and continuous control, NIME, p.41, 2011.

M. Lamont, Toward a comparative sociology of valuation and evaluation, Annual Review of Sociology, vol.38, p.106, 2012.

I. Laptev, On space-time interest points, International Journal of Computer Vision, vol.64, issue.2-3, p.52, 2005.

B. K. Laurel, Toward the design of a computer-based interactive fantasy system, p.135, 1986.

J. C. Lee, In search of a natural gesture, ACM Crossroads, vol.16, issue.4, p.84, 2010.

A. Leguina, S. Arancibia-carvajal, and P. Widdop, Musical preferences and technologies: Contemporary material and symbolic distinctions criticized, Journal of Consumer Culture, issue.7, p.1469540515586870, 2015.

M. Leman, Embodied music cognition and mediation technology, vol.13, p.80, 2008.

M. Leman and R. I. Godøy, Why study musical gestures. Musical gestures: Sound, movement, and meaning, p.23, 2010.

S. Lepa, A. Hoklas, H. Egermann, and S. Weinzierl, Sound, materiality and embodiment challenges for the concept of 'musical expertise' in the age of digital mediatization, Convergence: The International Journal of Research into New Media Technologies, vol.7, p.1354856515579837, 2015.

F. Lerdahl and R. Jackendoff, An overview of hierarchical structure in music, Music Perception: An Interdisciplinary Journal, vol.1, issue.2, p.23, 1983.

N. Leroy, E. Fléty, and F. Bevilacqua, Reflective optical pickup for violin, Proceedings of the 2006 conference on New interfaces for musical expression, p.39, 2006.
URL : https://hal.archives-ouvertes.fr/hal-01161364

S. Letz, D. Fober, and Y. Orlarey, Les normes MIDI et MIDIFiles. Hermes, p.32, 2004.
URL : https://hal.archives-ouvertes.fr/hal-02158913

D. J. Levitin, This is your brain on music: Understanding a human obsession, 2011.

J. R. Lewis, IBM computer usability satisfaction questionnaires: psychometric evaluation and instructions for use, International Journal of Human-Computer Interaction, vol.7, issue.1, p.105, 1995.

L. A. Liikkanen and A. Salovaara, Music on youtube: user engagement with traditional, user-appropriated and derivative videos, Computers in Human Behavior, vol.50, issue.7, pp.108-124, 2015.

Y. Liu and Y. Jia, A robust hand tracking and gesture recognition method for wearable visual interfaces and its applications, Third International Conference on, p.53, 2004.

S. Lourenço, European piano schools: Russian, German and French classical piano interpretation and technique, Journal of Science and Technology of the Arts, vol.2, issue.1, p.24, 2010.

S. Lourenço, M. Clemente, D. Coimbra, Á. Barbosa, and J. Pinho, Do pianists play with their teeth?, International Symposium on Performance Science, p.24, 2009.

H. S. Lusted and R. B. Knapp, Controlling computers with neural signals, Scientific American, vol.275, issue.4, p.44, 1996.

T. Machover and J. Chung, Hyperinstruments: Musically intelligent and interactive performance and creativity systems, p.39, 1989.

T. Magnusson, Affordances and constraints in screen-based musical instruments, Proceedings of the 4th Nordic conference on Human-computer interaction: changing roles, p.26, 2006.

S. Manitsaris, Computer vision for the gesture recognition: gesture analysis and stochastic modelling in music interaction, p.52, 2010.
URL : https://hal.archives-ouvertes.fr/tel-00551004

S. Manitsaris, A. Tsagaris, K. Dimitropoulos, and A. Manitsaris, Finger musical gesture recognition in 3d space without any tangible instrument for performing arts, International Journal of Arts and Technology, vol.8, issue.1, p.56, 2015.
URL : https://hal.archives-ouvertes.fr/hal-01508619

A. Marquez-Borbon, M. Gurevich, A. C. Fyans, and P. Stapleton, Designing digital musical interactions in experimental contexts, vol.16, p.21, 2011.

T. Marrin and J. Paradiso, The digital baton: a versatile performance instrument, Proceedings of the International Computer Music Conference, p.44, 1997.

K. McMillen, M. Wright, D. Simon, and D. Wessel, ZIPI: an inexpensive, deterministic, moderate-speed computer network for music and other media, Audio Engineering Society Conference: 13th International Conference, p.86, 1994.

D. McNeill, Hand and mind: What gestures reveal about thought, vol.18, p.24, 1992.

D. McNeill, Gesture and thought, vol.19, p.21, 2008.

A. McPherson, TouchKeys: Capacitive multi-touch sensing on a physical keyboard, NIME, p.40, 2012.

A. McPherson, Buttons, handles, and keys: advances in continuous-control keyboard instruments, Computer Music Journal, p.24, 2015.

M. Merleau-Ponty, Phénoménologie de la perception, éditions Gallimard, vol.12, 2013.

C. D. Metcalf, T. A. Irvine, J. L. Sims, Y. L. Wang, A. W. Su et al., Complex hand dexterity: a review of biomechanical methods for measuring musical performance, Frontiers in psychology, vol.5, p.24, 2014.

S. Mukherjee, S. K. Biswas, and D. P. Mukherjee, Recognizing human action at a distance in video by key poses, IEEE Transactions on Circuits and Systems for Video Technology, vol.21, p.54, 2011.
DOI : 10.1109/tcsvt.2011.2135290

A. Mulder, Virtual musical instruments: Accessing the sound synthesis universe as a performer, Proceedings of the First Brazilian Symposium on Computer Music, p.47, 1994.

G. Mumma, Alvin Lucier's 'Music for Solo Performer', Music of the Avant-Garde, vol.2, p.43, 1968.

R. Nair, K. Ruhl, F. Lenzen, S. Meister, H. Schäfer et al., A survey on time-of-flight stereo fusion, Time-of-Flight and Depth Imaging. Sensors, Algorithms, and Applications, p.53, 2013.

J. Nakamura and M. Csikszentmihalyi, Flow theory and research. Handbook of positive psychology, vol.83, p.84, 2009.

T. M. Nakra, Inside the Conductor's Jacket: Analysis, interpretation and musical synthesis of expressive gesture, p.45, 1999.

J. Nielsen, Usability inspection methods, Conference companion on Human factors in computing systems, p.105, 1994.

D. A. Norman, The design of everyday things: Revised and expanded edition. Basic books, vol.134, p.135, 2013.

G. Odowichuk, S. Trail, P. Driessen, W. Nie, and W. Page, Sensor fusion: Towards a fully expressive 3d music control interface, Communications, Computers and Signal Processing, p.53, 2011.
DOI : 10.1109/pacrim.2011.6033003

I. Oikonomidis, N. Kyriazis, and A. A. Argyros, Efficient model-based 3d tracking of hand articulations using Kinect, BMVC, vol.1, p.55, 2011.
DOI : 10.5244/c.25.101

URL : http://www.bmva.org/bmvc/2011/proceedings/paper101/paper101.pdf

S. O'Modhrain, A framework for the evaluation of digital musical instruments, Computer Music Journal, vol.35, issue.1, pp.28-42, 2011.

O. Ortmann, The physical basis of piano touch and tone, 1925.

O. Ortmann, The physiological mechanics of piano technique: An experimental study of the nature of muscular action as used in piano playing and of the effects thereof upon the piano key and the piano tone, p.24, 1929.

H. J. Ottenheimer, The anthropology of language: an introduction to linguistic anthropology. Cengage Learning, p.19, 2008.

C. Palmer, Music performance. Annual review of psychology, vol.48, pp.115-138, 1997.

C. Palmer and P. Pfordresher, From my hand to your ear: the faces of meter in performance and perception, Proceedings of the 6th International Conference on Music Perception and Cognition, p.51, 2000.

S. Papert, Mindstorms: Children, computers, and powerful ideas, p.20, 1980.

J. Paradiso, K. Hsiao, and E. Hu, Interactive music for instrumented dancing shoes, Proc. of the 1999 International Computer Music Conference, p.45, 1999.

J. A. Paradiso, Electronic music: new ways to play, IEEE spectrum, vol.34, issue.12, p.33, 1997.

D. Parlitz, T. Peschel, and E. Altenmüller, Assessment of dynamic finger forces in pianists: effects of training and expertise, Journal of biomechanics, vol.31, issue.11, p.24, 1998.

V. I. Pavlovic, R. Sharma, and T. S. Huang, Visual interpretation of hand gestures for human-computer interaction: A review, IEEE Transactions on Pattern Analysis and Machine Intelligence, vol.19, issue.7, p.54, 1997.

R. Poppe, A survey on vision-based human action recognition, Image and vision computing, vol.28, issue.6, p.54, 2010.

E. R. Post, M. Orth, P. Russo, and N. Gershenfeld, E-broidery: Design and fabrication of textile-based computing, IBM Systems journal, vol.39, issue.3.4, p.45, 2000.

Y. Qiao, F. Mok, and G. Zhou, Methods and apparatus for gesture recognition based on templates, US Patent, vol.6, p.54, 2000.

F. Quek, D. Mcneill, R. Bryll, S. Duncan, X. Ma et al., Multimodal human discourse: gesture and speech, ACM Transactions on Computer-Human Interaction (TOCHI), vol.9, issue.3, p.21, 2002.

F. K. Quek, Toward a vision-based hand gesture interface, Virtual Reality Software and Technology Conference, vol.94, p.21, 1994.

M. Ramanathan, W. Yau, and E. K. Teoh, Human action recognition with video data: research and evaluation challenges, IEEE Transactions on Human-Machine Systems, vol.44, issue.5, p.52, 2014.

J. Rameau, Traité de l'harmonie réduite à ses principes naturels, 1722.

E. Rank, A player model for MIDI control of synthetic bowed strings, p.86, 1999.

N. Rasamimanana, Towards a conceptual framework for exploring and modelling expressive musical gestures, Journal of New Music Research, vol.41, issue.1, p.86, 2012.

N. Rasamimanana and F. Bevilacqua, Effort-based analysis of bowing movements: evidence of anticipation effects, Journal of New Music Research, vol.37, issue.4, p.67, 2008.
URL : https://hal.archives-ouvertes.fr/hal-01161224

N. Rasamimanana, F. Kaiser, F. Bevilacqua et al., Perspectives on gesture-sound relationships informed from acoustic instrument studies, Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction, vol.14, p.45, 2009.
URL : https://hal.archives-ouvertes.fr/hal-01161036

N. H. Rasamimanana, E. Fléty, and F. Bevilacqua, Gesture analysis of violin bow strokes, International Gesture Workshop, p.39, 2005.
URL : https://hal.archives-ouvertes.fr/hal-01106957

S. S. Rautaray and A. Agrawal, Vision based hand gesture recognition for human computer interaction: a survey, Artificial Intelligence Review, vol.43, issue.1, pp.21, 54, 2015.

J. Rink, The practice of performance: Studies in musical interpretation, p.24, 2005.

M. Ritter and A. Aska, Leap Motion As Expressive Gestural Interface, p.43, 2014.

D. Rocchesso and F. Fontana, The sounding object. Mondo estremo, vol.25, p.30, 2003.

J. B. Rovan, M. M. Wanderley, S. Dubnov, and P. Depalle, Instrumental gestural mapping strategies as expressivity determinants in computer music performance, Proceedings of the AIMI International Workshop, vol.81, p.83, 1997.
URL : https://hal.archives-ouvertes.fr/hal-01105514

L. Russolo, The art of noises: Futurist manifesto. Audio culture: Readings in modern music, p.33, 2004.

J. Ryan, Some remarks on musical instrument design at STEIM, Contemporary Music Review, vol.6, p.81, 1991.

T. D. Sanger, Bayesian filtering of myoelectric signals, Journal of neurophysiology, vol.97, issue.2, p.89, 2007.

J. C. Schacher, Investigating gestural electronic music, 2014.

P. Schaeffer, Traité des objets musicaux, Le Seuil, vol.24, p.88, 1966.

R. M. Schafer, The soundscape: Our sonic environment and the tuning of the world

J. Schell, The Art of Game Design: A book of lenses, p.47, 2014.

A. Schutz, The phenomenology of the social world, p.28, 1967.

D. Schwarz, Concatenative sound synthesis: The early years, Journal of New Music Research, vol.35, issue.1, p.96, 2006.
URL : https://hal.archives-ouvertes.fr/hal-01161361

D. Schwarz, R. Cahen, and S. Britton, Principles and applications of interactive corpusbased concatenative synthesis, Journées d'Informatique Musicale (JIM), p.96, 2008.
URL : https://hal.archives-ouvertes.fr/hal-01161401

L. A. Schwarz, A. Mkhitaryan, D. Mateus, and N. Navab, Human skeleton tracking from depth data using geodesic distances and optical flow, Image and Vision Computing, vol.30, issue.3, p.55, 2012.
URL : https://hal.archives-ouvertes.fr/hal-01692292

S. Sentürk, S. W. Lee, A. Sastry, A. Daruwalla, and G. Weinberg, Crossole: A gestural interface for composition, improvisation and performance using Kinect, NIME, p.53, 2012.

W. A. Sethares, A. J. Milne, S. Tiedje, A. Prechtl, and J. Plamondon, Spectral tools for dynamic tonality and audio morphing, Computer Music Journal, vol.33, issue.2, p.98, 2009.

J. Shotton, T. Sharp, A. Kipman, A. Fitzgibbon, M. Finocchio et al., Real-time human pose recognition in parts from single depth images, Communications of the ACM, vol.56, issue.1, p.55, 2013.

V. Teichrieb and G. L. Ramalho, A preliminary evaluation of the leap motion sensor as controller of new digital musical instruments, 2013.

C. Small, Musicking: The meanings of performing and listening, vol.6, 2011.

D. Smalley, Spectro-morphology and structuring processes, The language of electroacoustic music, p.29, 1986.

D. W. Smith, Phenomenology. Encyclopedia of Cognitive Science, p.12, 2008.

J. O. Smith, Making virtual electric guitars and associated effects using Faust, p.90, 2007.

J. Snyder, Snyderphonics Manta controller, a novel USB touch-controller, NIME, p.40, 2011.

J. Snyder and A. McPherson, The JD-1: an implementation of a hybrid keyboard/sequencer controller for analog synthesizers, NIME, vol.40, p.48, 2012.

L. Sonami, Lady's glove. Ars Electronica, p.44, 1991.

V. Stiefel, D. Trueman, and P. Cook, Re-coupling: the ublotar synthesis instrument and the showl speaker-feedback controller, Proceedings of the International Computer Music Conference, p.92, 2004.

D. Stowell, M. D. Plumbley, and N. Bryan-Kinns, Discourse analysis evaluation method for expressive musical interfaces, NIME, p.105, 2008.

C. Stuart, The object of performance: Aural performativity in contemporary laptop music, Contemporary Music Review, vol.22, issue.4, p.39, 2003.

C. R. Sullivan, Extending the Karplus-Strong algorithm to synthesize electric guitar timbres with distortion and feedback, Computer Music Journal, vol.14, issue.3, p.91, 1990.

E. Sweetser, From etymology to pragmatics: The mind-body metaphor in semantic structure and semantic change, Cambridge: CUP, p.19, 1990.

T. Takeda, Y. Hirata, and K. Kosuge, Dance step estimation method based on HMM for dance partner robot, IEEE Transactions on Industrial Electronics, vol.54, issue.2, p.57, 2007.

J. Talbot, B. Lee, A. Kapoor, and D. S. Tan, EnsembleMatrix: interactive visualization to support machine learning with multiple classifiers, Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, p.58, 2009.

A. Tanaka, Musical performance practice on sensor-based instruments, Trends in Gestural Control of Music, vol.13, p.88, 2000.

A. Tanaka, Mapping out instruments, affordances, and mobiles. NIME, p.26, 2010.

F. Tecchio, C. Salustri, M. H. Thaut, P. Pasqualetti, and P. Rossini, Conscious and preconscious adaptation to rhythmic auditory stimuli: a magnetoencephalographic study of human brain responses, Experimental brain research, vol.135, issue.2, pp.222-230, 2000.

G. A. ten Holt, M. J. Reinders, and E. Hendriks, Multi-dimensional dynamic time warping for gesture recognition, Thirteenth annual conference of the Advanced School for Computing and Imaging, vol.300, p.56, 2007.

M. R. Thompson and G. Luck, Exploring relationships between pianists' body movements, their expressive intentions, and structural elements of the music, Musicae Scientiae, vol.16, issue.1, p.24, 2012.

L. Thoresen and A. Hedman, Spectromorphological analysis of sound objects: an adaptation of Pierre Schaeffer's typomorphology, Organised Sound, vol.12, issue.02, p.29, 2007.

T. Todoroff, Wireless digital/analog sensors for music and dance performances, NIME, p.50, 2011.

K. Tokuda, H. Zen, J. Yamagishi, T. Masuko, S. Sako et al., The HMM-based speech synthesis system (HTS), p.57, 2008.

D. Tormoen, F. Thalmann, and G. Mazzola, The Composing Hand: Musical Creation with Leap Motion and the BigBang Rubette, Proceedings of the International Conference on New Interfaces for Musical Expression, pp.207-212, 2014.

B. Truax, Acoustic communication, vol.1, 2001.

D. Trueman and R. L. DuBois, PeRColate, p.91, 2002.

D. Van Nort, Modular and Adaptive Control of Sonic Processes, vol.80, p.81, 2010.

D. Van Nort, M. M. Wanderley, and P. Depalle, On the choice of mappings based on geometric properties, Proceedings of the 2004 conference on New interfaces for musical expression, p.82, 2004.

D. Van Nort, M. M. Wanderley, and P. Depalle, Mapping control structures for sound synthesis: Functional and topological perspectives, Computer Music Journal, vol.38, issue.3, p.85, 2014.

L. Vera, J. Gimeno, I. Coma, and M. Fernández, Augmented mirror: interactive augmented reality system based on Kinect, IFIP Conference on Human-Computer Interaction, p.53, 2011.
DOI : 10.1007/978-3-642-23768-3_63
URL : https://hal.archives-ouvertes.fr/hal-01597008

C. Villamor, D. Willis, and L. Wroblewski, Touch gesture reference guide, p.21, 2010.

C. Volioti, S. Manitsaris, E. Katsouli, and A. Manitsaris, x2gesture: how machines could learn expressive gesture variations of expert musicians, Proceedings of the 16th International Conference on New Interfaces for Musical Expression, 2016.
URL : https://hal.archives-ouvertes.fr/hal-01509656

G. Volpe, Expressive gesture in performing arts and new media: The present and the future, Journal of New Music Research, vol.34, issue.1, p.80, 2005.

H. von Helmholtz, On the Sensations of Tone as a Physiological Basis for the Theory of Music, p.12, 1912.

M. Waisvisz, Gestural round table. STEIM Writings, p.44, 1999.

M. M. Wanderley, Mapping strategies in real-time computer music, Organised Sound, vol.7, issue.2, p.104, 2002.

M. M. Wanderley and P. Depalle, Gestural control of sound synthesis, Proceedings of the IEEE, vol.92, issue.4, p.86, 2004.

M. Wertheimer, Laws of organization in perceptual forms. A source book of Gestalt psychology, p.11, 1938.

D. Wessel and M. Wright, Problems and prospects for intimate musical control of computers, Computer music journal, vol.26, issue.3, p.116, 2002.

M. Wilson and G. Knoblich, The case for motor involvement in perceiving conspecifics, Psychological bulletin, vol.131, issue.3, p.460, 2005.

S. Winges and S. Furuya, Distinct digit kinematics by professional and amateur pianists, Neuroscience, vol.284, p.24, 2015.
DOI : 10.1016/j.neuroscience.2014.10.041
URL : http://europepmc.org/articles/pmc4268118?pdf=render

A. Wright, The polyphonic touch: coarticulation and polyphonic expression in the performance of piano and organ music, vol.23, p.26, 2016.

M. Wright and A. Freed, Open sound control: A new protocol for communicating with sound synthesizers, Proceedings of the 1997 International Computer Music Conference, vol.2013, p.38, 1997.

X. Xiao and H. Ishii, Mirrorfugue: communicating hand gesture in remote piano collaboration, Proceedings of the fifth international conference on Tangible, embedded, and embodied interaction, p.20, 2011.

X. Xiao, P. Puentes, E. Ackermann, and H. Ishii, Andantino: Teaching children piano with projected animated characters, Proceedings of the The 15th International Conference on Interaction Design and Children, p.20, 2016.

B. Zamborlin, F. Bevilacqua, M. Gillies et al., Fluid gesture interaction design: Applications of continuous recognition for the design of modern gestural interfaces, ACM Transactions on Interactive Intelligent Systems (TiiS), vol.3, issue.4, p.84, 2014.
URL : https://hal.archives-ouvertes.fr/hal-01572622

J. Zhao, The communicative functions of gestures in l2 speech, Arizona Working Papers in SLAT, vol.13, p.19, 2006.

L. Zhao and N. I. Badler, Synthesis and acquisition of Laban movement analysis qualitative parameters for communicative gestures, Citeseer, vol.18, p.21, 2001.

T. G. Zimmerman, J. Lanier, C. Blanchard, S. Bryson, and Y. Harvill, A hand gesture interface device, ACM SIGCHI Bulletin, vol.18, p.51, 1987.