
Expressive gesture model

Abstract: This thesis presents a computational model that generates expressive communicative gestures accompanying speech for a humanoid agent. The work is grounded in studies of human communicative gestures and addresses three research issues. First, human gestures are encoded and reproduced in such a way that they are realizable by different humanoid agents; a set of gesture properties, such as hand shape, wrist position, and movement trajectory, is used to encode them. Second, gestures are planned so as to be synchronized with speech; the model relies on the relationship between gesture and speech to schedule gesture timing. Third, gestures are rendered expressively: depending on the agent's personality or current emotional state, its gestures are varied by modulating a set of gesture expressivity dimensions. The model was designed so that its processes are as independent as possible of an agent's embodiment. So far, it has been used to control the gestures of the Greta virtual agent and the Nao physical humanoid robot.
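To make the abstract's mechanism concrete, here is a minimal sketch of the kind of symbolic gesture encoding and expressivity modulation it describes. All names and the specific dimension set are illustrative assumptions, not the thesis's actual API; the point is only that gestures are stored as embodiment-independent symbolic properties and then varied by continuous expressivity parameters.

```python
from dataclasses import dataclass

# Hypothetical illustration (names are not from the thesis): a gesture is
# encoded symbolically, independent of any particular agent's skeleton,
# and expressivity parameters modulate how it is reproduced.

@dataclass
class KeyPose:
    wrist_position: str    # symbolic space sector, e.g. "upper-right"
    hand_shape: str        # e.g. "open", "fist", "index-extended"
    palm_orientation: str  # e.g. "up", "inward"

@dataclass
class Gesture:
    name: str
    phases: list      # ordered KeyPose sequence (preparation, stroke, retraction)
    duration: float   # seconds; the stroke is later aligned with speech timing

@dataclass
class Expressivity:
    # Each dimension in [-1, 1]; 0 is the neutral reproduction.
    spatial_extent: float = 0.0   # amplitude of the movement
    temporal_extent: float = 0.0  # speed of execution
    power: float = 0.0            # acceleration / tension of the stroke
    fluidity: float = 0.0         # smoothness between phases

def scaled_duration(g: Gesture, e: Expressivity) -> float:
    """Shorten or lengthen a gesture according to its temporal extent."""
    # Higher temporal extent -> faster execution -> shorter duration.
    return g.duration * (1.0 - 0.5 * e.temporal_extent)
```

A planner for a given embodiment (virtual Greta or physical Nao) would map the symbolic key poses onto its own joints, which is what keeps the encoding reusable across agents.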
Keywords: Greta, Nao
Document type: Thesis

Cited literature: [4 references]
Contributor: ABES STAR
Submitted on: Tuesday, July 28, 2015 - 4:57:06 PM
Last modification on: Friday, July 31, 2020 - 10:44:08 AM
Long-term archiving on: Thursday, October 29, 2015 - 11:04:42 AM


Version validated by the jury (STAR)


  • HAL Id: tel-01181000, version 1



Quôc Anh Lê. Expressive gesture model. Robotics [cs.RO]. Télécom ParisTech, 2013. English. ⟨NNT : 2013ENST0036⟩. ⟨tel-01181000⟩


