Expressive gesture model

Abstract: This thesis presents a computational model for generating expressive communicative gestures that accompany speech in a humanoid agent. Our work is grounded in studies of human communicative gestures. Three research issues are addressed. First, human gestures are encoded and reproduced in such a way that they can be realized by different humanoid agents; a set of gesture properties such as hand shape, wrist position, and movement trajectory is used to encode them. Second, gestures are planned so as to be synchronized with speech; the model relies on the relationship between gesture and speech to schedule gesture timing. Third, gestures are rendered expressively: depending on the agent's personality or current emotional state, its gestures are varied by modulating a set of gesture expressivity dimensions. The model was designed so that its processes are as independent as possible of an agent's embodiment. So far, it has been used to control the gestures of the Greta virtual agent and the Nao physical humanoid robot.
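To make the abstract's idea concrete, here is a minimal sketch of what "encoding a gesture as a set of properties and modulating it along expressivity dimensions" could look like. All names (`GesturePhase`, `apply_expressivity`, the spatial/temporal parameters) are illustrative assumptions, not the representation actually used in the thesis.

```python
from dataclasses import dataclass

# Hypothetical sketch: a gesture as a sequence of phases, each described by
# symbolic properties (wrist position, hand shape, duration), which an
# expressivity layer can rescale without changing the gesture's identity.

@dataclass
class GesturePhase:
    wrist_position: tuple   # symbolic (x, y, z) target in gesture space
    hand_shape: str         # e.g. "open", "fist", "point"
    duration: float         # seconds, before expressivity scaling

@dataclass
class Gesture:
    name: str
    phases: list

def apply_expressivity(gesture, spatial_extent=1.0, temporal_extent=1.0):
    """Return a new gesture with wrist amplitudes scaled by spatial_extent
    and phase durations divided by temporal_extent (larger = faster)."""
    new_phases = []
    for p in gesture.phases:
        scaled_pos = tuple(c * spatial_extent for c in p.wrist_position)
        new_phases.append(GesturePhase(scaled_pos, p.hand_shape,
                                       p.duration / temporal_extent))
    return Gesture(gesture.name, new_phases)

# A neutral beat gesture, then an "energetic" variant: wider and faster.
beat = Gesture("beat", [GesturePhase((0.2, 0.5, 0.1), "open", 0.4)])
energetic = apply_expressivity(beat, spatial_extent=1.5, temporal_extent=2.0)
```

Because the symbolic description is separated from the expressivity layer, the same gesture specification could in principle be mapped onto embodiments with different reach and speed limits, which is the embodiment-independence the abstract emphasizes.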
Keywords: Greta, Nao
Document type: Thesis

Cited literature [4 references]

https://pastel.archives-ouvertes.fr/tel-01181000
Contributor: Abes Star
Submitted on: Tuesday, July 28, 2015 - 4:57:06 PM
Last modification on: Thursday, October 17, 2019 - 12:36:09 PM
Long-term archiving on: Thursday, October 29, 2015 - 11:04:42 AM

File

theseLeQuocAnhV2.pdf
Version validated by the jury (STAR)

Identifiers

  • HAL Id: tel-01181000, version 1

Citation

Quôc Anh Lê. Expressive gesture model. Robotics [cs.RO]. Télécom ParisTech, 2013. English. ⟨NNT : 2013ENST0036⟩. ⟨tel-01181000⟩

Metrics

Record views: 1094
File downloads: 200