Generating co-speech gestures for the humanoid robot NAO through BML
Abstract
We are developing an expressive gesture model, built on the GRETA platform, to generate gestures that accompany speech across different embodiments. This paper presents our ongoing work on implementing this model on the humanoid robot NAO. Starting from a specification of multimodal behaviors encoded in the Behavior Markup Language (BML), the system synchronizes and realizes the verbal and nonverbal behaviors on the robot.
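As a minimal illustrative sketch of the kind of input such a system consumes, the fragment below follows BML 1.0 conventions; the utterance text, gesture lexeme, and sync-point names are hypothetical and are not taken from the paper itself.

  <bml xmlns="http://www.bml-initiative.org/bml/bml-1.0" id="bml1">
    <!-- Verbal behavior: the sync point "tm1" marks a time inside the utterance -->
    <speech id="speech1" start="0">
      <text>Hello, I am <sync id="tm1"/> NAO.</text>
    </speech>
    <!-- Nonverbal behavior: a beat gesture whose stroke is aligned with that sync point -->
    <gesture id="gesture1" lexeme="BEAT" stroke="speech1:tm1"/>
  </bml>

In a fragment like this, cross-modal synchronization is expressed by letting one behavior's sync attribute (here, the gesture's stroke) reference a sync point defined inside another behavior (here, inside the speech).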