Conference Paper, Year: 2011

Hands-Free Speech Interaction With Artificial Companions

Abstract

The automatic recognition of speech for human-machine interaction is without doubt one of the most interesting interfaces to develop. This work aims to provide a broad view of what has been experimented with, the current challenges, and the planned research towards the “perfect” dialogue environment of an Ambient Assisted Living structure. The FP7 CompanionAble project offers the context and support for this research. Both a Smart Home and a Robot Companion environment are involved in the automatic speech recognition behaviors. The target speakers are mainly elderly persons living alone, considered as care recipients. Maximum a Posteriori (MAP) adaptation to the speaker and channel increases 1-best word recognition to 87.34% for elderly speakers. Maximum Likelihood Linear Regression (MLLR) reaches 94.44% semantically correct sentences for an adult speaker. Moreover, the integration and synchronisation with external sound modules are discussed to clarify the main aspects of the interactive architecture.
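The MLLR figure above refers to standard speaker adaptation of the acoustic model's Gaussian means. As a purely illustrative sketch (not code from the paper; the variable names, toy statistics, and the restriction to a single global regression class with diagonal covariances are all assumptions), the following NumPy snippet shows the closed-form, row-wise estimation of an MLLR mean transform from accumulated occupation statistics:

import numpy as np

def estimate_mllr_transform(means, variances, gammas, gamma_obs):
    """Estimate a single global MLLR mean transform W of shape (d, d+1).

    means      : (M, d) Gaussian means of the speaker-independent model
    variances  : (M, d) diagonal covariances
    gammas     : (M,)   occupation counts  sum_t gamma_m(t)
    gamma_obs  : (M, d) weighted observation sums  sum_t gamma_m(t) * o_t
    The adapted mean of Gaussian m is  W @ [1, mu_m].
    """
    M, d = means.shape
    xi = np.hstack([np.ones((M, 1)), means])   # extended means, (M, d+1)
    W = np.zeros((d, d + 1))
    for i in range(d):                         # each row has its own normal equations
        w_m = gammas / variances[:, i]         # per-Gaussian weights for dimension i
        G_i = (xi * w_m[:, None]).T @ xi       # (d+1, d+1) accumulator
        k_i = (gamma_obs[:, i] / variances[:, i]) @ xi
        W[i] = np.linalg.solve(G_i, k_i)
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M, d = 8, 3
    means = rng.normal(size=(M, d))
    variances = np.ones((M, d))
    gammas = np.full(M, 10.0)
    # Fake adaptation statistics: observations shifted by +0.5 in every dimension.
    gamma_obs = gammas[:, None] * (means + 0.5)
    W = estimate_mllr_transform(means, variances, gammas, gamma_obs)
    adapted = np.hstack([np.ones((M, 1)), means]) @ W.T
    print(adapted - means)   # recovers the constant +0.5 shift as the bias term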
No file deposited

Dates and versions

hal-00615464, version 1 (19-08-2011)

Identifiers

  • HAL Id: hal-00615464, version 1

Cite

Daniel R.S. Caon, Jérôme Boudy, Gérard Chollet. Hands-Free Speech Interaction With Artificial Companions. 5th Companion Robotics Institute Workshop, 2011, Brussels, Belgium. ⟨hal-00615464⟩