Hands-Free Speech Interaction With Artificial Companions - Archive ouverte HAL
Conference Papers Year: 2011

Hands-Free Speech Interaction With Artificial Companions

Abstract

The automatic recognition of speech for human-machine interaction is without doubt one of the most interesting interfaces to develop. This work provides an expanded view of what has been experimented, the current challenges, and the planned research towards the “perfect” dialogue environment of an Ambient Assisted Living structure. The FP7 CompanionAble project offers the context and support for this research. Both a Smart Home and a Robot Companion environment are involved in the automatic speech recognition behaviours. The target speakers are mainly elderly persons living alone, considered care recipients. Maximum a Posteriori (MAP) adaptation to the speaker and channel increases word recognition with the 1-best hypothesis up to 87.34% for elderly speakers. Maximum Likelihood Linear Regression (MLLR) has reached 94.44% semantically correct sentences for an adult speaker. Moreover, the integration and synchronisation with external sound modules are discussed to clarify the main aspects of the interactive architecture.
No file

Dates and versions

hal-00615464 , version 1 (19-08-2011)

Identifiers

  • HAL Id : hal-00615464 , version 1

Cite

Daniel R.S. Caon, Jérôme Boudy, Gérard Chollet. Hands-Free Speech Interaction With Artificial Companions. 5th Companion Robotics Institute Workshop, 2011, Brussels, Belgium. ⟨hal-00615464⟩