Prosody-based adaptive metaphoric head and arm gestures synthesis in human robot interaction

Research output: Contribution to conference › Paper › peer-review

Abstract

In human-human interaction, communication can be established through three modalities: verbal, non-verbal (i.e., gestures), and/or para-verbal (i.e., prosody). The linguistic literature shows that para-verbal and non-verbal cues are naturally aligned and synchronized; however, the natural mechanism of this synchronization is still unexplored. The difficulty in coordinating prosody with metaphoric head-arm gestures concerns the conveyed meaning, the way gestures are performed with respect to prosodic characteristics, their relative temporal arrangement, and their coordinated organization in the phrasal structure of the utterance. In this research, we focus on the mechanism of mapping between head-arm gestures and speech prosodic characteristics in order to generate robot behavior that adapts to the interacting human's emotional state. Prosody patterns and the motion curves of head-arm gestures are aligned separately into parallel Hidden Markov Models (HMMs). The mapping between speech and head-arm gestures is based on Coupled Hidden Markov Models (CHMMs), which can be seen as a multi-stream collection of HMMs characterizing the segmented prosody and head-arm gesture data. An emotional-state-based audio-video database has been created to validate this study. The obtained results show the effectiveness of the proposed methodology.
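The coupling the abstract describes can be sketched as a minimal two-chain CHMM in which each chain's next state depends on the previous states of both chains. The sketch below is illustrative only: the state counts, transition tables, and emission tables are toy assumptions, not values from the paper, and the streams are reduced to discrete symbols.

```python
import numpy as np

n = 2  # hidden states per chain (e.g., a coarse "low"/"high" activity level)

# Coupled transitions: each chain's next state conditions on BOTH chains'
# previous states. Shape: (prev_prosody_state, prev_gesture_state, next_state).
A_p = np.array([[[0.8, 0.2], [0.6, 0.4]],
                [[0.3, 0.7], [0.1, 0.9]]])  # prosody chain
A_g = np.array([[[0.7, 0.3], [0.4, 0.6]],
                [[0.5, 0.5], [0.2, 0.8]]])  # gesture chain

# Discrete emissions per chain: rows are states, columns observation symbols.
B_p = np.array([[0.9, 0.1], [0.2, 0.8]])
B_g = np.array([[0.8, 0.2], [0.3, 0.7]])

pi = np.full((n, n), 1.0 / (n * n))  # uniform initial joint-state distribution

def chmm_loglik(obs_p, obs_g):
    """Forward algorithm over the joint state (prosody, gesture).

    Returns the log-likelihood of the paired observation sequences.
    """
    # Initialise with the joint prior weighted by both emissions.
    alpha = pi * np.outer(B_p[:, obs_p[0]], B_g[:, obs_g[0]])
    for op, og in zip(obs_p[1:], obs_g[1:]):
        new = np.zeros((n, n))
        for i in range(n):      # previous prosody state
            for j in range(n):  # previous gesture state
                # Factored coupled transition to every joint successor state.
                new += alpha[i, j] * np.outer(A_p[i, j], A_g[i, j])
        alpha = new * np.outer(B_p[:, op], B_g[:, og])
    return np.log(alpha.sum())

# Likelihood of a short pair of aligned prosody/gesture symbol sequences.
ll = chmm_loglik([0, 0, 1, 1], [0, 1, 1, 1])
```

In the paper's setting the observations would be continuous prosody patterns and motion curves rather than symbols, and the parameters would be learned from the segmented audio-video data; the sketch only shows how the factored coupled transition ties the two streams together.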

Original language: English
Publication status: Published - 1 Jan 2013
Event: 2013 16th International Conference on Advanced Robotics, ICAR 2013 - Montevideo, Uruguay
Duration: 25 Nov 2013 - 29 Nov 2013

Conference

Conference: 2013 16th International Conference on Advanced Robotics, ICAR 2013
Country/Territory: Uruguay
City: Montevideo
Period: 25/11/13 - 29/11/13
