On Designing Expressive Robot Behavior: The Effect of Affective Cues on Interaction

Amir Aly, Adriana Tapus

Research output: Contribution to journal › Article › peer-review

Abstract

Creating a convincing affective robot behavior is a challenging task. In this paper, we coordinate different communication modalities (speech, facial expressions, and gestures) so that the robot can interact with human users in an expressive manner. The proposed system uses videos to induce target emotions in participants, initiating an interactive discussion between each participant and the robot about the content of each video. During each interaction experiment, the expressive ALICE robot generates a multimodal behavior adapted to the affective content of the video, and the participant evaluates its characteristics at the end of the experiment. This study discusses the multimodality of the robot behavior and its positive effect on the clarity of the emotional content of the interaction. Moreover, it provides personality- and gender-based evaluations of the emotional expressivity of the generated behavior, in order to investigate how it was perceived by introverted versus extroverted and male versus female participants within a human–robot interaction context.

Original language: English
Article number: 314
Journal: SN Computer Science
Volume: 1
Issue number: 6
DOIs
Publication status: Published - 1 Nov 2020

Keywords

  • Embodiment of affective robot behavior
  • Facial expressions modelling
  • Gesture synthesis
  • Human perception of the robot behavior
  • Speech synthesis
