Animating a conversational agent with user expressivity

Research output: Chapter in Book/Report/Conference proceeding, Conference contribution, peer-review

Abstract

Our objective is to animate an embodied conversational agent (ECA) with communicative gestures rendered with the expressivity of the real human user it represents. We describe an approach to estimate a subset of the expressivity parameters defined in the literature (namely spatial and temporal extent) from captured motion trajectories. We first validate this estimation against synthesized motion and then show results with real human motion. The estimated expressivity is then sent to the animation engine of an ECA, which becomes a personalized autonomous representative of that user.
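The abstract does not detail how spatial and temporal extent are computed from captured trajectories, but a minimal sketch of the general idea is possible. The heuristics below are assumptions for illustration only (they are not the paper's method): spatial extent is approximated by the diagonal of the bounding box swept by a wrist trajectory, and temporal extent by the peak frame-to-frame speed.

```python
import numpy as np

def estimate_expressivity(trajectory, fps=30.0):
    """Estimate (spatial_extent, temporal_extent) from a 3-D point
    trajectory of shape (n_frames, 3).

    Hypothetical heuristics, not the authors' actual estimator:
    - spatial extent  = diagonal length of the trajectory's bounding box
    - temporal extent = peak frame-to-frame speed (units/second)
    """
    traj = np.asarray(trajectory, dtype=float)
    # Spatial extent: how much space the gesture occupies.
    spatial = float(np.linalg.norm(traj.max(axis=0) - traj.min(axis=0)))
    # Temporal extent: peak speed across consecutive frames.
    speeds = np.linalg.norm(np.diff(traj, axis=0), axis=1) * fps
    temporal = float(speeds.max()) if speeds.size else 0.0
    return spatial, temporal

# Example: a straight-line gesture of 1 unit over 1 second at 10 fps.
line = [[0.1 * i, 0.0, 0.0] for i in range(11)]
spatial, temporal = estimate_expressivity(line, fps=10.0)
```

Validating such an estimator against synthesized motion, as the abstract describes, would mean generating trajectories with known expressivity settings and checking that the recovered parameters match.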

Original language: English
Title of host publication: Intelligent Virtual Agents - 11th International Conference, IVA 2011, Proceedings
Pages: 464-465
Number of pages: 2
DOIs
Publication status: Published - 30 Sept 2011
Externally published: Yes
Event: 11th International Conference on Intelligent Virtual Agents, IVA 2011 - Reykjavik, Iceland
Duration: 15 Sept 2011 - 17 Sept 2011

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 6895 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 11th International Conference on Intelligent Virtual Agents, IVA 2011
Country/Territory: Iceland
City: Reykjavik
Period: 15/09/11 - 17/09/11

