Computational Multimodal Models of Users’ Interactional Trust in Multiparty Human-Robot Interaction

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this paper, we present multimodal computational models of interactional trust in a multiparty human-robot interaction scenario. We address trust modeling both as a binary and as a multi-class classification problem. We also investigate how early and late fusion of modalities affect trust modeling. Our results indicate that early fusion performs better in both the binary and multi-class formulations, suggesting that the modalities carry co-dependent information about trust. We also run a SHapley Additive exPlanations (SHAP) analysis on the Random Forest in the binary classification setting, as it is the best-performing model, to explore which multimodal features are most relevant for detecting trust or mistrust.
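The early- versus late-fusion distinction described in the abstract can be sketched as follows. This is a minimal illustration with synthetic data and a toy nearest-centroid classifier standing in for the paper's models; the modality names, feature shapes, and labels are assumptions for illustration, not the paper's actual features.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical per-modality features (shapes and names are illustrative)
X_audio = rng.normal(size=(200, 4))
X_visual = rng.normal(size=(200, 6))
# Synthetic binary trust label driven by one dimension of each modality
y = (X_audio[:, 0] + X_visual[:, 0] > 0).astype(int)

def centroid_scores(X, y):
    """Score samples by distance to class centroids (toy stand-in for a classifier).

    Positive score means the sample lies closer to the class-1 centroid.
    """
    c0, c1 = X[y == 0].mean(axis=0), X[y == 1].mean(axis=0)
    d0 = np.linalg.norm(X - c0, axis=1)
    d1 = np.linalg.norm(X - c1, axis=1)
    return d0 - d1

# Early fusion: concatenate modality features, then train a single classifier
X_early = np.concatenate([X_audio, X_visual], axis=1)
pred_early = (centroid_scores(X_early, y) > 0).astype(int)

# Late fusion: classify each modality separately, then average the scores
score_late = 0.5 * (centroid_scores(X_audio, y) + centroid_scores(X_visual, y))
pred_late = (score_late > 0).astype(int)
```

Early fusion lets the classifier exploit cross-modal feature interactions directly, which is consistent with the abstract's observation that the modalities are co-dependent; late fusion only combines per-modality decisions.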

Original language: English
Title of host publication: Pattern Recognition, Computer Vision, and Image Processing. ICPR 2022 International Workshops and Challenges - Proceedings
Editors: Jean-Jacques Rousseau, Bill Kapralos
Publisher: Springer Science and Business Media Deutschland GmbH
Pages: 225-239
Number of pages: 15
ISBN (Print): 9783031376597
DOIs
Publication status: Published - 1 Jan 2023
Event: 26th International Conference on Pattern Recognition, ICPR 2022 - Montréal, Canada
Duration: 21 Aug 2022 - 25 Aug 2022

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 13643 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 26th International Conference on Pattern Recognition, ICPR 2022
Country/Territory: Canada
City: Montréal
Period: 21/08/22 - 25/08/22

Keywords

  • HRI
  • affective computing
  • trust
