TY - GEN
T1 - Multimodal emotion recognition with thermal and RGB-D cameras for human-robot interaction
AU - Yu, Chuang
AU - Tapus, Adriana
N1 - Publisher Copyright:
© 2020 ACM.
PY - 2020/3/23
Y1 - 2020/3/23
AB - Human emotion detection is an important aspect of social robotics and human-robot interaction (HRI). In this paper, we propose a vision-based multimodal emotion recognition method for social robots based on gait data and facial thermal images. Our method can detect four human emotional states (i.e., neutral, happiness, anger, and sadness). We gathered data from 25 participants to build up an emotion database for training and testing our classification models. We implemented and compared several approaches, namely Convolutional Neural Network (CNN), Hidden Markov Model (HMM), Support Vector Machine (SVM), and Random Forest (RF), to find the best-performing one. We also designed a hybrid model that combines the gait and thermal data; its accuracy on our emotion database is 10% higher than that of the other models. This is a promising approach to be explored in a real-time human-robot interaction scenario.
KW - Gait
KW - Human-robot interaction
KW - Multimodal emotion recognition
KW - Thermal face
DO - 10.1145/3371382.3378342
M3 - Conference contribution
AN - SCOPUS:85083232127
T3 - ACM/IEEE International Conference on Human-Robot Interaction
SP - 532
EP - 534
BT - HRI 2020 - Companion of the 2020 ACM/IEEE International Conference on Human-Robot Interaction
PB - IEEE Computer Society
T2 - 15th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2020
Y2 - 23 March 2020 through 26 March 2020
ER -