TY - GEN
T1 - Interactive Robot Learning for Multimodal Emotion Recognition
AU - Yu, Chuang
AU - Tapus, Adriana
N1 - Publisher Copyright:
© 2019, Springer Nature Switzerland AG.
PY - 2019/1/1
Y1 - 2019/1/1
N2 - Interaction plays a critical role in learning skills for natural communication. In human-robot interaction (HRI), robots can obtain feedback during the interaction to improve their social abilities. In this context, we propose an interactive robot learning framework that uses multimodal data from thermal facial images and human gait for online emotion recognition. We also propose a new decision-level fusion method for multimodal classification using a Random Forest (RF) model. Our hybrid online emotion recognition model focuses on detecting four human emotions (i.e., neutral, happiness, anger, and sadness). After offline training and testing with the hybrid model, the accuracy of the online emotion recognition system is more than 10% lower than that of the offline one. To improve the system, human verbal feedback is injected into the interactive robot learning. With the new online emotion recognition system, a 12.5% accuracy increase is obtained compared with the online system without interactive robot learning.
AB - Interaction plays a critical role in learning skills for natural communication. In human-robot interaction (HRI), robots can obtain feedback during the interaction to improve their social abilities. In this context, we propose an interactive robot learning framework that uses multimodal data from thermal facial images and human gait for online emotion recognition. We also propose a new decision-level fusion method for multimodal classification using a Random Forest (RF) model. Our hybrid online emotion recognition model focuses on detecting four human emotions (i.e., neutral, happiness, anger, and sadness). After offline training and testing with the hybrid model, the accuracy of the online emotion recognition system is more than 10% lower than that of the offline one. To improve the system, human verbal feedback is injected into the interactive robot learning. With the new online emotion recognition system, a 12.5% accuracy increase is obtained compared with the online system without interactive robot learning.
KW - Human-robot interaction
KW - Interactive robot learning
KW - Multimodal emotion recognition
UR - https://www.scopus.com/pages/publications/85076569737
U2 - 10.1007/978-3-030-35888-4_59
DO - 10.1007/978-3-030-35888-4_59
M3 - Conference contribution
AN - SCOPUS:85076569737
SN - 9783030358877
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 633
EP - 642
BT - Social Robotics - 11th International Conference, ICSR 2019, Proceedings
A2 - Salichs, Miguel A.
A2 - Ge, Shuzhi Sam
A2 - Barakova, Emilia Ivanova
A2 - Cabibihan, John-John
A2 - Wagner, Alan R.
A2 - Castro-González, Álvaro
A2 - He, Hongsheng
PB - Springer
T2 - 11th International Conference on Social Robotics, ICSR 2019
Y2 - 26 November 2019 through 29 November 2019
ER -