TY - GEN
T1 - Human Gesture Recognition with a Flow-based Model for Human Robot Interaction
AU - Liu, Lanmiao
AU - Yu, Chuang
AU - Song, Siyang
AU - Su, Zhidong
AU - Tapus, Adriana
N1 - Publisher Copyright:
© 2023 IEEE Computer Society. All rights reserved.
PY - 2023/3/13
Y1 - 2023/3/13
N2 - Human skeleton-based gesture classification plays a dominant role in social robotics. Learning the variety of human skeleton-based gestures can help a robot interact continuously and appropriately in natural human-robot interaction (HRI). In this paper, we propose a Flow-based model to classify human gesture actions from skeletal data. Instead of inferring new human skeleton actions from noisy data with a retrained model, our end-to-end model can expand the set of recognizable gesture labels from noisy data without retraining. Initially, our model focuses on detecting five human gesture actions (i.e., come on, right up, left up, hug, and noise-random action). The accuracy of our online human gesture recognition system matches that of the offline one, and both attain 100% accuracy on the first four actions. Our proposed method also infers new human gesture actions more efficiently without retraining, achieving about 90% accuracy on the noise-random action. The gesture recognition system has been applied to drive the robot's reaction to human gestures, which is promising for facilitating natural human-robot interaction.
AB - Human skeleton-based gesture classification plays a dominant role in social robotics. Learning the variety of human skeleton-based gestures can help a robot interact continuously and appropriately in natural human-robot interaction (HRI). In this paper, we propose a Flow-based model to classify human gesture actions from skeletal data. Instead of inferring new human skeleton actions from noisy data with a retrained model, our end-to-end model can expand the set of recognizable gesture labels from noisy data without retraining. Initially, our model focuses on detecting five human gesture actions (i.e., come on, right up, left up, hug, and noise-random action). The accuracy of our online human gesture recognition system matches that of the offline one, and both attain 100% accuracy on the first four actions. Our proposed method also infers new human gesture actions more efficiently without retraining, achieving about 90% accuracy on the noise-random action. The gesture recognition system has been applied to drive the robot's reaction to human gestures, which is promising for facilitating natural human-robot interaction.
KW - Flow-based model
KW - Gesture recognition
KW - Social Robot
U2 - 10.1145/3568294.3580145
DO - 10.1145/3568294.3580145
M3 - Conference contribution
AN - SCOPUS:85150419103
T3 - ACM/IEEE International Conference on Human-Robot Interaction
SP - 548
EP - 551
BT - HRI 2023 - Companion of the ACM/IEEE International Conference on Human-Robot Interaction
PB - IEEE Computer Society
T2 - 18th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2023
Y2 - 13 March 2023 through 16 March 2023
ER -