TY - GEN
T1 - A Bilingual Social Robot with Sign Language and Natural Language
AU - Hei, Xiaoxuan
AU - Yu, Chuang
AU - Zhang, Heng
AU - Tapus, Adriana
N1 - Publisher Copyright:
© 2024 Copyright held by the owner/author(s)
PY - 2024/3/11
Y1 - 2024/3/11
N2 - In situations where both deaf and non-deaf individuals are present in a public setting, it would be advantageous for a robot to communicate using both sign and natural languages simultaneously. This would not only address the needs of diverse users but also pave the way for a richer and more inclusive spectrum of human-robot interactions. To achieve this, this paper proposes a framework for a bilingual robot. The robot exhibits the ability to articulate messages in spoken language, complemented by non-verbal cues such as expressive gestures, all while concurrently conveying information through sign language. The system can generate natural language expressions with speech audio, spontaneous prosody-based gestures, and sign language displayed by a virtual avatar on the robot's screen. The preliminary findings from this research showcase the robot's capacity to seamlessly blend natural language expressions with synchronized gestures and sign language, underlining its potential to revolutionize communication dynamics in diverse settings.
AB - In situations where both deaf and non-deaf individuals are present in a public setting, it would be advantageous for a robot to communicate using both sign and natural languages simultaneously. This would not only address the needs of diverse users but also pave the way for a richer and more inclusive spectrum of human-robot interactions. To achieve this, this paper proposes a framework for a bilingual robot. The robot exhibits the ability to articulate messages in spoken language, complemented by non-verbal cues such as expressive gestures, all while concurrently conveying information through sign language. The system can generate natural language expressions with speech audio, spontaneous prosody-based gestures, and sign language displayed by a virtual avatar on the robot's screen. The preliminary findings from this research showcase the robot's capacity to seamlessly blend natural language expressions with synchronized gestures and sign language, underlining its potential to revolutionize communication dynamics in diverse settings.
KW - Human-robot interaction
KW - gesture generation
KW - sign language
KW - virtual agent
U2 - 10.1145/3610978.3640549
DO - 10.1145/3610978.3640549
M3 - Conference contribution
AN - SCOPUS:85188115674
T3 - ACM/IEEE International Conference on Human-Robot Interaction
SP - 526
EP - 529
BT - HRI 2024 Companion - Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction
PB - IEEE Computer Society
T2 - 19th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2024
Y2 - 11 March 2024 through 15 March 2024
ER -