Description
How do humans perceive communicative gesture behavior in robots? Although gesture is a crucial feature of social interaction, this research question is still largely unexplored in the field of social robotics. The present work thus sets out to investigate how robot gesture can be used to design and realize more natural and human-like communication capabilities for social robots. The adopted approach is twofold. Firstly, the technical challenges encountered when implementing a speech-gesture generation model on a robotic platform are addressed. The realized framework enables a humanoid robot to produce finely synchronized speech and co-verbal hand and arm gestures. In contrast to many existing systems, these gestures are not limited to a predefined repertoire of motor actions but are flexibly generated at run-time. Secondly, the achieved expressiveness is exploited in controlled experiments to gain a deeper understanding of how robot gesture might impact human experience and evaluation of human-robot interaction. The findings reveal that participants evaluate the robot more positively when non-verbal behaviors such as hand and arm gestures are displayed along with speech. Surprisingly, this effect was particularly pronounced when the robot's gesturing behavior was partly incongruent with speech. These findings contribute new insights into human perception of communicative robot gesture and ultimately support the presented approach of endowing social robots with such non-verbal behaviors.
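To illustrate the kind of speech-gesture synchronization described above, the following minimal sketch shows one generic way such timing can be handled: aligning a gesture's stroke phase with the onset of its affiliated word, given word-level timestamps assumed to come from a speech synthesizer. This is a hypothetical illustration in Python, not the framework realized in the thesis; all class names, timings, and the scheduling heuristic are invented for the example.

    from dataclasses import dataclass
    from typing import List, Optional

    # Hypothetical word-level timing, e.g. as reported by a TTS engine.
    @dataclass
    class TimedWord:
        word: str
        onset: float    # seconds from utterance start
        offset: float

    # A gesture stroke should co-occur with its lexical affiliate,
    # preceded by a preparation phase that moves the arm into position.
    @dataclass
    class GestureSpec:
        name: str
        affiliate: str         # word the stroke should align with
        prep_duration: float   # seconds needed to reach stroke position
        stroke_duration: float

    @dataclass
    class ScheduledGesture:
        name: str
        prep_start: float
        stroke_start: float
        stroke_end: float

    def schedule_gesture(words: List[TimedWord],
                         gesture: GestureSpec) -> Optional[ScheduledGesture]:
        """Align the gesture stroke with the onset of its affiliated word.

        If the preparation phase would begin before the utterance starts,
        it is clamped to t=0; a real system might instead delay speech or
        shorten the preparation movement.
        """
        for w in words:
            if w.word.lower() == gesture.affiliate.lower():
                stroke_start = w.onset
                prep_start = max(0.0, stroke_start - gesture.prep_duration)
                return ScheduledGesture(
                    name=gesture.name,
                    prep_start=prep_start,
                    stroke_start=stroke_start,
                    stroke_end=stroke_start + gesture.stroke_duration,
                )
        return None  # affiliate not found in the utterance

    if __name__ == "__main__":
        words = [TimedWord("the", 0.00, 0.15),
                 TimedWord("vase", 0.15, 0.55),
                 TimedWord("goes", 0.55, 0.80),
                 TimedWord("over", 0.80, 1.10),
                 TimedWord("there", 1.10, 1.50)]
        pointing = GestureSpec("deictic_point", affiliate="there",
                               prep_duration=0.4, stroke_duration=0.3)
        print(schedule_gesture(words, pointing))

The sketch only captures the scheduling step; generating the gesture trajectories themselves at run-time, as the abstract describes, would sit on top of such timing information.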