TY - GEN
T1 - More Than Meets the Eye? An Experimental Design to Test Robot Visual Perspective-Taking Facilitators Beyond Mere-Appearance
AU - Currie, Joel
AU - McDonough, Katrina Louise
AU - Wykowska, Agnieszka
AU - Giannaccini, Maria Elena
AU - Bach, Patric
N1 - Publisher Copyright:
© 2024 Copyright held by the owner/author(s)
PY - 2024/3/11
Y1 - 2024/3/11
AB - Visual Perspective Taking (VPT) underpins human social interaction, from joint action to predicting others' future actions and mentalizing about their goals and affective/mental states. Substantial progress has been made in developing artificial VPT capabilities in robots. However, as conventional VPT tasks rely on the (non-situated, disembodied) presentation of robots on computer screens, it is unclear how a robot's socially reactive and goal-directed behaviours prompt people to take its perspective. We provide a novel experimental paradigm that robustly measures the extent to which human interaction partners take a robot's visual perspective during face-to-face human-robot interactions, by measuring how much the robot's visual perspective is spontaneously integrated with their own. The experimental task design of our upcoming user study allows us to investigate the role of robot features beyond its human-like appearance, which has driven research so far, targeting instead its socially reactive behaviour and task engagement with the human interaction partner.
KW - Human-Robot Interaction
KW - humanoid robot
KW - mind perception
KW - non-verbal behaviours
KW - Perspective-Taking
UR - http://www.scopus.com/inward/record.url?scp=85188067896&partnerID=8YFLogxK
U2 - 10.1145/3610978.3640684
DO - 10.1145/3610978.3640684
M3 - Conference contribution
AN - SCOPUS:85188067896
T3 - ACM/IEEE International Conference on Human-Robot Interaction
SP - 359
EP - 363
BT - HRI 2024 Companion - Companion of the 2024 ACM/IEEE International Conference on Human-Robot Interaction
PB - Association for Computing Machinery (ACM)
T2 - 19th Annual ACM/IEEE International Conference on Human-Robot Interaction, HRI 2024
Y2 - 11 March 2024 through 15 March 2024
ER -