AUTHOR=Fiore Stephen M., Wiltshire Travis J., Lobato Emilio J., Jentsch Florian G., Huang Wesley H., Axelrod Benjamin TITLE=Toward understanding social cues and signals in human–robot interaction: effects of robot gaze and proxemic behavior JOURNAL=Frontiers in Psychology VOLUME=4 YEAR=2013 URL=https://www.frontiersin.org/journals/psychology/articles/10.3389/fpsyg.2013.00859 DOI=10.3389/fpsyg.2013.00859 ISSN=1664-1078 ABSTRACT=

As robots are increasingly deployed in settings requiring social interaction, research is needed to examine the social signals perceived by humans when robots display certain social cues. In this paper, we report a study designed to examine how humans interpret social cues exhibited by robots. We first provide a brief overview of perspectives from social cognition in humans and how these processes are applicable to human–robot interaction (HRI). We then discuss the need to examine the relationship between social cues and signals as a function of the degree to which a robot is perceived as a socially present agent. We describe an experiment in which social cues were manipulated on an iRobot Ava™ mobile robotics platform in a hallway navigation scenario. Cues associated with the robot’s proxemic behavior significantly affected participant perceptions of the robot’s social presence and emotional state, whereas cues associated with the robot’s gaze behavior did not. Further, regardless of proxemic behavior, participants attributed greater social presence and more emotional states to the robot over repeated interactions than when they first interacted with it. Generally, these results indicate that HRI research should consider how social cues expressed by a robot can differentially affect perceptions of the robot’s mental states and intentions. The discussion focuses on implications for the design of robotic systems and future directions for research on the relationship between social cues and signals.