Publication Details
Keywords:
- HRI
Abstract
Studies in HRI have shown that people follow and understand robot gaze. However, only a few studies to date have examined the time course of meaningful robot gaze, and none have directly investigated what type of gaze is best for eliciting the perception of attention. This paper investigates two types of gaze behavior, short frequent glances and long infrequent stares, to determine which is better at conveying a robot's visual attention. We describe the development of a programmable research platform from MyKeepon toys, and the use of these programmable robots to examine the effects of gaze type and group size on the perception of attention. In our experiment, participants viewed a group of MyKeepon robots executing random motions, occasionally fixating on various points in the room or directly on the participant. We varied the type of gaze fixation within participants and group size between participants. Results show that people are more accurate at recognizing shorter, more frequent fixations than longer, less frequent ones, and that their performance improves as group size decreases. From these results, we conclude that multiple short gazes are preferable to a single long gaze for indicating attention, and that the visual search for robot attention is susceptible to group size effects.
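To illustrate the two gaze conditions contrasted in the abstract, the following minimal Python sketch schedules fixations for a simulated robot so that short, frequent glances and a long, infrequent stare accumulate the same total fixation time per trial. The trial length, fixation counts, and durations here are illustrative assumptions, not the values used in the experiment, and this is not the authors' experimental code.

```python
import random
from dataclasses import dataclass


@dataclass
class Fixation:
    start: float     # seconds from trial onset
    duration: float  # seconds spent fixating the target


def schedule_fixations(trial_length, count, duration, seed=None):
    """Place `count` non-overlapping fixations of `duration` seconds
    at jittered onsets within a trial of `trial_length` seconds."""
    rng = random.Random(seed)
    slot = trial_length / count  # one fixation per equal-length slot
    return [
        Fixation(start=i * slot + rng.uniform(0.0, slot - duration),
                 duration=duration)
        for i in range(count)
    ]


# Illustrative parameters: both conditions yield 6 s of gaze per 30 s trial.
short_frequent = schedule_fixations(trial_length=30.0, count=6, duration=1.0, seed=1)
long_infrequent = schedule_fixations(trial_length=30.0, count=1, duration=6.0, seed=1)

for label, sched in [("short/frequent", short_frequent),
                     ("long/infrequent", long_infrequent)]:
    total = sum(f.duration for f in sched)
    print(f"{label}: {len(sched)} fixations, {total:.1f} s total gaze")
```

Matching total fixation time across conditions, as sketched here, isolates the effect of gaze duration and frequency from the overall amount of attention shown.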
Author Details
Name: Henny Admoni
Status: Inactive

Name: Bradley Hayes
Status: Inactive

Name: David Feil-Seifer
Email: dave@cse.unr.edu
Website: http://cse.unr.edu/~dave
Phone: (775) 784-6469
Status: Active

Name: Daniel Ullman
Status: Inactive

Name: Brian Scassellati
Status: Inactive
BibTex Reference
@inproceedings{Admoni2013,
title={Are You Looking At Me?: Perception of robot attention is mediated by gaze duration and group size},
author={Henny Admoni and Bradley Hayes and David Feil-Seifer and Daniel Ullman and Brian Scassellati},
year={2013},
month={March},
pages={389-396},
address={Tokyo, Japan},
doi={10.1109/HRI.2013.6483614},
booktitle={Proceedings of the International Conference on Human-Robot Interaction (HRI)},
}