Using Augmented Reality on the go: Understanding the effects of mobility on user performance and subjective workload across eye, head, and hand ray pointing
INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES, v.204
Abstract
Augmented Reality (AR) head-mounted displays (HMDs) are the latest iteration in wearable computing, and the lightweight, portable form factors currently emerging are particularly suited to mobile use - they offer the potential to deliver seamless, discreet, and contextual information to users on the go. Despite this potential, studies of HMD input rarely consider mobility. This paper seeks to rectify that omission in the context of ray pointing, one of the most essential and general-purpose input modalities in this space. We present the first study (N=24) contrasting user performance on HMDs across eye, head, and hand ray pointing while standing and walking, for both dwell and pinch-gesture activation. Our results indicate that walking is highly disruptive to interaction with conventional HMD UIs - in general, success rates fall precipitously and selection times rise steeply when users walk. Variations in performance between modalities and activation techniques shed light on how input techniques more resilient to motion could be constructed. Building on these findings, we discuss design considerations for ray pointing, along with interface and interaction technique designs for HMDs that may be better suited to mobile scenarios.