International Journal of Human-Computer Studies, vol. 67, no. 6, pp. 515-532
Abstract
The increasing complexity of applications on handheld devices requires the development of rich new interaction methods specifically designed for resource-limited mobile use contexts. One appealingly convenient approach to this problem is to use device motions as input, a paradigm in which the currently dominant interaction metaphors are gesture recognition and visually mediated scrolling. However, neither is ideal. The former suffers from fundamental problems in the learning and communication of gestural patterns, while the latter requires continual visual monitoring of the mobile device, a task that is undesirable in many mobile contexts and also inherently in conflict with the act of moving a device to control it. This paper proposes an alternative approach: a gestural menu technique inspired by marking menus and designed specifically for the characteristics of motion input. It uses rotations between targets occupying large portions of angular space and emphasizes kinesthetic, eyes-free interaction. Three evaluations are presented, two featuring an abstract user interface (UI) and focusing on how user performance changes when the basic system parameters of number, size and depth of targets are manipulated. These studies show that a version of the menu system containing 19 commands yields optimal performance, compares well against data from the previous literature and can be used effectively eyes-free (without graphical feedback). The final study uses a full graphical UI and untrained users to demonstrate that the system can be rapidly learnt. Together, these three studies rigorously validate the system design and suggest promising new directions for handheld motion-based UIs.
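The core selection mechanism the abstract describes (rotations between targets that each occupy a large portion of angular space) can be illustrated with a minimal sketch. The function name, sector layout and default target count below are illustrative assumptions, not the paper's actual implementation; the paper's menus are hierarchical, whereas this shows only a single-level angle-to-target mapping.

```python
import math

def select_target(angle_deg: float, n_targets: int = 4) -> int:
    """Map a device rotation angle to one of n_targets angular sectors.

    Each target occupies an equal slice of the full rotation range, so
    sectors are wide enough to be hit kinesthetically, without looking
    at the screen (the eyes-free property the abstract emphasises).
    """
    sector = 360.0 / n_targets
    # Normalise the angle into [0, 360) before binning, so negative
    # rotations (e.g. a counter-clockwise twist) still map to a sector.
    return int((angle_deg % 360.0) // sector)

# With 4 targets, each sector spans 90 degrees of rotation:
print(select_target(10))    # sector 0
print(select_target(100))   # sector 1
print(select_target(-10))   # wraps to 350 degrees, sector 3
```

Widening sectors (fewer targets per level) trades menu breadth for selection reliability, which is the parameter space the first two evaluations explore.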