Related Researcher

Oakley, Ian
Interactions Lab
Research Interests
  • Interaction Design, Tangible Computing, Social Computing, Multi-modal Interfaces

ITEM VIEW & DOWNLOAD

Motion marking menus: An eyes-free approach to motion input for handheld devices

Cited 4 times in Thomson CI | Cited 9 times in Thomson CI
Title
Motion marking menus: An eyes-free approach to motion input for handheld devices
Author
Oakley, Ian; Park, Junseok
Issue Date
2009-06
Publisher
ACADEMIC PRESS LTD - ELSEVIER SCIENCE LTD
Citation
INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES, v.67, no.6, pp.515 - 532
Abstract
The increasing complexity of applications on handheld devices requires the development of rich new interaction methods specifically designed for resource-limited mobile use contexts. One appealingly convenient approach to this problem is to use device motions as input, a paradigm in which the currently dominant interaction metaphors are gesture recognition and visually mediated scrolling. However, neither is ideal. The former suffers from fundamental problems in the learning and communication of gestural patterns, while the latter requires continual visual monitoring of the mobile device, a task that is undesirable in many mobile contexts and also inherently in conflict with the act of moving a device to control it. This paper proposes an alternate approach: a gestural menu technique inspired by marking menus and designed specifically for the characteristics of motion input. It uses rotations between targets occupying large portions of angular space and emphasizes kinesthetic, eyes-free interaction. Three evaluations are presented, two featuring an abstract user interface (UI) and focusing on how user performance changes when the basic system parameters of number, size and depth of targets are manipulated. These studies show that a version of the menu system containing 19 commands yields optimal performance, compares well against data from the previous literature and can be used effectively eyes free (without graphical feedback). The final study uses a full graphical UI and untrained users to demonstrate that the system can be rapidly learnt. Together, these three studies rigorously validate the system design and suggest promising new directions for handheld motion-based UIs.
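The abstract's central mechanism is mapping device rotations onto a small number of targets that each occupy a large slice of angular space, so selection can be made kinesthetically without looking at the screen. A minimal sketch of that angle-to-target mapping is shown below; this is a hypothetical illustration, not the authors' implementation, and the segment count, degree units, and zero-angle convention are all assumptions:

```python
def select_target(angle_deg: float, num_targets: int) -> int:
    """Map a device-rotation angle to one of num_targets equal angular segments.

    Each target occupies 360/num_targets degrees, so with few targets the
    segments are wide and tolerant of imprecise, eyes-free rotation.
    """
    if num_targets < 1:
        raise ValueError("need at least one target")
    segment_width = 360.0 / num_targets
    # Normalize the angle into [0, 360) so negative rotations wrap around,
    # then find which segment the normalized angle falls in.
    return int((angle_deg % 360.0) // segment_width)


# With four targets, each spans 90 degrees: a rotation of 100 degrees
# lands in segment 1, and -10 degrees wraps to segment 3.
print(select_target(100.0, 4))
print(select_target(-10.0, 4))
```

Wide segments trade menu breadth for selection robustness, which matches the paper's finding that a menu of modest size (19 commands, reached through hierarchy depth rather than narrower segments) performed best.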
URI
https://scholarworks.unist.ac.kr/handle/201301/8436
URL
https://www.sciencedirect.com/science/article/pii/S1071581909000226?via%3Dihub
DOI
10.1016/j.ijhcs.2009.02.002
ISSN
1071-5819
Appears in Collections:
DHE_Journal Papers
Files in This Item:
There are no files associated with this item.

find_unist can give you direct access to the published full text of this article. (UNISTARs only)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.
