Related Researcher

Oakley, Ian (Interactions Lab.)


Full metadata record

DC Field Value
dc.citation.endPage 532
dc.citation.number 6
dc.citation.startPage 515
dc.citation.title INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES
dc.citation.volume 67
dc.contributor.author Oakley, Ian
dc.contributor.author Park, Junseok
dc.date.accessioned 2023-12-22T08:06:16Z
dc.date.available 2023-12-22T08:06:16Z
dc.date.created 2014-11-07
dc.date.issued 2009-06
dc.description.abstract The increasing complexity of applications on handheld devices requires the development of rich new interaction methods specifically designed for resource-limited mobile use contexts. One appealingly convenient approach to this problem is to use device motions as input, a paradigm in which the currently dominant interaction metaphors are gesture recognition and visually mediated scrolling. However, neither is ideal. The former suffers from fundamental problems in the learning and communication of gestural patterns, while the latter requires continual visual monitoring of the mobile device, a task that is undesirable in many mobile contexts and also inherently in conflict with the act of moving a device to control it. This paper proposes an alternate approach: a gestural menu technique inspired by marking menus and designed specifically for the characteristics of motion input. It uses rotations between targets occupying large portions of angular space and emphasizes kinesthetic, eyes-free interaction. Three evaluations are presented, two featuring an abstract user interface (UI) and focusing on how user performance changes when the basic system parameters of number, size and depth of targets are manipulated. These studies show that a version of the menu system containing 19 commands yields optimal performance, compares well against data from the previous literature and can be used effectively eyes free (without graphical feedback). The final study uses a full graphical UI and untrained users to demonstrate that the system can be rapidly learnt. Together, these three studies rigorously validate the system design and suggest promising new directions for handheld motion-based UIs.
dc.identifier.bibliographicCitation INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES, v.67, no.6, pp.515 - 532
dc.identifier.doi 10.1016/j.ijhcs.2009.02.002
dc.identifier.issn 1071-5819
dc.identifier.scopusid 2-s2.0-63449125157
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/8436
dc.identifier.url https://www.sciencedirect.com/science/article/pii/S1071581909000226?via%3Dihub
dc.identifier.wosid 000265802200004
dc.language English
dc.publisher ACADEMIC PRESS LTD- ELSEVIER SCIENCE LTD
dc.title Motion marking menus: An eyes-free approach to motion input for handheld devices
dc.type Article
dc.description.journalRegisteredClass scopus
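The abstract above describes a menu technique that maps device rotations onto targets occupying large, even portions of angular space. A minimal sketch of that core idea, quantizing a rotation angle into one of N angular sectors, might look like the following. This is an illustrative assumption only (the function name, half-sector centering offset, and even partitioning are not taken from the paper, which evaluates specific target counts and depths such as its 19-command configuration):

```python
def select_target(angle_deg, num_targets):
    """Map a rotation angle (in degrees) to one of `num_targets`
    menu sectors that evenly partition 360 degrees of angular space.

    Hypothetical sketch of angular-sector selection; not the paper's
    actual implementation.
    """
    sector = 360.0 / num_targets
    # Offset by half a sector so target 0 is centered on 0 degrees,
    # then wrap into [0, 360) and quantize to a sector index.
    return int(((angle_deg + sector / 2.0) % 360.0) // sector)
```

For example, with four targets each sector spans 90 degrees, so a rotation of 90 degrees selects target 1 and a rotation near 0 (or 350) degrees selects target 0. Centering targets on their nominal angles rather than starting sectors at them gives equal tolerance to undershoot and overshoot, which matters for the eyes-free, kinesthetic use the abstract emphasizes.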


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.