Detailed Information


Full metadata record

DC Field Value Language
dc.citation.title INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES -
dc.citation.volume 204 -
dc.contributor.author Shin, Yonghwan -
dc.contributor.author Esteves, Augusto -
dc.contributor.author Oakley, Ian -
dc.date.accessioned 2025-11-26T11:25:27Z -
dc.date.available 2025-11-26T11:25:27Z -
dc.date.created 2025-10-13 -
dc.date.issued 2025-10 -
dc.description.abstract Augmented Reality (AR) HMDs are the latest iteration in wearable computing, and the lightweight and portable form factors currently emerging are particularly suited for mobile use - they offer the potential for seamless, discreet, and contextual information to users on the go. Despite this potential, studies of input on HMDs rarely consider mobility issues. This paper seeks to rectify this omission in the context of ray pointing, one of the most essential and general-purpose input modalities in this space. We present the first study (N=24) contrasting user performance on HMDs across eye, head, and hand ray pointing while standing and walking, for both dwell and pinch gesture activations. Our results indicate walking is highly disruptive to interactions with conventional HMD UIs - in general, success rates fall precipitously while selection times rise steeply while users walk. Variations in performance between modalities and activation techniques shed light on how input techniques that are more resilient to motion could be constructed. Building on these findings, we discuss design considerations for ray pointing and interface and interaction technique designs for HMDs that may be better suited to mobile scenarios. -
dc.identifier.bibliographicCitation INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES, v.204 -
dc.identifier.doi 10.1016/j.ijhcs.2025.103597 -
dc.identifier.issn 1071-5819 -
dc.identifier.scopusid 2-s2.0-105014196323 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/88628 -
dc.identifier.wosid 001576897900001 -
dc.language English -
dc.publisher ACADEMIC PRESS LTD- ELSEVIER SCIENCE LTD -
dc.title Using Augmented Reality on the go: Understanding the effects of mobility on user performance and subjective workload across eye, head, and hand ray pointing -
dc.type Article -
dc.description.isOpenAccess FALSE -
dc.relation.journalWebOfScienceCategory Computer Science, Cybernetics; Ergonomics; Psychology, Multidisciplinary -
dc.relation.journalResearchArea Computer Science; Engineering; Psychology -
dc.type.docType Article -
dc.description.journalRegisteredClass scie -
dc.description.journalRegisteredClass ssci -
dc.description.journalRegisteredClass scopus -
dc.subject.keywordAuthor Head mounted display -
dc.subject.keywordAuthor Walking -
dc.subject.keywordAuthor AR -
dc.subject.keywordAuthor Mobile -
dc.subject.keywordAuthor Pointing -
dc.subject.keywordPlus WALKING SPEED -
dc.subject.keywordPlus ANOVA -

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.