File Download

There are no files associated with this item.

Related Researcher

Oakley, Ian
Interactions Lab.


Full metadata record

DC Field Value
dc.citation.startPage UNSP 102414
dc.citation.title INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES
dc.citation.volume 139
dc.contributor.author Esteves, Augusto
dc.contributor.author Shin, Yonghwan
dc.contributor.author Oakley, Ian
dc.date.accessioned 2023-12-21T17:16:51Z
dc.date.available 2023-12-21T17:16:51Z
dc.date.created 2020-05-19
dc.date.issued 2020-07
dc.description.abstract Head movements are a common input modality on VR/AR headsets. However, although they enable users to control a cursor, they lack an integrated method to trigger actions. Many approaches exist to fill this gap: dedicated "clickers", on-device buttons, mid-air gestures, dwell, speech, and new input techniques based on matching head motions to those of visually presented targets. These proposals are diverse, and there is currently a lack of empirical data on the performance of, experience with, and preference for these different techniques. This hampers designers' ability to select appropriate input techniques to deploy. We conduct two studies that address this problem. A Fitts' Law study compares five traditional selection techniques and concludes that clicker (hands-on) and dwell (hands-free) provide optimal combinations of precision, speed, and physical load. A follow-up study compares clicker and dwell to a motion matching implementation. While clicker remains fastest and dwell most accurate, motion matching may provide a valuable compromise between these two poles.
dc.identifier.bibliographicCitation INTERNATIONAL JOURNAL OF HUMAN-COMPUTER STUDIES, v.139, pp.UNSP 102414
dc.identifier.doi 10.1016/j.ijhcs.2020.102414
dc.identifier.issn 1071-5819
dc.identifier.scopusid 2-s2.0-85080053958
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/32190
dc.identifier.url https://www.sciencedirect.com/science/article/pii/S1071581920300185?via%3Dihub
dc.identifier.wosid 000528280000004
dc.language English
dc.publisher ACADEMIC PRESS LTD- ELSEVIER SCIENCE LTD
dc.title Comparing selection mechanisms for gaze input techniques in head-mounted displays
dc.type Article
dc.description.isOpenAccess FALSE
dc.relation.journalWebOfScienceCategory Computer Science, Cybernetics; Ergonomics; Psychology, Multidisciplinary
dc.relation.journalResearchArea Computer Science; Engineering; Psychology
dc.type.docType Article
dc.description.journalRegisteredClass scie
dc.description.journalRegisteredClass ssci
dc.description.journalRegisteredClass scopus
dc.subject.keywordAuthor Hands-free input
dc.subject.keywordAuthor Head pointing
dc.subject.keywordAuthor Head-mounted display
dc.subject.keywordAuthor Virtual-reality
dc.subject.keywordAuthor Augmented-reality
dc.subject.keywordAuthor Gaze input
dc.subject.keywordAuthor Motion matching
dc.subject.keywordPlus AUGMENTED REALITY VISUALIZATION
dc.subject.keywordPlus SYSTEM
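
Note: the abstract above contrasts a hands-on trigger (clicker) with hands-free triggers (dwell, motion matching) for head-gaze pointing, and its first study is a Fitts' Law comparison. As a rough illustration only, the sketch below shows how a dwell-based trigger is commonly implemented and how a Fitts' index of difficulty is computed; the class name, per-frame API, and 0.8 s threshold are assumptions made for this sketch, not details taken from the paper.

```python
import math
import time
from typing import Optional

# Illustrative sketch only: all names and the 0.8 s threshold are
# assumptions for this example, not the paper's implementation.

def fitts_id(distance: float, width: float) -> float:
    """Shannon formulation of Fitts' index of difficulty, in bits."""
    return math.log2(distance / width + 1.0)

class DwellSelector:
    """Fires a selection once the cursor has rested on one target long enough."""

    def __init__(self, dwell_time: float = 0.8):
        self.dwell_time = dwell_time   # seconds the cursor must stay on a target
        self.current = None            # target currently under the cursor
        self.enter_time = 0.0          # when the cursor entered that target

    def update(self, hovered, now: Optional[float] = None):
        """Call once per frame with the target under the cursor (or None).
        Returns the target selected this frame, or None."""
        now = time.monotonic() if now is None else now
        if hovered is not self.current:
            # Cursor moved onto a new target (or off all targets): restart timer.
            self.current = hovered
            self.enter_time = now
            return None
        if hovered is not None and now - self.enter_time >= self.dwell_time:
            self.current = None  # require re-entry before the next selection
            return hovered
        return None
```

The fitts_id helper is included because Fitts' Law studies of this kind typically report throughput, obtained by dividing the index of difficulty by the measured movement time.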


Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.