Related Researcher

Oakley, Ian
Interactions Lab.

Full metadata record

DC Field Value Language
dc.citation.conferencePlace US -
dc.citation.conferencePlace Denver -
dc.citation.endPage 3890 -
dc.citation.startPage 3879 -
dc.citation.title 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017 -
dc.contributor.author Gil, Hyunjae -
dc.contributor.author Lee, DoYoung -
dc.contributor.author Im, Seunggyu -
dc.contributor.author Oakley, Ian -
dc.date.accessioned 2023-12-19T19:07:30Z -
dc.date.available 2023-12-19T19:07:30Z -
dc.date.created 2018-01-10 -
dc.date.issued 2017-05-06 -
dc.description.abstract The small screens of smartwatches provide limited space for input tasks. Finger identification is a promising technique to address this problem by associating different functions with different fingers. However, current technologies for finger identification are unavailable or unsuitable for smartwatches. To address this problem, this paper observes that normal smartwatch use takes place with a relatively static pose between the two hands. In this situation, we argue that the touch and angle profiles generated by different fingers on a standard smartwatch touch screen will differ sufficiently to support reliable identification. The viability of this idea is explored in two studies that capture touches in natural and exaggerated poses during tapping and swiping tasks. Machine learning models report accuracies of up to 93% and 98% respectively, figures that are sufficient for many common interaction tasks. Furthermore, the exaggerated poses show modest costs (in terms of time/errors) compared to the natural touches. We conclude by presenting examples and discussing how interaction designs using finger identification can be adapted to the smartwatch form factor. -
dc.identifier.bibliographicCitation 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017, pp.3879 - 3890 -
dc.identifier.doi 10.1145/3025453.3025561 -
dc.identifier.scopusid 2-s2.0-85044287447 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/34317 -
dc.identifier.url https://dl.acm.org/citation.cfm?id=3025561 -
dc.language English -
dc.publisher 2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017 -
dc.title TriTap: Identifying Finger Touches on Smartwatches -
dc.type Conference Paper -
dc.date.conferenceDate 2017-05-06 -

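The abstract above describes classifying which finger produced a touch from per-touch features (e.g. contact position, size, and angle) using machine learning. The sketch below is only an illustration of that general approach, not the paper's actual pipeline or data: the feature set, the three-finger label scheme, the random-forest model, and the synthetic data are all assumptions introduced here for clarity.

# Illustrative sketch (assumptions noted above): train a classifier to label
# which finger produced a smartwatch touch from hypothetical per-touch features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: 300 touches x 5 features
# (x, y, contact major axis, contact minor axis, contact orientation).
X = rng.normal(size=(300, 5))
# Hypothetical labels: 0 = thumb, 1 = index, 2 = middle finger.
y = rng.integers(0, 3, size=300)

# A random forest stands in for the paper's unspecified machine-learning models.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)
print("Cross-validated accuracy: %.2f" % scores.mean())

With real labelled touch data in place of the synthetic arrays, the same cross-validation loop would give a per-finger identification accuracy comparable in kind (though not necessarily in value) to the 93% and 98% figures reported in the abstract.
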
Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.