Related Researcher

Oakley, Ian
Interactions Lab.

Detailed Information


TriTap: Identifying Finger Touches on Smartwatches

Author(s)
Gil, Hyunjae; Lee, DoYoung; Im, Seunggyu; Oakley, Ian
Issued Date
2017-05-06
DOI
10.1145/3025453.3025561
URI
https://scholarworks.unist.ac.kr/handle/201301/34317
Fulltext
https://dl.acm.org/citation.cfm?id=3025561
Citation
2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017, pp.3879 - 3890
Abstract
The small screens of smartwatches provide limited space for input tasks. Finger identification is a promising technique to address this problem by associating different functions with different fingers. However, current technologies for finger identification are unavailable or unsuitable for smartwatches. To address this problem, this paper observes that normal smartwatch use takes place with a relatively static pose between the two hands. In this situation, we argue that the touch and angle profiles generated by different fingers on a standard smartwatch touch screen will differ sufficiently to support reliable identification. The viability of this idea is explored in two studies that capture touches in natural and exaggerated poses during tapping and swiping tasks. Machine learning models report accuracies of up to 93% and 98% respectively, figures that are sufficient for many common interaction tasks. Furthermore, the exaggerated poses show modest costs (in terms of time/errors) compared to the natural touches. We conclude by presenting examples and discussing how interaction designs using finger identification can be adapted to the smartwatch form factor.
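
For illustration, a minimal sketch of the kind of finger classification the abstract describes: a model trained on per-touch features (contact position, size, orientation) to predict which finger produced a touch. The feature set, the placeholder data, and the random-forest classifier are assumptions made for this sketch, not the paper's actual features, data, or models.

```python
# Hypothetical sketch of finger identification from touch features.
# Features and data below are placeholders, not the TriTap dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Placeholder data: 300 touches, each described by five assumed features
# (x, y, contact major axis, contact minor axis, orientation angle).
# Labels 0/1/2 stand for three different fingers.
X = rng.normal(size=(300, 5))
y = rng.integers(0, 3, size=300)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

# Train a simple classifier and report held-out accuracy.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

With real touch and angle profiles (rather than random placeholders), the same train/evaluate structure would be used to estimate per-finger identification accuracy.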
Publisher
2017 ACM SIGCHI Conference on Human Factors in Computing Systems, CHI 2017

