Related Researcher

Oakley, Ian
Interactions Lab.


Full metadata record

DC Field Value Language
dc.citation.number 4 -
dc.citation.startPage 167 -
dc.citation.title PROCEEDINGS OF THE ACM ON INTERACTIVE MOBILE WEARABLE AND UBIQUITOUS TECHNOLOGIES-IMWU -
dc.citation.volume 6 -
dc.contributor.author Huh, Jun Ho -
dc.contributor.author Shin, Hyejin -
dc.contributor.author Kim, Hongmin -
dc.contributor.author Cheon, Eunyong -
dc.contributor.author Song, Youngeun -
dc.contributor.author Lee, Choong-hoon -
dc.contributor.author Oakley, Ian -
dc.date.accessioned 2023-12-21T13:12:34Z -
dc.date.available 2023-12-21T13:12:34Z -
dc.date.created 2023-01-06 -
dc.date.issued 2023-01 -
dc.description.abstract PIN and pattern lock are difficult to enter accurately on small watch screens and are vulnerable to guessing attacks. To address these problems, this paper proposes a novel implicit biometric scheme based on through-wrist acoustic responses. A cue signal is played on a surface transducer mounted on the dorsal wrist and the acoustic response is recorded by a contact microphone on the volar wrist. We build classifiers using these recordings for each of three simple hand poses (relax, fist, and open), and use an ensemble approach to make final authentication decisions. In an initial single-session study (N=25), we achieve an Equal Error Rate (EER) of 0.01%, substantially outperforming prior on-wrist biometric solutions. A subsequent five recall-session study (N=20) shows reduced performance with 5.06% EER. We attribute this to increased variability in how participants perform hand poses over time. However, after retraining the classifiers, performance improved substantially, ultimately achieving 0.79% EER. We observed the most variability with the relax pose. Consequently, we achieve the most reliable multi-session performance by combining the fist and open poses: 0.51% EER. Further studies elaborate on these basic results. A usability evaluation reveals that users experience low workload, report high SUS scores, and report fluctuating levels of perceived exertion: moderate during initial enrollment, dropping to slight during authentication. A final study examining performance in various poses and in the presence of noise demonstrates that the system is robust to such disturbances and likely to work well in a wide range of real-world contexts. -
dc.identifier.bibliographicCitation PROCEEDINGS OF THE ACM ON INTERACTIVE MOBILE WEARABLE AND UBIQUITOUS TECHNOLOGIES-IMWU, v.6, no.4, pp.167 -
dc.identifier.doi 10.1145/3569473 -
dc.identifier.issn 2474-9567 -
dc.identifier.scopusid 2-s2.0-85146417101 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/60874 -
dc.language English -
dc.publisher Association for Computing Machinery (ACM) -
dc.title WristAcoustic: Through-Wrist Acoustic Response Based Authentication for Smartwatches -
dc.type Article -
dc.description.isOpenAccess FALSE -
dc.type.docType Article -
dc.description.journalRegisteredClass scopus -
dc.subject.keywordAuthor acoustic response -
dc.subject.keywordAuthor bone conduction -
dc.subject.keywordAuthor smartwatch authentication -
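
The abstract above describes a decision pipeline in which per-pose classifier outputs are fused by an ensemble and performance is reported as an Equal Error Rate (EER). The Python sketch below is purely illustrative and is not the authors' implementation: it shows a simple score-averaging fusion across three hypothetical pose classifiers and a standard EER computation over synthetic genuine/impostor score distributions. All function names, parameters, and numbers are invented for illustration.

```python
# Hypothetical sketch only: illustrates score fusion and EER computation,
# not the WristAcoustic authors' actual method or code.
import numpy as np


def fuse_pose_scores(pose_scores):
    """Average the match scores from the per-pose classifiers
    (e.g. relax, fist, open). Higher scores suggest a genuine user."""
    return float(np.mean(pose_scores))


def compute_eer(genuine_scores, impostor_scores):
    """Sweep decision thresholds and return the error rate at the point
    where the false accept rate (FAR) and false reject rate (FRR) are closest."""
    thresholds = np.sort(np.concatenate([genuine_scores, impostor_scores]))
    best_gap, eer = np.inf, 1.0
    for t in thresholds:
        far = np.mean(impostor_scores >= t)  # impostors wrongly accepted
        frr = np.mean(genuine_scores < t)    # genuine users wrongly rejected
        gap = abs(far - frr)
        if gap < best_gap:
            best_gap, eer = gap, (far + frr) / 2.0
    return eer


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic score distributions standing in for the ensemble's output.
    genuine = rng.normal(0.8, 0.1, 1000)
    impostor = rng.normal(0.3, 0.1, 1000)
    print(f"EER: {compute_eer(genuine, impostor):.4%}")
    print("Fused score:", fuse_pose_scores([0.91, 0.85, 0.88]))
```

As a usage note, an authentication decision under these assumptions would simply compare the fused score against the threshold found at the EER operating point (or any other threshold chosen for a desired FAR/FRR trade-off).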
