
Whiskers: Exploring the Use of Ultrasonic Haptic Cues on the Face

Author(s)
Gil, Hyunjae; Son, Hyungki; Kim, Jin Ryong; Oakley, Ian
Issued Date
2018-04-21
DOI
10.1145/3173574.3174232
URI
https://scholarworks.unist.ac.kr/handle/201301/34270
Fulltext
https://dl.acm.org/citation.cfm?id=3174232
Citation
2018 CHI Conference on Human Factors in Computing Systems, CHI 2018
Abstract
Haptic cues are a valuable feedback mechanism for smart glasses. Prior work has shown how they can support navigation, deliver notifications and cue targets. However, a focus on actuation technologies such as mechanical tactors or fans has restricted the scope of research to a small number of cues presented at fixed locations. To move beyond this limitation, we explore perception of in-air ultrasonic haptic cues on the face. We present two studies examining the fundamental properties of localization, duration and movement perception on three facial sites suitable for use with glasses: the cheek, the center of the forehead, and above the eyebrow. The center of the forehead led to optimal performance with a localization error of 3.77 mm and accurate duration (80%) and movement perception (87%). We apply these findings in a study delivering eight different ultrasonic notifications and report mean recognition rates of up to 92.4% (peak: 98.6%). We close with design recommendations for ultrasonic haptic cues on the face.
Publisher
2018 CHI Conference on Human Factors in Computing Systems, CHI 2018


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.