Take My Hand: Automated Hand-Based Spatial Guidance for the Visually Impaired

Author(s)
Rahman, Adil; Azim, Md Aashikur Rahman; Heo, Seongkook
Issued Date
2023-04-23
DOI
10.1145/3544548.3581415
URI
https://scholarworks.unist.ac.kr/handle/201301/91151
Fulltext
https://dl.acm.org/doi/full/10.1145/3544548.3581415
Citation
ACM CHI Conference on Human Factors in Computing Systems
Abstract
Tasks that involve locating objects and then moving the hands to those specific locations, such as using touchscreens or grabbing objects on a desk, are challenging for the visually impaired. Over the years, audio guidance and haptic feedback have been staples of hand-navigation-based assistive technologies. However, these methods require the user to interpret the generated directional cues and then manually perform the hand motions. In this paper, we present automated hand-based spatial guidance to bridge the gap between guidance and execution, allowing visually impaired users to move their hands between two points automatically, without any manual effort. We implement this concept through FingerRover, an on-finger miniature robot that carries the user’s finger to target points. We demonstrate potential applications that can benefit from automated hand-based spatial guidance. Our user study shows the potential of our technique for improving the interaction capabilities of people with visual impairments.
Publisher
ACM

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.