Related Researcher

Oakley, Ian
Interactions Lab.

Detailed Information

Visual Guidance for a Spatial Discrepancy Problem of Encountered-Type Haptic Display

Author(s)
Lee, Chang-Gyu; Dunn, Gregory Lynn; Oakley, Ian; Ryu, Jeha
Issued Date
2020-04
DOI
10.1109/TSMC.2017.2719037
URI
https://scholarworks.unist.ac.kr/handle/201301/23980
Fulltext
https://ieeexplore.ieee.org/document/7982659/
Citation
IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS: SYSTEMS, v.50, no.4, pp.1384 - 1394
Abstract
In virtual environments, spatial discrepancies between visual and haptic scenes negatively impact user performance and experience. This paper shows how spatial discrepancies due to pose differences can occur in a haptic augmented virtuality system with an encountered-type haptic display. To mitigate this problem, we propose visual guidance, an algorithm that dynamically manipulates the visual scene to compensate for discrepancies. The effectiveness of this algorithm was verified in a pair of studies involving a button pressing task and spatial discrepancies between ±150 mm and ±40°. Experimental results show that discrepant trials using the technique yield error rates and a number of speed peaks (representing the number of targeting movements) that are comparable to those attained in trials with zero spatial discrepancy. This result was also achieved without requiring a dedicated adaptation or training process, ensuring the algorithm can be used immediately by users. A pair of follow-up studies also indicates the algorithm has little impact on subjective ratings of simulator sickness, suggesting that sporadic use of the algorithm will not negatively affect users' experience of a virtual environment. We believe that the visual guidance algorithm presented in this paper can be used to create more useful and compelling experiences in various haptic training applications incorporating encountered-type haptic displays.
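
As a rough illustration of the idea summarized above (dynamically manipulating the visual scene so the rendered hand meets the virtual target exactly when the real hand reaches the physical end effector), the minimal Python sketch below blends a corrective offset into the rendered hand position as the reach progresses. The function name, the linear progress-based blending, and all parameters are assumptions made for illustration; this is not the algorithm as published in the paper.

# Illustrative sketch only, not the authors' published implementation.
import numpy as np

def warped_hand_position(real_hand, physical_target, virtual_target, start_pos):
    """Blend a corrective offset into the rendered hand position.

    real_hand       : current tracked hand position (3-vector, metres)
    physical_target : pose of the encountered-type display's end effector
    virtual_target  : pose of the corresponding object in the visual scene
    start_pos       : hand position when the reach started (progress = 0 there)
    """
    real_hand = np.asarray(real_hand, dtype=float)
    physical_target = np.asarray(physical_target, dtype=float)
    virtual_target = np.asarray(virtual_target, dtype=float)
    start_pos = np.asarray(start_pos, dtype=float)

    # Spatial discrepancy between the visual and haptic scenes for this target.
    discrepancy = virtual_target - physical_target

    # Progress of the reach: 0 at the start position, 1 at the physical target.
    total = np.linalg.norm(physical_target - start_pos)
    remaining = np.linalg.norm(physical_target - real_hand)
    progress = 1.0 if total < 1e-6 else float(np.clip(1.0 - remaining / total, 0.0, 1.0))

    # Apply the full corrective offset only as the hand arrives, so the warp
    # stays subtle early in the movement and alignment is exact at contact.
    return real_hand + progress * discrepancy
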
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
ISSN
2168-2216
Keyword (Author)
Haptic interfaces; Visualization; Training; Tools; Robots; Augmented virtuality; Safety; Encountered-type haptic display; haptic augmented virtuality (HAV); performance evaluation; spatial discrepancy; visual guidance
Keyword
SICKNESS

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.