Related Researcher

Oh, Hyondong (오현동)
Autonomous Systems Lab.

Detailed Information

Enhanced location tracking in sensor fusion-assisted virtual reality micro-manipulation environments

Author(s)
Prada, John David Prieto; Im, Jintaek; Oh, Hyondong; Song, Cheol
Issued Date
2021-12
DOI
10.1371/journal.pone.0261933
URI
https://scholarworks.unist.ac.kr/handle/201301/55898
Fulltext
https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0261933
Citation
PLOS ONE, v.16, no.12, e0261933
Abstract
Virtual reality (VR) technology plays a significant role in many biomedical applications. Such VR scenarios enhance the experience of tasks that require great accuracy with human subjects. Unfortunately, commercial VR controllers suffer from large positioning errors in micro-manipulation tasks. Here, we propose a VR-based framework along with a sensor fusion algorithm to improve the micro-position tracking performance of a microsurgical tool. To the best of our knowledge, this is the first application of a Kalman filter in a millimeter-scale VR environment, fusing position data from the VR controller and an inertial measuring device. This study builds and tests two cases: (1) location tracking without sensor fusion and (2) location tracking with active sensor fusion. The static and dynamic experiments demonstrate that the Kalman filter can provide greater precision during micro-manipulation in small-scale VR scenarios.
Publisher
Public Library of Science
ISSN
1932-6203
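
The abstract describes improving millimeter-scale tracking by fusing the VR controller's noisy position readings with data from an inertial sensor through a Kalman filter. As a rough illustration of that general idea only (not the paper's actual formulation, which is not reproduced in this record), the Python sketch below runs a 1-D constant-velocity Kalman filter in which an assumed IMU acceleration drives the prediction step and a noisy controller position corrects it. The sample period, noise covariances, simulated 5 mm target, and the kalman_step helper are all illustrative assumptions.

# Minimal 1-D Kalman filter sketch for fusing a noisy position measurement
# (e.g. a VR controller) with an acceleration input (e.g. an IMU).
# Generic illustration, not the paper's formulation; all values are assumed.
import numpy as np

dt = 0.01                                  # sample period [s] (assumed)
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity state transition
B = np.array([[0.5 * dt**2], [dt]])        # maps acceleration into the state
H = np.array([[1.0, 0.0]])                 # only position is measured
sigma_z = 1e-3                             # assumed controller noise std: 1 mm
Q = 1e-8 * np.eye(2)                       # assumed process noise covariance
R = np.array([[sigma_z**2]])               # measurement noise covariance

x = np.zeros((2, 1))                       # state: [position, velocity]
P = np.eye(2)                              # state covariance

def kalman_step(z_controller, a_imu):
    """One predict/update cycle: IMU acceleration drives the prediction,
    the VR controller position corrects it."""
    global x, P
    # Predict using the IMU acceleration as a control input.
    x = F @ x + B * a_imu
    P = F @ P @ F.T + Q
    # Update with the controller's position measurement.
    y = np.array([[z_controller]]) - H @ x     # innovation
    S = H @ P @ H.T + R                        # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return float(x[0, 0])                      # fused position estimate

# Example: a static target at 5 mm, observed with 1 mm controller noise.
rng = np.random.default_rng(0)
est = 0.0
for _ in range(200):
    z = 5e-3 + rng.normal(0.0, sigma_z)        # noisy controller reading [m]
    est = kalman_step(z, a_imu=0.0)
print(f"fused position estimate: {est * 1e3:.3f} mm")

A real implementation would extend the state to three axes and tune Q and R from the measured noise of the controller and the inertial device rather than the assumed values above.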


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.