

Your Hands Can Tell: Detecting Redirected Hand Movements in Virtual Reality

Author(s)
Azim, Md Aashikur Rahman; Su, Zihao; Heo, Seongkook
Issued Date
2025-04-26
DOI
10.1145/3706598.3713679
URI
https://scholarworks.unist.ac.kr/handle/201301/91146
Fulltext
https://dl.acm.org/doi/full/10.1145/3706598.3713679
Citation
ACM CHI Conference on Human Factors in Computing Systems
Abstract
In-air hand interactions are prevalent in Virtual Reality (VR), and prior studies have shown that manipulating the visual movement of the hand to differ from the actual hand movement, i.e., hand redirection, can create a more immersive and engaging VR experience. However, this manipulation risks degrading task performance and, if maliciously applied, poses a threat to user safety. Such manipulations may be introduced by VR applications either intentionally or inadvertently, and can yield harmful outcomes. We advocate for a user’s prerogative to be informed of any such potential manipulations before using an application. To address this, our study introduces an Autoencoder-based anomaly detection technique that leverages users’ inherent hand movements to identify hand redirection, thereby preserving the integrity of application use. Our model is trained on regular (i.e., non-manipulated) hand movement patterns and employs a stochastic thresholding approach for anomaly detection. We validated our method through a technical evaluation involving 21 participants engaged in reaching tasks under manipulated and non-manipulated scenarios. The results demonstrated a high accuracy of hand redirection detection at 93.7%, with an F1-score of 93.9%.
Publisher
ACM
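The detection pipeline the abstract describes — train a model only on regular (non-manipulated) hand movements, then flag movements whose reconstruction error exceeds a threshold derived from the training-error distribution — can be illustrated with a minimal sketch. This is not the paper's implementation: it substitutes a linear (PCA-style) autoencoder for the paper's Autoencoder and a simple Gaussian mean-plus-3-sigma cutoff for its stochastic thresholding, and all data, feature dimensions, and function names below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_linear_autoencoder(X, n_components=2):
    """Hypothetical stand-in for the paper's autoencoder: a tied-weight
    linear autoencoder, i.e., PCA projection onto the top components."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    W = Vt[:n_components]  # shared encoder/decoder weights
    return mu, W

def reconstruction_error(X, mu, W):
    Z = (X - mu) @ W.T      # encode
    X_hat = Z @ W + mu      # decode
    return np.linalg.norm(X - X_hat, axis=1)

# Synthetic "regular" movement features: samples near a 2-D subspace of R^5.
Z = rng.normal(size=(500, 2))
A = rng.normal(size=(2, 5))
X_train = Z @ A + 0.05 * rng.normal(size=(500, 5))

mu, W = fit_linear_autoencoder(X_train)
train_errors = reconstruction_error(X_train, mu, W)

# Simplified threshold (assumed form, not the paper's stochastic method):
# treat training errors as roughly Gaussian and flag anything beyond 3 sigma.
threshold = train_errors.mean() + 3.0 * train_errors.std()

def is_redirected(x):
    """Flag a single movement-feature vector as redirected (anomalous)."""
    return reconstruction_error(x[None, :], mu, W)[0] > threshold

# A regular movement lies in the learned subspace; a redirected one is
# perturbed off it and should produce a large reconstruction error.
normal_sample = Z[0] @ A
redirected_sample = normal_sample + np.array([2.0, -2.0, 2.0, -2.0, 2.0])
```

The key design point the sketch preserves is that only regular movements are needed for training: redirection never has to be modeled explicitly, because it surfaces as reconstruction error on inputs unlike anything seen during training.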

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.