Detailed Information


EyeWithShut: Closed-Eye Gesture Interaction for Radial Menu Selection

Author(s)
Han, Mingyu
Advisor
Lee, Kyungho
Issued Date
2025-02
URI
https://scholarworks.unist.ac.kr/handle/201301/86427
http://unist.dcollection.net/common/orgView/200000865420
Abstract
Recent advancements in smart glasses and augmented reality (AR) have made dwell-based selection the dominant input technique for eyes-only interaction in this modality. However, interactions that rely solely on eye movements are too limited in expressiveness and are prone to unintentional activation, a problem known as the Midas touch. Multimodal approaches that combine gaze with other input channels have been proposed to address this issue. This thesis introduces EyeWithShut, a novel interaction technique that enhances interaction expressiveness through closed-eye gestures on smart glasses and other head-wearable devices. The research comprises two main experiments. The first experiment collected a dataset of approximately 400,000 closed-eye images and investigated whether different gaze positions can be distinguished while the eyes are closed. Using convolutional neural networks, the analysis examined how factors such as the distance, width, and orientation of eye movements affect classification accuracy. Results showed significant differences in accuracy between far and near conditions, with the far condition yielding higher accuracy. In the second experiment, a two-stage model for closed-eye gesture estimation was developed: the first stage estimates the eyelid contour from synthetic and real eye images, while the second stage estimates gestures from the detected eyelid movements. Different CNN architectures and fine-tuning strategies were evaluated to optimize performance on real-world closed-eye images. The findings demonstrate the potential of closed-eye gestures as a viable input method for head-wearable devices, offering a solution to unintentional inputs while maintaining user privacy and social acceptability. This research contributes to the growing field of eyes-only interaction techniques and opens new possibilities for user interfaces in head-wearable computing.
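The two-stage pipeline described in the abstract can be illustrated with a minimal sketch. Stage 1 (CNN-based eyelid contour estimation trained on synthetic and real eye images) is stubbed out here; the sketch only shows how stage 2 might map contour motion between frames to one of the four directions a radial menu selection needs. All function names, thresholds, and coordinate conventions are illustrative assumptions, not the thesis's actual implementation.

```python
# Hedged sketch of stage 2 of a closed-eye gesture pipeline.
# Assumes image coordinates (y grows downward) and that stage 1 has
# already produced an eyelid contour as a list of (x, y) points.

import math

def contour_centroid(contour):
    """Mean position of an eyelid contour given as (x, y) points."""
    n = len(contour)
    return (sum(p[0] for p in contour) / n, sum(p[1] for p in contour) / n)

def estimate_gesture(contour_t0, contour_t1, min_move=1.0):
    """Classify a closed-eye gesture from contour displacement.

    Returns 'left', 'right', 'up', 'down', or 'none' -- the four
    directions a radial menu would map to selections, plus a rest state
    to suppress Midas-touch-style unintentional input.
    """
    x0, y0 = contour_centroid(contour_t0)
    x1, y1 = contour_centroid(contour_t1)
    dx, dy = x1 - x0, y1 - y0
    if math.hypot(dx, dy) < min_move:   # below threshold: no gesture
        return "none"
    if abs(dx) >= abs(dy):              # dominant horizontal motion
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"   # dominant vertical motion

# Example: the eyelid contour shifts 5 px to the right between frames.
frame0 = [(0, 0), (2, 1), (4, 0)]
frame1 = [(5, 0), (7, 1), (9, 0)]
print(estimate_gesture(frame0, frame1))  # → right
```

The `min_move` dead zone is one simple way to reflect the abstract's concern with unintentional inputs: small involuntary eye movements are classified as `none` rather than as a menu selection.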
Publisher
Ulsan National Institute of Science and Technology
Degree
Master
Major
Department of Design
Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.