| dc.contributor.advisor |
Lee, Kyungho |
- |
| dc.contributor.author |
Han, Mingyu |
- |
| dc.date.accessioned |
2025-04-04T13:48:54Z |
- |
| dc.date.available |
2025-04-04T13:48:54Z |
- |
| dc.date.issued |
2025-02 |
- |
| dc.description.abstract |
With recent advancements in smart glasses and augmented reality (AR), dwell-based selection has become the underlying input technique powering the majority of eyes-only interactions in this modality. However, interactions that rely solely on eye movements are too simplistic to provide rich input to the computer, and they suffer from unintentional activations known as the Midas touch problem. Combining gaze with other modalities has been proposed as a solution to this issue. This thesis introduces EyeWithShut, a novel interaction technique that enhances interaction expressiveness with closed-eye gestures on smart glasses and other head-wearable devices. The research comprises two main experiments. The first experiment collected a dataset of approximately 400,000 closed-eye images and investigated whether different gaze positions can be distinguished while the eyes are closed. Using convolutional neural networks, the analysis focused on how factors such as the distance, width, and orientation of eye movements affect classification accuracy. Results showed significant differences in accuracy between far and near conditions, with far conditions yielding higher accuracy. In the second experiment, a two-stage model for closed-eye gesture estimation was developed. The first stage performs eyelid contour estimation using synthetic and real eye images, while the second stage estimates gestures from the detected eyelid movements. Different CNN architectures and fine-tuning strategies were evaluated to optimize performance on real-world closed-eye images. The findings demonstrate the potential of closed-eye gestures as a viable input method for head-wearable devices, offering a solution to unintentional inputs while maintaining user privacy and social acceptability. This research contributes to the growing field of eyes-only interaction techniques and opens new possibilities for user interfaces in head-wearable computing. |
- |
| dc.description.degree |
Master |
- |
| dc.description |
Department of Design |
- |
| dc.identifier.uri |
https://scholarworks.unist.ac.kr/handle/201301/86427 |
- |
| dc.identifier.uri |
http://unist.dcollection.net/common/orgView/200000865420 |
- |
| dc.language |
ENG |
- |
| dc.publisher |
Ulsan National Institute of Science and Technology |
- |
| dc.rights.embargoReleaseDate |
9999-12-31 |
- |
| dc.rights.embargoReleaseTerms |
9999-12-31 |
- |
| dc.subject |
Human-Computer Interaction |
- |
| dc.subject |
Head-Wearable Computing |
- |
| dc.subject |
Eye Tracking |
- |
| dc.subject |
Input Design |
- |
| dc.title |
EyeWithShut: Closed-Eye Gesture Interaction for Radial Menu Selection |
- |
| dc.type |
Thesis |
- |