| dc.description.abstract |
The introduction of smart glasses to the general public as a strong contender for next-generation innovation has begun, and they may become a viable alternative to smartphone use while walking. However, understanding of smart glasses use while walking remains limited, and a proactive understanding is needed to ensure their safe adoption and development. This research aimed to characterize users’ biomechanical responses, guided by two main questions: Q1) Does text displayed in different areas of the smart glasses display influence users’ head orientation and walking patterns? Q2) Does video displayed full-screen on smart glasses affect users’ head orientation and walking patterns? In a treadmill-based experiment, 25 participants read text and watched video content displayed on smart glasses while walking. Using a 3D motion capture system, participants’ head orientation, pelvis position, cadence, and left/right step length were tracked, and within-participant changes across the text and video display conditions were compared. The findings reveal significant effects of text display on head rotation: participants rotated their heads towards the displayed text, potentially obstructing their forward field of view. Notably, participants exhibited habitual head rotations, yet because the display is head-mounted, these rotations could not be compensated by natural eye-head coordination. Conversely, full-screen video did not induce head rotation but was associated with slower walking speeds and increased gait variability, suggesting cognitive load. The shifted field of view, cognitive load, and unfamiliar usability are potential risks of using smart glasses while walking, and this study underscores the importance of establishing appropriate safety guidelines and exploring software-based compensation methods to enhance user experience and safety. |