Related Researcher


Oakley, Ian
Interactions Lab.

Detailed Information


Can Eye Gaze Improve Emotional State Detection on Off-the-Shelf Smart Devices?

Author(s)
Kim, Jiwan; Lee, Doyoung; Kim, Jaeho; Oakley, Ian
Issued Date
2023-02-13
DOI
10.1109/BigComp57234.2023.00090
URI
https://scholarworks.unist.ac.kr/handle/201301/74884
Citation
IEEE International Conference on Big Data and Smart Computing, pp. 378-380
Abstract
Smartphones and wearable technology have revolutionized digital healthcare through the use of rich sensor data. Digital phenotyping, a field that uses smartphone data to detect or recognize cognitive, behavioral, or affective states and traits, is often based on data from sensors (such as inertial or touch sensors), activity or user logs, and user-generated content. In this paper, we propose the use of eye gaze as a new digital biomarker for affect detection, leveraging the advanced capabilities of off-the-shelf smart devices. We designed two studies in an Instagram use scenario to detect emotional state changes using gaze data. The first study took place in a controlled setting and helped us understand the value of gaze features for affect detection; in it, we achieved a peak accuracy of 76.4% for binary valence detection. The second study will be conducted over a longer period, in real-world settings, and with a larger population to assess the effectiveness of our approach. By including this new sensing modality in affective digital phenotyping, we plan to improve the reliability and robustness of emotional state detection. © 2023 IEEE.
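The abstract describes, but does not show, the binary valence classification performed in the first study. As a rough illustrative sketch only, the snippet below trains a simple linear classifier on synthetic gaze features and reports cross-validated accuracy; the feature names, values, and classifier choice are assumptions for illustration and are not taken from the paper.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic gaze features (all values invented for illustration):
# mean fixation duration (s), mean saccade amplitude (deg), blink rate (Hz)
rng = np.random.default_rng(0)
n = 200
low_valence = rng.normal([0.30, 4.0, 0.25], 0.05, size=(n // 2, 3))
high_valence = rng.normal([0.25, 5.0, 0.30], 0.05, size=(n // 2, 3))
X = np.vstack([low_valence, high_valence])
y = np.repeat([0, 1], n // 2)  # 0 = low valence, 1 = high valence

# 5-fold cross-validated accuracy of a logistic-regression classifier
clf = LogisticRegression()
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean accuracy: {scores.mean():.2f}")
```

In practice, per-subject splits rather than pooled cross-validation would be needed to avoid overestimating accuracy on affect-detection data.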
Publisher
Institute of Electrical and Electronics Engineers Inc.


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.