File Download

There are no files associated with this item.

Related Researcher

이희승 (Lee, Hui Sung)
Design & Electronic Convergence System Lab.

Detailed Information

Full metadata record

dc.citation.endPage: 12
dc.citation.number: 1
dc.citation.startPage: 8
dc.citation.title: Journal of Institute of Control, Robotics and Systems
dc.citation.volume: 30
dc.contributor.author: Park, Haeun
dc.contributor.author: Lee, Jiyeon
dc.contributor.author: Kim, Byounghern
dc.contributor.author: Dzhoroev, Temirlan
dc.contributor.author: Lee, Hui Sung
dc.date.accessioned: 2024-01-30T14:35:08Z
dc.date.available: 2024-01-30T14:35:08Z
dc.date.created: 2024-01-30
dc.date.issued: 2024-01
dc.description.abstract: Social robots commonly rely on facial expressions and gestures to convey emotions. However, many robots follow a predetermined sequence, executing a fixed set of facial animations and movements once an emotion is identified. This rigid approach can produce unnatural behavior when a new stimulus arrives during an ongoing emotional expression: the robot may ignore the stimulus until the current emotion is fully expressed, or cut the expression short and jump abruptly to the next one. To address this limitation, we implemented an emotion engine with a linear dynamic affect-expression model (LDAEM) that computes the emotion state from incoming stimuli and determines the corresponding facial expression and robot movements. Built on Ekman's six basic emotions, the engine drives 12 control points (CPs) for facial expression and 3 CPs for movement. Experimental results demonstrate that the expressed emotion adapts dynamically to stimuli. Notably, our approach allows smooth transitions between emotions even when a different emotional stimulus is introduced mid-expression, and it can be applied to other robotic systems, offering a versatile framework for emotional expression. (See the illustrative sketch after this record.)
dc.identifier.bibliographicCitation: Journal of Institute of Control, Robotics and Systems, v.30, no.1, pp.8 - 12
dc.identifier.doi: 10.5302/J.ICROS.2024.23.0133
dc.identifier.issn: 1976-5622
dc.identifier.scopusid: 2-s2.0-85183033127
dc.identifier.uri: https://scholarworks.unist.ac.kr/handle/201301/74409
dc.language: English
dc.publisher: 제어·로봇·시스템학회 (Institute of Control, Robotics and Systems)
dc.title.alternative: Developing a Dynamic Expression Model That Can Simultaneously Control Robot's Facial and Movement Expressions
dc.title: Developing a Dynamic Expression Model That Can Simultaneously Control Robot's Facial and Movement Expressions
dc.type: Article
dc.description.isOpenAccess: FALSE
dc.identifier.kciid: ART003037574
dc.type.docType: Article
dc.description.journalRegisteredClass: scopus
dc.description.journalRegisteredClass: kci
dc.subject.keywordAuthor: computational emotion model
dc.subject.keywordAuthor: human-robot interaction
dc.subject.keywordAuthor: social robot
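The abstract sketches a control loop: stimuli update a continuous emotion state, and that state is mapped to facial and movement control points, so a new stimulus bends the trajectory rather than waiting in a queue. Below is a minimal, hypothetical Python sketch of that idea, assuming a discrete-time linear update (state = A·state + B·stimulus) with placeholder output matrices; the class name, matrix values, and coupling structure are illustrative assumptions, not the paper's implementation.

# Hypothetical LDAEM-style sketch based only on the abstract above.
# Dimensions follow the paper (6 emotions, 12 facial CPs, 3 movement CPs);
# everything else (matrices, decay/gain values, names) is assumed.
import numpy as np

EMOTIONS = ["happiness", "sadness", "anger", "fear", "surprise", "disgust"]  # Ekman's six

class LDAEMSketch:
    def __init__(self, decay=0.9, gain=0.5, seed=0):
        n = len(EMOTIONS)
        self.state = np.zeros(n)                    # current emotion intensities in [0, 1]
        self.A = decay * np.eye(n)                  # linear decay toward neutral
        self.B = gain * np.eye(n)                   # stimulus coupling (assumed diagonal)
        rng = np.random.default_rng(seed)
        self.C_face = rng.uniform(-1, 1, (12, n))   # placeholder map: emotion -> 12 facial CPs
        self.C_move = rng.uniform(-1, 1, (3, n))    # placeholder map: emotion -> 3 movement CPs

    def step(self, stimulus):
        # One discrete-time update: the state decays linearly while new
        # stimuli blend in, so a mid-expression stimulus shifts the output
        # smoothly instead of being ignored or causing an abrupt jump.
        s = np.asarray(stimulus, dtype=float)
        self.state = np.clip(self.A @ self.state + self.B @ s, 0.0, 1.0)
        return self.C_face @ self.state, self.C_move @ self.state

engine = LDAEMSketch()
joy = np.array([1, 0, 0, 0, 0, 0], float)   # sustained "happiness" stimulus
fear = np.array([0, 0, 0, 1, 0, 0], float)  # interrupting "fear" stimulus
for t in range(6):
    face_cps, move_cps = engine.step(joy if t < 3 else fear)
    print(t, np.round(engine.state, 2))     # happiness rises, then decays as fear takes over

Because the update is linear in the state, transitions between emotions are continuous by construction, which matches the smooth mid-expression transitions the abstract reports.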

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.