File Download

There are no files associated with this item.

  • Find it @ UNIST provides direct access to the published full text of this article (UNISTARs only).
Related Researcher

Lee, Hui Sung (이희승)
Design & Electronic Convergence System Lab.

Detailed Information

Full metadata record

DC Field Value Language
dc.citation.endPage 462 -
dc.citation.number 3-4 -
dc.citation.startPage 443 -
dc.citation.title JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS -
dc.citation.volume 78 -
dc.contributor.author Park, Jeong Woo -
dc.contributor.author Lee, Hui Sung -
dc.contributor.author Chung, Myung Jin -
dc.date.accessioned 2023-12-22T01:08:59Z -
dc.date.available 2023-12-22T01:08:59Z -
dc.date.created 2017-03-29 -
dc.date.issued 2015-06 -
dc.description.abstract One factor that contributes to successful long-term human-human interaction is that humans are able to appropriately express their emotions depending on situations. Unlike humans, robots often lack diversity in facial expressions and gestures, and long-term human-robot interaction (HRI) has consequently not been very successful thus far. In this paper, we propose a novel method to generate diverse and more realistic robot facial expressions to help long-term HRI. First, nine basic dynamics for robot facial expressions are determined based on the dynamics of human facial expressions and principles of animation in order to generate natural and diverse expression changes in a facial robot for identical emotions. In the second stage, facial actions are added to express more realistic expressions, such as sniffling or wailing loudly corresponding to sadness, laughing aloud or smiling corresponding to happiness, etc. To evaluate the effectiveness of our approach, we compared the facial expressions of the developed robot with and without the proposed method. The results of the survey showed that the proposed method can help robots generate more realistic and diverse facial expressions. -
dc.identifier.bibliographicCitation JOURNAL OF INTELLIGENT & ROBOTIC SYSTEMS, v.78, no.3-4, pp.443 - 462 -
dc.identifier.doi 10.1007/s10846-014-0066-1 -
dc.identifier.issn 0921-0296 -
dc.identifier.scopusid 2-s2.0-84929521464 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/21797 -
dc.identifier.url https://link.springer.com/article/10.1007%2Fs10846-014-0066-1 -
dc.identifier.wosid 000354490300006 -
dc.language English -
dc.publisher SPRINGER -
dc.title Generation of Realistic Robot Facial Expressions for Human Robot Interaction -
dc.type Article -
dc.description.journalRegisteredClass scie -
dc.description.journalRegisteredClass scopus -
dc.subject.keywordAuthor Dynamics -
dc.subject.keywordAuthor Facial actions -
dc.subject.keywordAuthor Realistic facial expressions -
dc.subject.keywordAuthor Human robot interaction -
dc.subject.keywordPlus MODEL -

