File Download

There are no files associated with this item.

Related Researcher

이희승

Lee, Hui Sung
Design & Electronic Convergence System Lab.

Detailed Information


Full metadata record

DC Field Value Language
dc.citation.conferencePlace KO -
dc.citation.conferencePlace Jeju -
dc.citation.endPage 624 -
dc.citation.startPage 619 -
dc.citation.title 16th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN -
dc.contributor.author Lee, Hui Sung -
dc.contributor.author Park, Jeong Woo -
dc.contributor.author Jo, Su Ho -
dc.contributor.author Chung, Myung Jin -
dc.date.accessioned 2023-12-20T04:40:32Z -
dc.date.available 2023-12-20T04:40:32Z -
dc.date.created 2017-04-06 -
dc.date.issued 2007-08-26 -
dc.description.abstract It is expected that robots will be widely exposed to humans in the near future. Emotional communication is very important in human-robot interaction, as well as in human-human interaction. Facial expression is a method of emotional expression between humans, and it also enables humans to recognize a robot's emotional state. Although there is much previous research on facial expressions, it is not easily applicable to robot faces of different shapes if the number and types of the robot's control points vary. In addition, the natural transition between emotions has not been considered, or has been inefficiently implemented, in previous research. In this paper, we propose a linear dynamic affect-expression model for continuous change of expression and for diversifying the characteristics of expressional changes. The proposed model allows a robot's facial expression and expression changes to more closely resemble those of humans, and it is applicable to various mascot-type robots irrespective of the number and types of the robot's control points. -
dc.identifier.bibliographicCitation 16th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN, pp.619 - 624 -
dc.identifier.doi 10.1109/ROMAN.2007.4415158 -
dc.identifier.scopusid 2-s2.0-48749129158 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/34480 -
dc.identifier.url http://ieeexplore.ieee.org/document/4415158/ -
dc.language English -
dc.publisher 16th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN -
dc.title A linear dynamic affect-expression model: Facial expressions according to perceived emotions in mascot-type facial robots -
dc.type Conference Paper -
dc.date.conferenceDate 2007-08-26 -
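The abstract describes the model only at a high level. As a rough illustration (not the authors' implementation), a linear dynamic system can drive a robot's facial control points continuously toward emotion-dependent targets, which yields smooth transitions between expressions regardless of how many control points a particular face has. The control-point dimension, target vectors, and first-order update rule below are all illustrative assumptions:

```python
import numpy as np

# Hypothetical target control-point vectors for a few basis emotions.
# Values are made up; a real robot would calibrate these per face,
# and the vector length simply matches the robot's control points.
TARGETS = {
    "happy":   np.array([0.8, 0.6, 0.9, 0.3]),
    "sad":     np.array([0.1, 0.2, 0.0, 0.7]),
    "neutral": np.array([0.5, 0.5, 0.5, 0.5]),
}

def step(x, emotion_weights, dt=0.05, tau=0.3):
    """One first-order linear update toward a weighted emotion target.

    x               -- current control-point vector
    emotion_weights -- dict mapping emotion name to blend weight (sums to 1)
    tau             -- time constant; smaller means faster expression change
    """
    target = sum(w * TARGETS[e] for e, w in emotion_weights.items())
    # Linear dynamics: move a fixed fraction of the remaining distance.
    return x + (dt / tau) * (target - x)

# Drive the face from neutral toward a 70/30 happy/sad blend.
x = TARGETS["neutral"].copy()
for _ in range(100):
    x = step(x, {"happy": 0.7, "sad": 0.3})
```

Because the update is linear in both the state and the emotion weights, intermediate frames are themselves valid blended expressions, and changing `tau` diversifies how abruptly or gently the expression shifts.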

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.