A linear dynamic affect-expression model: Facial expressions according to perceived emotions in mascot-type facial robots

Author(s)
Lee, Hui Sung; Park, Jeong Woo; Jo, Su Ho; Chung, Myung Jin
Issued Date
2007-08-26
DOI
10.1109/ROMAN.2007.4415158
URI
https://scholarworks.unist.ac.kr/handle/201301/34480
Fulltext
http://ieeexplore.ieee.org/document/4415158/
Citation
16th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN, pp.619 - 624
Abstract
It is expected that robots will be widely exposed to humans in the near future. Emotional communication is as important in human-robot interaction as it is in human-human interaction. Facial expression is a method of emotional expression between humans, and it also enables humans to recognize a robot's emotional state. Although there is much previous research on facial expressions, it is not easily applicable to robot faces of different shapes when the number and types of the robot's control points vary. In addition, the natural transition between emotions has either not been considered or has been implemented inefficiently in previous research. In this paper, we propose a linear dynamic affect-expression model for continuous change of expression and for diversifying the characteristics of expressional changes. The proposed model allows a robot's facial expressions and expression changes to more closely resemble those of humans, and it is applicable to various mascot-type robots irrespective of the number and types of the robot's control points.
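The paper's model is not detailed in this record, but the core idea the abstract describes (driving a robot's facial control points continuously toward the configuration associated with a perceived emotion, so transitions between expressions look natural) can be sketched as a first-order linear dynamic system. All names, dimensions, gains, and the "happy" target vector below are hypothetical, and this is only an illustration of the general technique, not the authors' actual formulation:

```python
import numpy as np

def step_expression(x, x_target, A, dt=0.05):
    """One Euler step of a linear dynamic expression model.

    x        : current control-point vector of the robot face
    x_target : control-point vector for the target emotion
    A        : gain matrix shaping the character (speed/coupling)
               of the expressional change

    The state relaxes toward the target: dx/dt = A (x_target - x).
    """
    return x + dt * (A @ (x_target - x))

# Hypothetical 4-control-point face: neutral drifting toward "happy".
x = np.zeros(4)
target = np.array([1.0, 0.5, -0.3, 0.8])
A = np.eye(4) * 2.0  # uniform gain; varying A diversifies the transition

for _ in range(200):
    x = step_expression(x, target, A)
```

Because the update depends only on the length of the control-point vector, the same loop applies unchanged to faces with any number of control points, which mirrors the portability claim in the abstract.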
Publisher
16th IEEE International Conference on Robot and Human Interactive Communication, RO-MAN


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.