
A mascot-type facial robot with a linear dynamic affect-expression model

Author(s)
Lee, Hui Sung; Park, J.W.; Jo, S.H.; Kim, M.-G.; Lee, W.; Chung, M.J.
Issued Date
2008-07-06
DOI
10.3182/20080706-5-KR-1001.1895
URI
https://scholarworks.unist.ac.kr/handle/201301/34476
Citation
17th World Congress, International Federation of Automatic Control, IFAC
Abstract
Interest in emotional interaction between robots and humans has been growing in human-robot interaction research, along with the expansion of work on human-friendly robots. Facial expression is the most instinctive and immediate way to convey emotion compared with other channels such as voice, gesture, and skin color. People can intuitively recognize a robot's condition if the robot shows its internal state through facial expressions. Although emotional facial expression has been attempted in many humanoids and androids, the results have not reached a level that satisfies ordinary people. The objective of this research is therefore to design and implement a mascot-type facial robot that can show emotional expressions effectively. In addition, a linear dynamic affect-expression model is applied to the robot so that it can display continuous and varied expressions. A mascot-type design was chosen over android and mechanical types because it more easily avoids the uncanny valley effect and achieves higher familiarity. A small digital controller was implemented and installed in the designed outer form to control ten small DC motors and two LEDs. The developed robot can show rich expressions using the LEDs and a dynamic neck driven by two brushless DC motors. Two USB cameras are installed in the robot's eyes. The robot system connects to an ordinary personal computer or laptop over wireless LAN, which is used to transfer affect and visual data from a remote location. An operator can control the various facial expressions and LEDs using only three parameters based on the linear dynamic affect-expression model. The facial robot system can therefore be used as a guiding or monitoring system with emotional expressions.
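The three-parameter control scheme described in the abstract can be sketched as follows. This is a minimal illustration only: the affect-parameter names, the mapping matrix values, and the first-order dynamics are assumptions made for the sketch, not details taken from the paper.

```python
# Hypothetical sketch of a linear dynamic affect-expression model:
# three affect parameters are mapped linearly to actuator targets, and
# each actuator state moves toward its target with first-order linear
# dynamics, producing smooth, continuous expression transitions.

NUM_ACTUATORS = 12  # 10 small DC motors + 2 LED channels (per the abstract)

# Assumed linear mapping W (12 x 3): each row maps the 3 affect
# parameters to one actuator's target; the values are illustrative.
W = [[0.5, 0.2, -0.1]] * 10 + [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]]

def step(state, affect, alpha=0.2):
    """One control tick: move each actuator a fraction alpha of the way
    toward its linear target (W applied to the affect vector)."""
    targets = [sum(w * a for w, a in zip(row, affect)) for row in W]
    return [x + alpha * (t - x) for x, t in zip(state, targets)]

# Drive a neutral face toward an assumed affect vector over 50 ticks.
state = [0.0] * NUM_ACTUATORS
for _ in range(50):
    state = step(state, affect=[1.0, 0.5, 0.0])
```

Because each tick is a linear update, intermediate states interpolate smoothly between expressions, which matches the abstract's goal of continuous rather than switched expression changes.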
Publisher
17th World Congress, International Federation of Automatic Control, IFAC
ISSN
1474-6670


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.