File Download

There are no files associated with this item.

Related Researcher

이희승

Lee, Hui Sung
Design & Electronic Convergence System Lab.




Full metadata record

DC Field Value Language
dc.citation.conferencePlace KO -
dc.citation.conferencePlace Seoul -
dc.citation.title 17th World Congress, International Federation of Automatic Control, IFAC -
dc.contributor.author Lee, Hui Sung -
dc.contributor.author Park, J.W. -
dc.contributor.author Jo, S.H. -
dc.contributor.author Kim, M.-G. -
dc.contributor.author Lee, W. -
dc.contributor.author Chung, M.J. -
dc.date.accessioned 2023-12-20T04:36:54Z -
dc.date.available 2023-12-20T04:36:54Z -
dc.date.created 2017-04-06 -
dc.date.issued 2008-07-06 -
dc.description.abstract Interest in emotional interaction between robots and humans has recently been increasing in human-robot interaction, along with the expansion of research on human-friendly robots. Facial expression is the most instinctive and immediate way to express emotion compared with other expressive channels such as voice, gesture, and skin color. People can intuitively recognize a robot's condition if the robot shows its internal state through facial expressions. Although emotional facial expressions have been attempted in many humanoids and androids, the results have not reached a level that satisfies ordinary people. Therefore, the objective of this research is to design and implement a mascot-type facial robot that can show emotional expressions effectively. In addition, a linear dynamic affect-expression model is applied to the facial robot to display continuous and varied expressions. The robot is designed as a mascot type, differing from android and mechanical types, since a mascot-type design more easily avoids the uncanny valley effect and achieves higher familiarity. A small digital controller is implemented and installed in the designed outer form to control ten small DC motors and two LEDs. The developed robot is capable of showing rich expressions with the LEDs and dynamic neck motion driven by two brushless DC motors. Two USB cameras are installed in the robot's eyes. The robot system can be connected to common personal computers or laptops by wireless LAN, which is used for transferring affect and visual data from a remote place. An operator can easily control the various facial expressions and LEDs using only three parameters based on the linear dynamic affect-expression model. Therefore, the facial robot system can be used as a guiding or monitoring system with emotional expressions. -
dc.identifier.bibliographicCitation 17th World Congress, International Federation of Automatic Control, IFAC -
dc.identifier.doi 10.3182/20080706-5-KR-1001.1895 -
dc.identifier.issn 1474-6670 -
dc.identifier.scopusid 2-s2.0-79961018585 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/34476 -
dc.language English -
dc.publisher 17th World Congress, International Federation of Automatic Control, IFAC -
dc.title A mascot-type facial robot with a linear dynamic affect-expression model -
dc.type Conference Paper -
dc.date.conferenceDate 2008-07-06 -


Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.