Related Researcher

Lee, Hui Sung (이희승)
Design & Electronic Convergence System Lab.


Emotional boundaries for choosing modalities according to the intensity of emotion in a linear affect-expression space

Author(s)
Park, Jin Woo; Lee, Hui Sung; Jo, Su Hun; Kim, Min-gyu; Chung, Myung Jin
Issued Date
2008-08-01
DOI
10.1109/ROMAN.2008.4600670
URI
https://scholarworks.unist.ac.kr/handle/201301/34475
Fulltext
http://ieeexplore.ieee.org/document/4600670/
Citation
17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN, pp. 225-230
Abstract
Recently, multimodal expression has become an important issue in the field of HRI. Synchronizing modalities and determining which modalities to use are important aspects of multimodal expression. For example, when robots express emotional states, they may use facial expressions alone, or facial expressions combined with gestures, neck motions, sounds, etc. In this paper, emotional boundaries are proposed for multimodal expression in a three-dimensional affect space. The simultaneous expression of facial expressions and gestures was demonstrated on a simulator using the proposed emotional boundaries.
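
The abstract does not spell out how the boundaries partition the affect space; the Python sketch below is one plausible reading, assuming each boundary is a nested threshold on emotion intensity (the Euclidean norm of a 3D affect vector) and that crossing a boundary enables an additional modality. The axis meanings, threshold radii, and modality sets are illustrative assumptions, not values from the paper.

import math

# Hypothetical sketch (not from the paper): nested "emotional
# boundaries" in a three-dimensional affect space decide which output
# modalities a robot uses. Radii and modality sets are assumed.
BOUNDARIES = [
    (0.3, ["facial expression"]),
    (0.6, ["facial expression", "neck motion"]),
    (0.9, ["facial expression", "neck motion", "gesture"]),
    (float("inf"), ["facial expression", "neck motion", "gesture", "sound"]),
]

def select_modalities(affect):
    """Return the modality set for a point (x, y, z) in affect space."""
    # Treat the vector's Euclidean norm as the intensity of the emotion.
    intensity = math.sqrt(sum(c * c for c in affect))
    for radius, modalities in BOUNDARIES:
        if intensity <= radius:
            return modalities

# Weak emotion: face only; a stronger emotion recruits more modalities.
print(select_modalities((0.1, 0.1, 0.0)))   # ['facial expression']
print(select_modalities((0.5, 0.5, 0.4)))   # face + neck motion + gesture

Nesting the boundaries makes modality selection monotone in intensity, which is consistent with the abstract's example of stronger emotions being expressed through progressively more channels.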
Publisher
17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.