Music-aided affective interaction between human and service robot

Author(s)
Park, Jeong-Sik; Jang, Gil-Jin; Seo, Yong-Ho
Issued Date
2012-01
DOI
10.1186/1687-4722-2012-5
URI
https://scholarworks.unist.ac.kr/handle/201301/2559
Fulltext
http://www.scopus.com/inward/record.url?partnerID=HzOxMe3b&scp=84873858600
Citation
EURASIP JOURNAL ON AUDIO SPEECH AND MUSIC PROCESSING, v.2012, no.1
Abstract
This study proposes a music-aided framework for affective interaction between service robots and humans. The framework consists of three systems, for perception, memory, and expression, modeled on mechanisms of the human brain. Within the perception system, we propose a novel approach to identifying human emotions. Conventional approaches use speech and facial expressions as the two representative bimodal indicators for emotion recognition; our approach additionally uses the mood of music as a supplementary indicator to determine emotions more accurately. For multimodal emotion recognition, we propose an effective decision criterion that uses records of bimodal recognition results associated with the musical mood. The memory and expression systems also exploit musical data to produce natural, affective reactions to human emotions. To evaluate our approach, we simulated the proposed human-robot interaction on a service robot, iRobiQ. Our perception system outperformed the conventional approach, and most human participants responded favorably to the music-aided affective interaction.
Publisher
SPRINGER INTERNATIONAL PUBLISHING AG
ISSN
1687-4722
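
Note
The abstract describes a decision criterion in which the mood of the currently playing music supplements bimodal (speech plus facial) emotion recognition. The Python sketch below illustrates one way such music-aided fusion could work; the emotion label set, the mood-to-emotion prior, and the weighting parameter alpha are all illustrative assumptions, not the criterion actually defined in the paper.

    # Minimal, hypothetical sketch of music-aided multimodal emotion fusion.
    # All labels, priors, and weights below are illustrative assumptions,
    # not values taken from the paper.
    import numpy as np

    EMOTIONS = ["happy", "sad", "angry", "neutral"]  # assumed label set

    # Assumed prior: how strongly each musical mood suggests each emotion.
    MOOD_PRIOR = {
        "cheerful": np.array([0.50, 0.10, 0.10, 0.30]),
        "gloomy":   np.array([0.10, 0.50, 0.20, 0.20]),
        "neutral":  np.array([0.25, 0.25, 0.25, 0.25]),
    }

    def fuse(speech_probs, face_probs, music_mood, alpha=0.4):
        """Combine bimodal (speech + face) posteriors with a music-mood prior.

        alpha controls how much the musical mood influences the decision;
        it is an illustrative parameter, not one from the paper.
        """
        bimodal = (np.asarray(speech_probs) + np.asarray(face_probs)) / 2.0
        scores = (1 - alpha) * bimodal + alpha * MOOD_PRIOR[music_mood]
        return EMOTIONS[int(np.argmax(scores))], scores

    if __name__ == "__main__":
        # Speech and face nearly disagree; the musical mood breaks the tie.
        label, scores = fuse(
            speech_probs=[0.40, 0.35, 0.15, 0.10],
            face_probs=[0.30, 0.45, 0.15, 0.10],
            music_mood="gloomy",
        )
        print(label, scores)

Running this example prints "sad": the speech and face posteriors are nearly tied between happy and sad, and the gloomy musical mood tips the fused decision, which is the kind of supplementary role the abstract attributes to music.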
