Related Researcher

Kim, Sung-Phil (김성필)
Brain-Computer Interface Lab.


Full metadata record

DC Field Value Language
dc.citation.conferencePlace KO -
dc.citation.conferencePlace SOUTH KOREA -
dc.citation.endPage 2027 -
dc.citation.startPage 2021 -
dc.citation.title 2021 The 21st International Conference on Control, Automation and Systems -
dc.contributor.author Choi, Yunjoo -
dc.contributor.author Kim, Minju -
dc.contributor.author Kim, Jong-su -
dc.contributor.author Heo, Dojin -
dc.contributor.author Kim, Sung-Phil -
dc.date.accessioned 2024-01-31T21:35:58Z -
dc.date.available 2024-01-31T21:35:58Z -
dc.date.created 2021-12-25 -
dc.date.issued 2021-10-15 -
dc.description.abstract Non-visual P300-based brain-computer interfaces (BCIs) are needed for patients with unreliable gaze control and for healthy users in the presence of visual distractors. Auditory BCIs have been developed as an alternative but have reportedly shown relatively low performance. To elucidate this performance gap, this study investigated the feature domains of auditory and visual BCIs, along with an audiovisual BCI combining the two. In addition to an online test, a cross-modality assessment was conducted to compare the performance of the three modalities, revealing that classification performance dropped significantly whenever features from the auditory BCI were included. When comparing the features that differed significantly between target and non-target stimuli for each subject in each modality, individual differences in the selected features were more pronounced in the auditory BCI than in the others, meaning that features common across subjects were scarce for the auditory BCI. Moreover, the auditory modality showed the largest performance decrease between the online test and Leave-One-Subject-Out (LOSO) cross-validation conducted with the selected features. Our results suggest potential sources of the performance gap between auditory and visual BCIs in the context of the feature domain. -
dc.identifier.bibliographicCitation 2021 The 21st International Conference on Control, Automation and Systems, pp.2021 - 2027 -
dc.identifier.doi 10.23919/ICCAS52745.2021.9649793 -
dc.identifier.issn 2093-7121 -
dc.identifier.scopusid 2-s2.0-85124194641 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/76889 -
dc.identifier.url https://ieeexplore.ieee.org/document/9649793 -
dc.identifier.wosid 000750950700290 -
dc.publisher ICROS -
dc.title Comparisons of Auditory, Audiovisual, and Visual Modalities in Feature Domain for Auditory Brain-Computer Interfaces -
dc.type Conference Paper -
dc.date.conferenceDate 2021-10-12 -

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.