File Download

There are no files associated with this item.

Related Researcher

Kwon, Oh-Sang (권오상)
Perception, Action, & Learning Lab.


Full metadata record

DC Field Value Language
dc.citation.endPage 1416 -
dc.citation.startPage 1401 -
dc.citation.title COGNITIVE NEURODYNAMICS -
dc.citation.volume 17 -
dc.contributor.author Choi, Yun-Joo -
dc.contributor.author Kwon, Oh-Sang -
dc.contributor.author Kim, Sung-Phil -
dc.date.accessioned 2023-12-21T13:19:55Z -
dc.date.available 2023-12-21T13:19:55Z -
dc.date.created 2022-12-05 -
dc.date.issued 2023-12 -
dc.description.abstract Non-invasive brain-computer interfaces (BCIs) based on an event-related potential (ERP) component, P300, elicited via the oddball paradigm, have been extensively developed to enable device control and communication. While most P300-based BCIs employ visual stimuli in the oddball paradigm, auditory P300-based BCIs also need to be developed for users with unreliable gaze control or limited visual processing. Specifically, auditory BCIs without additional visual support or multi-channel sound sources can broaden the application areas of BCIs. This study aimed to design optimal stimuli for auditory BCIs among artificial (e.g., beep) and natural (e.g., human voice and animal sounds) sounds in such circumstances. In addition, it aimed to investigate differences between auditory and visual stimulations for online P300-based BCIs. As a result, natural sounds led to both higher online BCI performance and larger differences in ERP amplitudes between the target and non-target compared to artificial sounds. However, no single type of sound offered the best performance for all subjects; rather, each subject indicated different preferences between the human voice and animal sound. In line with previous reports, visual stimuli yielded higher BCI performance (average 77.56%) than auditory counterparts (average 54.67%). In addition, spatiotemporal patterns of the differences in ERP amplitudes between target and non-target were more dynamic with visual stimuli than with auditory stimuli. The results suggest that selecting a natural auditory stimulus optimal for individual users as well as making differences in ERP amplitudes between target and non-target stimuli more dynamic may further improve auditory P300-based BCIs. -
dc.identifier.bibliographicCitation COGNITIVE NEURODYNAMICS, v.17, pp.1401 - 1416 -
dc.identifier.doi 10.1007/s11571-022-09901-3 -
dc.identifier.issn 1871-4080 -
dc.identifier.scopusid 2-s2.0-85142264724 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/60101 -
dc.identifier.wosid 000885207500001 -
dc.language English -
dc.publisher SPRINGER -
dc.title Design of auditory P300-based brain-computer interfaces with a single auditory channel and no visual support -
dc.type Article -
dc.description.isOpenAccess FALSE -
dc.relation.journalWebOfScienceCategory Neurosciences -
dc.relation.journalResearchArea Neurosciences & Neurology -
dc.type.docType Article; Early Access -
dc.description.journalRegisteredClass scie -
dc.description.journalRegisteredClass scopus -
dc.subject.keywordAuthor Non-invasive brain-computer interface -
dc.subject.keywordAuthor Event-related potential -
dc.subject.keywordAuthor P300 -
dc.subject.keywordAuthor Auditory brain-computer interface -
dc.subject.keywordAuthor Sound design -
dc.subject.keywordPlus AMYOTROPHIC LATERAL SCLEROSIS -
dc.subject.keywordPlus P300 -
dc.subject.keywordPlus COMMUNICATION -
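
The "Full metadata record" above is a flattened Dublin Core table: each row holds a field name, its value, and a trailing dash standing in for the empty Language column, with repeating fields (authors, keywords) listed once per value. As a minimal, hypothetical sketch, and not part of the repository record itself, the Python snippet below parses a few of those rows into a dictionary; the record_lines sample and the parse_dc helper are illustrative names only (Python 3.9+ for str.removesuffix).

    from collections import defaultdict

    # A few rows copied verbatim from the record above.
    record_lines = """\
    dc.citation.endPage 1416 -
    dc.citation.startPage 1401 -
    dc.citation.title COGNITIVE NEURODYNAMICS -
    dc.contributor.author Choi, Yun-Joo -
    dc.contributor.author Kwon, Oh-Sang -
    dc.contributor.author Kim, Sung-Phil -
    dc.identifier.doi 10.1007/s11571-022-09901-3 -
    dc.subject.keywordAuthor P300 -
    dc.subject.keywordAuthor Sound design -
    """.splitlines()

    def parse_dc(lines):
        """Split each 'field value -' row into (field, value), drop the trailing
        placeholder for the empty Language column, and group repeated fields."""
        record = defaultdict(list)
        for line in lines:
            field, _, rest = line.strip().partition(" ")
            value = rest.removesuffix("-").strip()
            record[field].append(value)
        return dict(record)

    metadata = parse_dc(record_lines)
    print(metadata["dc.contributor.author"])
    # ['Choi, Yun-Joo', 'Kwon, Oh-Sang', 'Kim, Sung-Phil']
    print(metadata["dc.identifier.doi"][0])
    # 10.1007/s11571-022-09901-3

Repeated fields such as dc.contributor.author and dc.subject.keywordAuthor are collected into lists so that no values are silently overwritten.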

