Detailed Information


Decoding Imagined Musical Pitch From Human Scalp Electroencephalograms

Author(s)
Chung, Miyoung; Kim, Taehyung; Jeong, Eunju; Chung, Chun Kee; Kim, June Sic; Kwon, Oh-Sang; Kim, Sung-Phil
Issued Date
2023-05
DOI
10.1109/TNSRE.2023.3270175
URI
https://scholarworks.unist.ac.kr/handle/201301/64805
Citation
IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, v.31, pp.2154 - 2163
Abstract
Brain-computer interfaces (BCIs) can restore impaired cognitive functions in people with neurological disorders such as stroke. Musical ability is a cognitive function that is correlated with non-musical cognitive functions, and restoring it can enhance other cognitive functions. Pitch sense is the function most relevant to musical ability according to previous studies of amusia, and thus decoding pitch information is crucial for BCIs to be able to restore musical ability. This study evaluated the feasibility of decoding pitch imagery information directly from human electroencephalography (EEG). Twenty participants performed a random imagery task with seven musical pitches (C4-B4). We used two approaches to explore EEG features of pitch imagery: multiband spectral power at individual channels (IC) and differences between bilaterally symmetric channels (DC). The selected spectral power features revealed remarkable contrasts between left and right hemispheres, low- (<13 Hz) and high-frequency (>13 Hz) bands, and frontal and parietal areas. We classified two EEG feature sets, IC and DC, into seven pitch classes using five types of classifiers. The best classification performance for seven pitches was obtained using IC and a multiclass Support Vector Machine, with an average accuracy of 35.68 ± 7.47% (max. 50%) and an information transfer rate (ITR) of 0.37 ± 0.22 bits/sec. When grouping the pitches to vary the number of classes (K = 2-6), the ITR was similar across K and feature sets, suggesting the efficiency of DC. This study demonstrates for the first time the feasibility of decoding imagined musical pitch directly from human EEG.
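The abstract reports accuracy and an information transfer rate (ITR) in bits/sec for the 7-class pitch decoder. The abstract does not state how the ITR was computed, but the standard measure in BCI work is the Wolpaw formula, which converts the number of classes and the classification accuracy into bits per trial, then divides by trial duration. The sketch below is a minimal illustration of that formula; the `trial_sec` value is an assumption for illustration, not a figure from the paper.

```python
import math

def wolpaw_itr(n_classes: int, accuracy: float, trial_sec: float = 1.0) -> float:
    """Wolpaw ITR in bits/sec for an n_classes decoder.

    bits/trial = log2(N) + P*log2(P) + (1-P)*log2((1-P)/(N-1))

    n_classes: number of selectable classes (e.g. 7 pitches C4-B4)
    accuracy:  classification accuracy P, in (0, 1]
    trial_sec: duration of one trial in seconds (assumed, not from the paper)
    """
    n, p = n_classes, accuracy
    if p >= 1.0:
        # Perfect accuracy: both correction terms vanish.
        bits_per_trial = math.log2(n)
    else:
        bits_per_trial = (math.log2(n)
                          + p * math.log2(p)
                          + (1 - p) * math.log2((1 - p) / (n - 1)))
    return bits_per_trial / trial_sec

# Chance-level accuracy carries zero information, regardless of N:
print(wolpaw_itr(7, 1 / 7))   # ~0 bits/sec at chance for 7 classes
print(wolpaw_itr(7, 0.3568))  # bits/sec at the reported mean accuracy, per 1 s trial
```

Note that at the reported mean accuracy the bits-per-trial value only matches the reported bits/sec figure for a particular trial duration, which the abstract does not give.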
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
ISSN
1534-4320
Keyword (Author)
Decoding; music brain-computer interface; musical pitch; EEG; spectral feature
Keyword
CONGENITAL AMUSIA; PERCEPTION; IMAGERY; BRAIN; DISCRIMINATION; DYNAMICS; FEEDBACK; SPEECH; MEMORY
Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.