Related Researcher

Kim, Sung-Phil (김성필), Brain-Computer Interface Lab.

Full metadata record

DC Field Value Language
dc.citation.endPage 2036 -
dc.citation.startPage 2027 -
dc.citation.title IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING -
dc.citation.volume 29 -
dc.contributor.author Kim, Min-Ki -
dc.contributor.author Sohn, Jeong-Woo -
dc.contributor.author Kim, Sung-Phil -
dc.date.accessioned 2023-12-21T15:14:21Z -
dc.date.available 2023-12-21T15:14:21Z -
dc.date.created 2021-11-02 -
dc.date.issued 2021-09 -
dc.description.abstract While intracortical brain-machine interfaces (BMIs) have demonstrated the feasibility of restoring mobility to people with paralysis, maintaining high-performance decoding in clinical BMIs remains challenging. One of the main obstacles to high-performance BMIs is the noise-prone nature of traditional decoding methods, which explicitly relate neural responses to physical quantities such as velocity. In contrast, recently developed latent neural state models enable a robust readout of the contents of large-scale neuronal population activity. However, these latent neural states do not necessarily contain kinematic information useful for decoding. Therefore, this study proposes a new approach to finding kinematics-dependent latent factors by extracting the kinematics-dependent components of latent factors using linear regression. We estimated these components from the population activity through a nonlinear mapping. The proposed kinematics-dependent latent factors generate neural trajectories that discriminate latent neural states before and after motion onset. We compared the decoding performance of the proposed analysis model with that of other popular models: factor analysis (FA), Gaussian process factor analysis (GPFA), latent factor analysis via dynamical systems (LFADS), preferential subspace identification (PSID), and neuronal population firing rates. The proposed analysis model yields higher decoding accuracy than the others (>17% improvement on average). Our approach may pave a new way to extract latent neural states specific to kinematic information from motor cortices, potentially improving decoding performance for online intracortical BMIs. -
dc.identifier.bibliographicCitation IEEE TRANSACTIONS ON NEURAL SYSTEMS AND REHABILITATION ENGINEERING, v.29, pp.2027 - 2036 -
dc.identifier.doi 10.1109/TNSRE.2021.3114367 -
dc.identifier.issn 1534-4320 -
dc.identifier.scopusid 2-s2.0-85115675561 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/54763 -
dc.identifier.url https://ieeexplore.ieee.org/document/9543674 -
dc.identifier.wosid 000704108300002 -
dc.language English -
dc.publisher IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC -
dc.title Finding Kinematics-Driven Latent Neural States From Neuronal Population Activity for Motor Decoding -
dc.type Article -
dc.description.isOpenAccess TRUE -
dc.relation.journalWebOfScienceCategory Engineering, Biomedical; Rehabilitation -
dc.relation.journalResearchArea Engineering; Rehabilitation -
dc.type.docType Article -
dc.description.journalRegisteredClass scie -
dc.description.journalRegisteredClass scopus -
dc.subject.keywordAuthor Kinematics-dependent latent factor -
dc.subject.keywordAuthor motor decoding -
dc.subject.keywordAuthor intracortical brain-machine interface -
dc.subject.keywordAuthor neural trajectory -
dc.subject.keywordAuthor factor analysis -
dc.subject.keywordAuthor Kinematics -
dc.subject.keywordAuthor Statistics -
dc.subject.keywordAuthor Sociology -
dc.subject.keywordAuthor Decoding -
dc.subject.keywordAuthor Neurons -
dc.subject.keywordAuthor Noise measurement -
dc.subject.keywordAuthor Firing -
dc.subject.keywordPlus MOVEMENT -
dc.subject.keywordPlus DYNAMICS -
dc.subject.keywordPlus CORTEX -
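
As a companion to the abstract above, the following is a minimal, hedged sketch of the pipeline it describes: extract latent factors from binned population activity, use linear regression from kinematics to isolate the factors' kinematics-dependent components, learn a mapping from activity to those components, and decode velocity from them. The variable names, the synthetic data, and the scikit-learn models used here (FactorAnalysis, an MLPRegressor standing in for the paper's nonlinear mapping, and a ridge decoder) are illustrative assumptions, not the authors' implementation.

```python
# Hedged sketch of kinematics-dependent latent factor extraction; all names,
# data, and model choices are illustrative assumptions, not the paper's code.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic stand-in data: T time bins, N neurons, 2-D cursor velocity.
T, N, n_factors = 3000, 96, 10
V = rng.standard_normal((T, 2))                                  # velocity (T x 2)
X = np.tanh(V @ rng.standard_normal((2, N))) \
    + 0.5 * rng.standard_normal((T, N))                          # "firing rates" (T x N)

X_tr, X_te, V_tr, V_te = train_test_split(X, V, test_size=0.2, random_state=0)

# 1) Extract latent factors from population activity (factor analysis).
fa = FactorAnalysis(n_components=n_factors, random_state=0).fit(X_tr)
Z_tr = fa.transform(X_tr)

# 2) Linear regression from kinematics to the latent factors defines their
#    kinematics-dependent components (computed on training data only).
kin_to_latent = LinearRegression().fit(V_tr, Z_tr)
Z_kin_tr = kin_to_latent.predict(V_tr)

# 3) Learn a nonlinear mapping from population activity to the
#    kinematics-dependent factors, so they can be estimated at test time
#    without access to kinematics (a small MLP stands in here).
act_to_kin_factors = MLPRegressor(hidden_layer_sizes=(64,), max_iter=1000,
                                  random_state=0).fit(X_tr, Z_kin_tr)

# 4) Decode velocity from the estimated kinematics-dependent factors.
decoder = Ridge(alpha=1.0).fit(act_to_kin_factors.predict(X_tr), V_tr)
r2 = decoder.score(act_to_kin_factors.predict(X_te), V_te)
print(f"held-out decoding R^2: {r2:.3f}")
```

On real recordings, X would be binned spike counts aligned to reaching movements, and the baselines named in the abstract (GPFA, LFADS, PSID, raw firing rates) would replace the factor-analysis step for comparison.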