Related Researcher

Kim, Sung-Phil (김성필)
Brain-Computer Interface Lab.

Detailed Information

Decoding upper limb kinematic parameters from motor cortical activity using a deep learning algorithm

Author(s)
Park, Jisung; Kim, Sung-Phil
Issued Date
2018-11-12
DOI
10.23919/APSIPA.2018.8659760
URI
https://scholarworks.unist.ac.kr/handle/201301/80461
Fulltext
https://ieeexplore.ieee.org/document/8659760
Citation
10th Asia-Pacific Signal and Information Processing Association Annual Summit and Conference, APSIPA ASC 2018, pp.478 - 482
Abstract
Current neural interface technology opens a new window for simultaneously collecting multi-site streams of the firing activity of hundreds of neurons. This creates a need for means to harness as much information as possible from such large amounts of neural data. Deep learning algorithms capable of representing latent components in large datasets may offer a way to extract information related to specific behavioral functions from these neural data. In this study, we aimed to decode movement-related information from motor cortical neuronal ensembles in a primate while the animal moved its arm and hand to perform an eight-target center-out task. Previous studies addressed this problem by decoding the velocity parameter to reconstruct arm-movement trajectories. However, since velocity can be decomposed into speed and direction, it may be advantageous to decode each parameter independently. Thus, we decoded the speed and direction of the hand separately with long short-term memory (LSTM) networks from an ensemble of one hundred fifty-eight primary motor cortical neurons. A comparison of the proposed LSTM decoder with traditional decoders that directly predict the velocity parameter, using either a linear Kalman filter or an LSTM, demonstrated a significant increase in the performance of reconstructing the 2D hand trajectory. Our results add to the accumulating evidence for employing deep learning algorithms in intracortical brain-machine interfaces and suggest that speed and direction can be decoded independently.
Publisher
Institute of Electrical and Electronics Engineers Inc.
ISSN
0000-0000
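
The abstract above describes decoding hand speed and movement direction separately with LSTM networks and recombining them, rather than decoding velocity directly. The sketch below is a minimal illustration of that idea, not the authors' implementation: the window length, layer sizes, training loop, and synthetic stand-in data are all assumptions for illustration. It trains two independent PyTorch LSTMs on binned spike counts from a 158-neuron ensemble, one predicting scalar speed and one predicting a unit direction vector, and multiplies their outputs to form a 2D velocity estimate.

# Minimal sketch (not the authors' code): decode speed and direction
# separately from binned spike counts with two LSTMs, then recombine
# them into a 2D velocity estimate. All sizes and data are illustrative.
import math
import torch
import torch.nn as nn

N_NEURONS = 158   # ensemble size mentioned in the abstract
SEQ_LEN = 20      # spike-count bins per decoding window (assumed)
BATCH = 64

class LSTMDecoder(nn.Module):
    """Maps a sequence of spike-count bins to a kinematic output."""
    def __init__(self, n_out, hidden=128):
        super().__init__()
        self.lstm = nn.LSTM(N_NEURONS, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_out)

    def forward(self, x):                 # x: (batch, seq, N_NEURONS)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])   # predict from the last time step

# Two independent decoders: scalar speed, and a 2D unit direction vector.
speed_net = LSTMDecoder(n_out=1)
dir_net = LSTMDecoder(n_out=2)

# Synthetic stand-in data (replace with real binned rates and kinematics).
spikes = torch.randn(BATCH, SEQ_LEN, N_NEURONS)
speed_target = torch.rand(BATCH, 1)                  # hand speed (a.u.)
angle = torch.rand(BATCH) * 2 * math.pi
dir_target = torch.stack([torch.cos(angle), torch.sin(angle)], dim=1)

opt = torch.optim.Adam(
    list(speed_net.parameters()) + list(dir_net.parameters()), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(200):
    opt.zero_grad()
    speed_pred = speed_net(spikes)
    dir_pred = nn.functional.normalize(dir_net(spikes), dim=1)  # unit length
    loss = loss_fn(speed_pred, speed_target) + loss_fn(dir_pred, dir_target)
    loss.backward()
    opt.step()

# Recombine the two decoded parameters into a velocity estimate.
velocity_pred = speed_pred.detach() * dir_pred.detach()

A linear Kalman filter or a single LSTM regressing velocity directly, as referenced in the abstract, would serve as the baselines against which such a speed/direction decoder is compared.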
