Full metadata record
| DC Field | Value | Language |
|---|---|---|
| dc.citation.startPage | 1606801 | - |
| dc.citation.title | FRONTIERS IN NEUROSCIENCE | - |
| dc.citation.volume | 19 | - |
| dc.contributor.author | Lee, Eunji | - |
| dc.contributor.author | Kim, Ji-Hyun | - |
| dc.contributor.author | Park, Jaeseok | - |
| dc.contributor.author | Kim, Sung-Phil | - |
| dc.contributor.author | Shin, Taehoon | - |
| dc.date.accessioned | 2025-07-18T14:00:09Z | - |
| dc.date.available | 2025-07-18T14:00:09Z | - |
| dc.date.created | 2025-07-16 | - |
| dc.date.issued | 2025-06 | - |
| dc.description.abstract | Introduction: The Aristotle illusion is a well-known tactile illusion that causes a single object to be perceived as two. EEG analysis has been employed to investigate its neural correlates, but was limited by the low spatial resolution of EEG. This study aimed to identify brain regions involved in the Aristotle illusion using functional magnetic resonance imaging (fMRI) and deep learning-based analysis of fMRI data. Methods: While three types of tactile stimuli (Aristotle, Reverse, Asynchronous) were applied to thirty participants' fingers, we collected fMRI data and recorded the number of stimuli each participant perceived. Four convolutional neural network (CNN) models were trained for perception-based classification tasks (occurrence of the Aristotle illusion vs. the Reverse illusion, and occurrence vs. absence of the Reverse illusion) and stimulus-based classification tasks (Aristotle vs. Reverse, Reverse vs. Asynchronous, and Aristotle vs. Asynchronous). Results: The simple fully convolutional network (SFCN) achieved the highest classification accuracies of 68.4% for the occurrence of the Aristotle illusion vs. the Reverse illusion, and 80.1% for the occurrence vs. absence of the Reverse illusion. For the stimulus-based classification tasks, all CNN models yielded accuracies around 50%, failing to distinguish among the three types of applied stimuli. Gradient-weighted class activation mapping (Grad-CAM) analysis revealed salient brain regions of interest (ROIs) for the perception-based classification tasks, including the somatosensory cortex and parietal regions. Discussion: Our findings demonstrate that perception-driven neural responses are classifiable using fMRI-based CNN models. Saliency analysis of the trained CNNs reveals the involvement of the somatosensory cortex and parietal regions in classification decisions, consistent with previous research. Other salient ROIs include the orbitofrontal cortex, middle temporal pole, supplementary motor area, and middle cingulate cortex. | - |
| dc.identifier.bibliographicCitation | FRONTIERS IN NEUROSCIENCE, v.19, pp.1606801 | - |
| dc.identifier.doi | 10.3389/fnins.2025.1606801 | - |
| dc.identifier.issn | 1662-4548 | - |
| dc.identifier.scopusid | 2-s2.0-105009858157 | - |
| dc.identifier.uri | https://scholarworks.unist.ac.kr/handle/201301/87462 | - |
| dc.identifier.wosid | 001521770100001 | - |
| dc.language | English | - |
| dc.publisher | FRONTIERS MEDIA SA | - |
| dc.title | Neural decoding of Aristotle tactile illusion using deep learning-based fMRI classification | - |
| dc.type | Article | - |
| dc.description.isOpenAccess | FALSE | - |
| dc.relation.journalWebOfScienceCategory | Neurosciences | - |
| dc.relation.journalResearchArea | Neurosciences & Neurology | - |
| dc.type.docType | Article | - |
| dc.description.journalRegisteredClass | scie | - |
| dc.description.journalRegisteredClass | scopus | - |
| dc.subject.keywordAuthor | fMRI | - |
| dc.subject.keywordAuthor | deep learning | - |
| dc.subject.keywordAuthor | brain mapping | - |
| dc.subject.keywordAuthor | somatosensory | - |
| dc.subject.keywordAuthor | tactile illusion | - |
| dc.subject.keywordPlus | INFORMATION | - |
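The Grad-CAM analysis mentioned in the abstract weights each convolutional feature map by the global average of the class-score gradient over that map, sums the weighted maps, and applies a ReLU to keep only positively contributing regions. A minimal sketch of that computation is below; the feature maps and gradients here are small synthetic placeholders standing in for a real CNN's activations, not the study's fMRI data or model.

```python
def grad_cam(activations, gradients):
    """Compute a Grad-CAM heatmap from conv feature maps and their gradients.

    activations: K feature maps A^k, each an H x W list of lists
    gradients:   K maps of d(class score)/dA^k, same shape as activations
    """
    K = len(activations)
    H, W = len(activations[0]), len(activations[0][0])
    # alpha_k: global-average-pool each gradient map over the spatial dims
    weights = [sum(sum(row) for row in g) / (H * W) for g in gradients]
    # Weighted sum of feature maps, then ReLU (keep positive evidence only)
    return [[max(0.0, sum(weights[k] * activations[k][i][j] for k in range(K)))
             for j in range(W)] for i in range(H)]

# Toy example: 2 hypothetical feature maps of size 2x2
acts = [[[1.0, 0.0], [0.0, 1.0]],
        [[0.0, 2.0], [2.0, 0.0]]]
grads = [[[1.0, 1.0], [1.0, 1.0]],      # alpha_0 = 1.0
         [[-1.0, -1.0], [-1.0, -1.0]]]  # alpha_1 = -1.0
heatmap = grad_cam(acts, grads)
print(heatmap)  # [[1.0, 0.0], [0.0, 1.0]] after the ReLU clips negatives
```

For volumetric fMRI inputs the same weighting is applied over three spatial dimensions rather than two; the resulting heatmap is then mapped back onto the brain volume to read off salient ROIs.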