Related Researcher

Lim, Min Hyuk (임민혁)
Intelligence and Control-based BioMedicine Lab

Detailed Information

Continual learning framework for a multicenter study with an application to electrocardiogram

Author(s)
Kim, Junmo; Lim, Min Hyuk; Kim, Kwangsoo; Yoon, Hyung-Jin
Issued Date
2024-03
DOI
10.1186/s12911-024-02464-9
URI
https://scholarworks.unist.ac.kr/handle/201301/81934
Citation
BMC MEDICAL INFORMATICS AND DECISION MAKING, v.24, no.1, pp.67
Abstract
Deep learning has been increasingly utilized in the medical field and has achieved many successes. Since the size of the data dominates the performance of deep learning, several medical institutions conduct joint research to obtain as much data as possible. However, sharing data is usually prohibited owing to the risk of privacy invasion. Federated learning is a reasonable approach to training on distributed multicenter data without direct access; however, it requires a central server to merge and distribute models, which is expensive and rarely approved due to various legal regulations. This paper proposes a continual learning framework for a multicenter study that does not require a central server and can prevent catastrophic forgetting of previously trained knowledge. The proposed framework includes a continual learning method selection process, assuming that no single method is omnipotent for all involved datasets in a real-world setting and that a proper method can be selected for specific data. We used fake data generated by a generative adversarial network to evaluate candidate methods prospectively rather than ex post facto. We used four independent electrocardiogram datasets for a multicenter study and trained an arrhythmia detection model. Our proposed framework was evaluated against supervised and federated learning methods, as well as fine-tuning approaches that do not include any regularization to preserve previous knowledge. Even without a central server and access to past data, our framework achieved stable performance (AUROC 0.897) across all involved datasets, comparable to federated learning (AUROC 0.901).
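
The abstract describes the framework only at a high level. As an illustration, the sketch below shows, under assumptions, a server-less continual update loop with prospective method selection: two toy candidate methods (naive fine-tuning and an L2 penalty toward the previous model's weights, standing in for regularization-based continual learning), random tensors in place of real ECG data, and a placeholder in place of GAN-generated surrogate data. All function and variable names are hypothetical; this is not the authors' implementation.

```python
# Minimal sketch (not the authors' code): a server-less continual learning loop
# with prospective method selection. The candidate methods, surrogate-data
# interface, and all names are hypothetical assumptions for illustration.
import copy
import torch
import torch.nn as nn
from sklearn.metrics import roc_auc_score


def make_model(n_features=16):
    # Small MLP standing in for an arrhythmia detector over fixed-length ECG features.
    return nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(), nn.Linear(64, 1))


def train(model, x, y, prev_model=None, l2_to_prev=0.0, epochs=20, lr=1e-2):
    # Fine-tune on one center's data; optionally penalize drift away from the
    # previous model's weights (a simple regularization-based CL candidate).
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    bce = nn.BCEWithLogitsLoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = bce(model(x).squeeze(-1), y)
        if prev_model is not None and l2_to_prev > 0:
            for p, p_prev in zip(model.parameters(), prev_model.parameters()):
                loss = loss + l2_to_prev * ((p - p_prev.detach()) ** 2).sum()
        loss.backward()
        opt.step()
    return model


def auroc(model, x, y):
    with torch.no_grad():
        return roc_auc_score(y.numpy(), torch.sigmoid(model(x)).squeeze(-1).numpy())


def continual_update(model, center_data, surrogate_data, candidates):
    # Evaluate each candidate CL method on surrogate data standing in for
    # earlier centers, then adopt the best-scoring model (worst-case AUROC).
    x_new, y_new = center_data
    best_model, best_score, chosen = None, -1.0, None
    for name, kwargs in candidates.items():
        trial = train(copy.deepcopy(model), x_new, y_new, prev_model=model, **kwargs)
        score = min(auroc(trial, *surrogate_data), auroc(trial, x_new, y_new))
        if score > best_score:
            best_model, best_score, chosen = trial, score, name
    print(f"selected {chosen}: worst-case AUROC {best_score:.3f}")
    return best_model


if __name__ == "__main__":
    torch.manual_seed(0)

    def fake_center(n=200, n_features=16):
        # Synthetic stand-in for one center's ECG dataset (and, below, for
        # GAN-generated surrogate data, which this sketch does not implement).
        x = torch.randn(n, n_features)
        y = (x[:, 0] + 0.3 * torch.randn(n) > 0).float()
        return x, y

    centers = [fake_center() for _ in range(4)]   # four independent datasets
    model = train(make_model(), *centers[0])      # initial training at center 1
    candidates = {
        "naive_finetune": {"l2_to_prev": 0.0},
        "l2_to_previous_weights": {"l2_to_prev": 1e-2},
    }
    for x, y in centers[1:]:                      # visit remaining centers in sequence
        surrogate = fake_center()                 # placeholder for GAN samples
        model = continual_update(model, (x, y), surrogate, candidates)
```

The selection step scores each candidate by the worse of its AUROC on the surrogate data (a proxy for previously learned knowledge) and on the new center's data, mirroring the abstract's idea of choosing a continual learning method prospectively rather than ex post facto.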
Publisher
BMC
ISSN
1472-6947
Keyword (Author)
Multicenter study; Deep learning; Continual learning; Electrocardiogram
Keyword
ECG; MECHANISMS; NETWORKS
