File Download

There are no files associated with this item.

Related Researcher

김영대 (Kim, Youngdae)

Full metadata record

DC Field Value Language
dc.citation.conferencePlace ZZ -
dc.citation.endPage 1083 -
dc.citation.startPage 1074 -
dc.citation.title 36th IEEE International Parallel and Distributed Processing Symposium Workshops -
dc.contributor.author Ryu, M. -
dc.contributor.author Kim, Youngdae -
dc.contributor.author Kim, K. -
dc.contributor.author Madduri, R.K. -
dc.date.accessioned 2024-08-09T14:35:07Z -
dc.date.available 2024-08-09T14:35:07Z -
dc.date.created 2024-08-09 -
dc.date.issued 2022-05-30 -
dc.description.abstract Federated learning (FL) enables training models at different sites and updating the weights from the training instead of transferring data to a central location and training as in classical machine learning. The FL capability is especially important to domains such as biomedicine and smart grid, where data may not be shared freely or stored at a central location because of policy regulations. Thanks to the capability of learning from decentralized datasets, FL is now a rapidly growing research field, and numerous FL frameworks have been developed. In this work we introduce APPFL, the Argonne Privacy-Preserving Federated Learning framework. APPFL allows users to leverage implemented privacy-preserving algorithms, implement new algorithms, and simulate and deploy various FL algorithms with privacy-preserving techniques. The modular framework enables users to customize the components for algorithms, privacy, communication protocols, neural network models, and user data. We also present a new communication-efficient algorithm based on an inexact alternating direction method of multipliers. The algorithm requires significantly less communication between the server and the clients than does the current state of the art. We demonstrate the computational capabilities of APPFL, including differentially private FL on various test datasets and its scalability, by using multiple algorithms and datasets on different computing environments. © 2022 IEEE. -
dc.identifier.bibliographicCitation 36th IEEE International Parallel and Distributed Processing Symposium Workshops, pp.1074 - 1083 -
dc.identifier.doi 10.1109/IPDPSW55747.2022.00175 -
dc.identifier.scopusid 2-s2.0-85136226636 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/83435 -
dc.language English -
dc.publisher Institute of Electrical and Electronics Engineers Inc. -
dc.title APPFL: Open-Source Software Framework for Privacy-Preserving Federated Learning -
dc.type Conference Paper -
dc.date.conferenceDate 2022-05-30 -
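
The abstract above describes privacy-preserving federated averaging: clients train locally and share only perturbed weight updates with a server, so raw data never leaves a site. Below is a minimal, self-contained sketch of that pattern in plain NumPy. The function names (local_update, private_fedavg_round) and the noise_std parameter are illustrative inventions, not APPFL's actual API, and the fixed Gaussian perturbation stands in for a properly calibrated differential-privacy mechanism; see the paper at the DOI above for the real framework.

```python
# Generic sketch of one privacy-preserving federated learning round in the
# FedAvg style. Illustration only; APPFL's API is organized differently.
import numpy as np

def local_update(weights, data, lr=0.1, epochs=1):
    """Hypothetical client step: gradient descent on a least-squares loss."""
    X, y = data
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of 0.5*||Xw - y||^2 / n
        w -= lr * grad
    return w

def private_fedavg_round(global_w, client_data, noise_std=0.01, rng=None):
    """Average noised client updates; noise_std is an assumed DP parameter."""
    rng = rng or np.random.default_rng(0)
    updates = []
    for data in client_data:
        w = local_update(global_w, data)
        # Each client perturbs its update before sending it to the server,
        # so exact local weights are never shared.
        updates.append(w + rng.normal(0.0, noise_std, size=w.shape))
    return np.mean(updates, axis=0)  # server-side aggregation

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    true_w = np.array([1.0, -2.0])
    # Four synthetic clients, each with its own local dataset.
    clients = []
    for _ in range(4):
        X = rng.normal(size=(50, 2))
        y = X @ true_w + 0.05 * rng.normal(size=50)
        clients.append((X, y))
    w = np.zeros(2)
    for _ in range(20):
        w = private_fedavg_round(w, clients, rng=rng)
    print("estimated weights:", w)  # approaches [1.0, -2.0]
```

The paper's communication-efficient contribution replaces this simple averaging step with an inexact alternating direction method of multipliers scheme; the sketch above shows only the baseline FedAvg pattern that such methods improve upon.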

Items in Repository are protected by copyright, with all rights reserved, unless otherwise indicated.