File Download

There are no files associated with this item.

Related Researcher

이슬기 (Lee, Seulki)
Embedded Artificial Intelligence Lab.

Detailed Information

Full metadata record

DC Field Value Language
dc.citation.conferencePlace SP -
dc.citation.endPage 3429 -
dc.citation.startPage 3421 -
dc.citation.title International Conference on Artificial Intelligence and Statistics -
dc.contributor.author Jeong, Isu -
dc.contributor.author Lee, Seulki -
dc.date.accessioned 2024-06-20T16:05:08Z -
dc.date.available 2024-06-20T16:05:08Z -
dc.date.created 2024-06-17 -
dc.date.issued 2024-05-03 -
dc.description.abstract We introduce On-Demand Federated Learning (On-Demand FL), which enables on-demand federated learning of a deep model for an arbitrary target data distribution of interest by making the best use of the heterogeneity (non-IID-ness) of local client data, unlike existing approaches that try to circumvent the non-IID nature of federated learning. On-Demand FL composes a dataset of the target distribution, which we call the composite dataset, from a selected subset of local clients whose aggregate distribution is expected to emulate the target distribution as a whole. Because the composite dataset consists of a precise yet diverse subset of clients reflecting the target distribution, the on-demand model, trained with exactly enough selected clients, improves performance on the target distribution compared to models trained with off-target and/or unknown distributions, while reducing the number of participating clients and federation rounds. We model the target data distribution in terms of classes and estimate the class distribution of each local client from the weight gradient of its local model. Our experimental results show that On-Demand FL achieves up to 5% higher classification accuracy on various target distributions while involving 9× fewer clients on FashionMNIST, CIFAR-10, and CIFAR-100. -
dc.identifier.bibliographicCitation International Conference on Artificial Intelligence and Statistics, pp.3421 - 3429 -
dc.identifier.uri https://scholarworks.unist.ac.kr/handle/201301/83004 -
dc.identifier.url https://proceedings.mlr.press/v238/jeong24a.html -
dc.language English -
dc.publisher ML Research Press -
dc.title On-Demand Federated Learning for Arbitrary Target Class Distributions -
dc.type Conference Paper -
dc.date.conferenceDate 2024-05-02 -
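
The abstract above describes two technical steps: estimating each local client's class distribution from the weight gradient of its local model, and selecting a subset of clients whose aggregate distribution emulates the target distribution. Below is a minimal sketch of the client-selection step only, assuming the per-client class distributions have already been estimated and are given as probability vectors; the function name greedy_select_clients, the greedy L1-matching criterion, and the toy data are illustrative assumptions, not the paper's exact algorithm.

    import numpy as np

    def greedy_select_clients(client_dists, target_dist, max_clients):
        # Greedily pick clients whose aggregate (summed) class distribution,
        # once normalized, is closest to the target distribution in L1 distance.
        client_dists = np.asarray(client_dists, dtype=float)   # shape: (num_clients, num_classes)
        target_dist = np.asarray(target_dist, dtype=float)
        selected = []
        aggregate = np.zeros_like(target_dist)

        for _ in range(max_clients):
            current_err = (np.abs(aggregate / aggregate.sum() - target_dist).sum()
                           if aggregate.sum() > 0 else np.inf)
            best_i, best_err = None, current_err
            for i, dist in enumerate(client_dists):
                if i in selected:
                    continue
                candidate = aggregate + dist
                err = np.abs(candidate / candidate.sum() - target_dist).sum()
                if err < best_err:
                    best_i, best_err = i, err
            if best_i is None:      # no remaining client improves the match
                break
            selected.append(best_i)
            aggregate += client_dists[best_i]
        return selected

    # Toy usage: five clients with skewed (non-IID) class distributions and a
    # uniform target over three classes; only clients that help balance are picked.
    clients = [[0.8, 0.1, 0.1], [0.1, 0.8, 0.1], [0.1, 0.1, 0.8],
               [0.5, 0.5, 0.0], [0.0, 0.5, 0.5]]
    print(greedy_select_clients(clients, [1/3, 1/3, 1/3], max_clients=3))

In the paper, the per-client class distributions are estimated from the weight gradients of each client's local model rather than assumed known, so any such estimator can feed this selection step.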
