File Download

There are no files associated with this item.

Related Researcher

Yoon, Sung Whan (윤성환)
Machine Intelligence and Information Learning Lab.

Detailed Information

Rethinking the Flat Minima Searching in Federated Learning

Author(s)
Lee, Taehwan; Yoon, Sung Whan
Issued Date
2024-07-21
URI
https://scholarworks.unist.ac.kr/handle/201301/82919
Citation
International Conference on Machine Learning (ICML)
Abstract
Despite the success of federated learning (FL) in decentralized training, bolstering model generalization by overcoming heterogeneity across clients remains a major challenge. Aiming at improved generalization in FL, a group of recent works pursues flatter minima by employing sharpness-aware minimization during local training on the client side. However, we observe that the global model, i.e., the aggregated model, does not lie on flat minima of the global objective even with flatness searching in local training; we define this gap as the flatness discrepancy. By rethinking and theoretically analyzing flatness searching in FL through the lens of this discrepancy, we propose Federated Learning for Global Flatness (FedGF), which explicitly pursues flatter minima of the global model, relieving the flatness discrepancy and yielding remarkable performance gains on heterogeneous FL benchmarks.
Publisher
International Machine Learning Society (IMLS)
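
The abstract describes a baseline setting in which each client runs sharpness-aware minimization (SAM) during local training and the server aggregates the client models. The sketch below is a minimal illustration of that setting only, not the authors' FedGF algorithm (whose global-flatness objective is not detailed in this record); the function names sam_local_step and fedavg and the hyperparameters lr and rho are illustrative assumptions.

import torch

def sam_local_step(model, loss_fn, batch, lr=0.01, rho=0.05):
    # One SAM update on a client: (1) gradient at the current weights,
    # (2) perturb the weights toward the locally sharpest direction
    # within an L2 ball of radius rho, (3) descend with the gradient
    # taken at the perturbed point. Illustrative sketch only.
    x, y = batch
    loss = loss_fn(model(x), y)
    grads = torch.autograd.grad(loss, list(model.parameters()))
    grad_norm = torch.sqrt(sum((g ** 2).sum() for g in grads))
    eps = [rho * g / (grad_norm + 1e-12) for g in grads]
    with torch.no_grad():
        for p, e in zip(model.parameters(), eps):
            p.add_(e)  # ascend to the worst-case nearby point
    loss = loss_fn(model(x), y)
    grads = torch.autograd.grad(loss, list(model.parameters()))
    with torch.no_grad():
        for p, e, g in zip(model.parameters(), eps, grads):
            p.sub_(e)       # undo the perturbation
            p.sub_(lr * g)  # sharpness-aware descent step

def fedavg(global_model, client_models):
    # Server-side aggregation: replace each global parameter with the
    # uniform average of the corresponding client parameters.
    with torch.no_grad():
        for name, p in global_model.named_parameters():
            p.copy_(torch.stack(
                [dict(m.named_parameters())[name] for m in client_models]
            ).mean(dim=0))

As the abstract notes, minima that look flat to each client's local objective need not be flat for the aggregated global objective; that gap is the flatness discrepancy FedGF is designed to relieve.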

