
Federated Learning with Sufficient Dimension Reduction and Generative Models

Alternative Title
충분 차원 축소 및 생성 모델 기반의 연합 학습 (Federated Learning Based on Sufficient Dimension Reduction and Generative Models)
Author(s)
Hahn, Seok-Ju
Advisor
Lee, Junghye
Issued Date
2021-02
URI
https://scholarworks.unist.ac.kr/handle/201301/82453
http://unist.dcollection.net/common/orgView/200000370694
Abstract
Federated learning is gaining popularity as data are increasingly generated in a distributed manner. One of its major benefits is mitigating privacy risks, since algorithms can be trained without collecting or sharing data. While federated learning has shown great promise, mainly built on stochastic gradient-based optimization, many challenges remain in protecting privacy, especially during the update and exchange of gradients in federated optimization. This paper presents GRAFFL, the first gradient-free federated learning framework, which leverages deep generative models to learn the population distribution of data partitioned across many distributed clients without moving the data. Unlike conventional federated learning algorithms that exchange parameters or gradients generated as a byproduct of locally updating a shared model, our framework does not require disassembling a model (e.g., into linear components), perturbing data, or encrypting data for aggregation in order to prevent privacy leakage. Instead, it uses implicit information derived from each participating client to generate sufficient summary statistics of the client's paired samples. These statistics are produced by the NSDR network, a neural network developed in this study that creates representations reduced in dimension yet containing statistically sufficient information, thereby protecting sensitive information from leakage. By introducing a squared-loss mutual information term as the objective and constraining the parameters to lie on the Stiefel manifold, the network is proved to yield sufficient summary statistics. A generator model in the central server then acts as an aggregator of these distributed sufficient summary statistics without explicitly moving them. Using several datasets, the feasibility and usefulness of the proposed framework are demonstrated in terms of privacy protection and prediction performance.
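For reference, below is a minimal sketch of the standard quantities the abstract appeals to, assuming GRAFFL follows the usual formulations of sufficient dimension reduction and squared-loss mutual information; the symbols X, Y, Z, B, d, and k are introduced here for illustration only and are not taken from the thesis.

% Sufficient dimension reduction: find a projection B with orthonormal columns
% (i.e., B lies on the Stiefel manifold) such that the response Y depends on
% the features X only through the reduced representation Z = B^T X.
\[
  Y \perp\!\!\!\perp X \mid B^{\top} X,
  \qquad
  B \in \mathrm{St}(d, k) = \{\, B \in \mathbb{R}^{d \times k} : B^{\top} B = I_k \,\}.
\]

% Squared-loss mutual information (SMI) between Z = B^T X and Y; using it as
% the training objective encourages the reduced representation to retain the
% information about Y carried by the full feature vector X.
\[
  \mathrm{SMI}(Z, Y)
  = \frac{1}{2} \iint
    \left( \frac{p(z, y)}{p(z)\, p(y)} - 1 \right)^{2}
    p(z)\, p(y)\, \mathrm{d}z\, \mathrm{d}y .
\]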
Publisher
Ulsan National Institute of Science and Technology (UNIST)
Degree
Master
Major
Department of Industrial Engineering

