Privacy-Preserving Fair Learning of Support Vector Machine with Homomorphic Encryption

Author(s)
Park, Saerom; Byun, Junyoung; Lee, Joohee
Issued Date
2022-04-25
DOI
10.1145/3485447.3512252
URI
https://scholarworks.unist.ac.kr/handle/201301/76135
Citation
International World Wide Web Conference, pp. 3572–3583
Abstract
Fair learning has received considerable attention in recent years because machine learning models used in automated decision-making systems can be unfair with respect to sensitive attributes such as gender and race. However, to mitigate discrimination on the sensitive attributes and train a fair model, most fair learning methods require access to those attributes during the training or validation phase. In this study, we propose a privacy-preserving training algorithm for a fair support vector machine classifier based on Homomorphic Encryption (HE), in which both the sensitive information and the model secrecy are preserved. The high computational cost of HE is substantially reduced by protecting only the sensitive information and by introducing a refined formulation and a low-rank approximation using shared eigenvectors. Through experiments on synthetic and real-world data, we demonstrate the effectiveness of our algorithm in terms of accuracy and fairness, and show that it significantly outperforms other privacy-preserving solutions by achieving better trade-offs between accuracy and fairness. To the best of our knowledge, our algorithm is the first privacy-preserving fair learning algorithm using HE.
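
To make the efficiency idea in the abstract concrete (encrypting only the sensitive attribute rather than the full data set), here is a minimal, hypothetical sketch using the open-source TenSEAL CKKS library. It is not the authors' implementation: the centered sensitive-attribute vector is encrypted, and a covariance-style fairness term between it and the plaintext decision scores is evaluated homomorphically. All variable names and parameters are illustrative assumptions, and the paper's refined formulation and low-rank approximation with shared eigenvectors are not reproduced here.

```python
# Illustrative sketch only (not the paper's implementation): encrypt just the
# sensitive attribute s with CKKS and evaluate a covariance-style fairness
# term cov(s, w.x) homomorphically. Assumes the TenSEAL library.
import numpy as np
import tenseal as ts

# CKKS context with toy parameters.
ctx = ts.context(ts.SCHEME_TYPE.CKKS, poly_modulus_degree=8192,
                 coeff_mod_bit_sizes=[60, 40, 40, 60])
ctx.global_scale = 2 ** 40
ctx.generate_galois_keys()  # rotation keys needed for dot products

rng = np.random.default_rng(0)
n, d = 64, 8
X = rng.normal(size=(n, d))                    # non-sensitive features (plaintext)
w = rng.normal(size=d)                         # current SVM weights (plaintext)
s = rng.integers(0, 2, size=n).astype(float)   # sensitive attribute (to protect)

# Encrypt only the centered sensitive attribute; everything else stays plaintext.
enc_s = ts.ckks_vector(ctx, (s - s.mean()).tolist())

margins = X @ w                                # plaintext decision scores w.x_i
enc_term = enc_s.dot(margins.tolist())         # encrypted sum_i (s_i - s_bar)*m_i

# Decryption here is for checking only; in a real protocol the secret key
# stays with the data owner.
print("encrypted estimate: ", enc_term.decrypt()[0] / (n - 1))
print("plaintext reference:", np.cov(s, margins)[0, 1])
```

Because only the one-dimensional, centered sensitive vector is encrypted, the homomorphic work reduces to a single plaintext-ciphertext dot product, which illustrates why restricting encryption to the sensitive information can cut HE costs so sharply.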
Publisher
Association for Computing Machinery, Inc
