
Paper 2021/142

Federated Learning with Local Differential Privacy: Trade-offs between Privacy, Utility, and Communication

Muah Kim and Onur Gunlu and Rafael F. Schaefer

Abstract

Federated learning (FL) enables training on massive amounts of data privately thanks to its decentralized structure. Stochastic gradient descent (SGD) is commonly used in FL because of its good empirical performance, but sensitive user information can still be inferred from the weight updates shared during FL iterations. We consider Gaussian mechanisms that preserve local differential privacy (LDP) of user data in an FL model with SGD. Trade-offs between user privacy, global utility, and transmission rate are proved by defining appropriate metrics for FL with LDP. Compared to existing results, the query sensitivity used in LDP is treated as a variable, and a tighter privacy accounting method is applied. The proposed utility bound allows heterogeneous parameters across all users. Our bounds characterize how much utility decreases and how much the transmission rate increases when a stronger privacy regime is targeted. Furthermore, given a target privacy level, our results guarantee significantly larger utility and a smaller transmission rate than existing privacy accounting methods.
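
To make the mechanism concrete, the following is a minimal sketch in Python/NumPy of how a local SGD update can be clipped and perturbed with Gaussian noise before being shared. The function name and parameters are illustrative assumptions, and the noise scale uses the classical (epsilon, delta) Gaussian-mechanism calibration rather than the paper's tighter privacy accounting.

```python
import numpy as np

def gaussian_mechanism_update(gradient, clip_norm, epsilon, delta, rng=None):
    """Clip a local SGD update and add Gaussian noise for (epsilon, delta)-LDP.

    `gradient` is a NumPy array; `clip_norm` bounds the L2 sensitivity
    of the shared update (the "query sensitivity" role in the abstract).
    """
    rng = rng if rng is not None else np.random.default_rng()
    # Clipping guarantees the L2 norm of the shared update is at most clip_norm.
    norm = np.linalg.norm(gradient)
    clipped = gradient * min(1.0, clip_norm / max(norm, 1e-12))
    # Classical Gaussian-mechanism calibration:
    # sigma >= clip_norm * sqrt(2 * ln(1.25 / delta)) / epsilon.
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return clipped + rng.normal(loc=0.0, scale=sigma, size=clipped.shape)

# Example: perturb a toy 10-dimensional update with epsilon = 1, delta = 1e-5.
noisy_update = gaussian_mechanism_update(np.ones(10), clip_norm=1.0,
                                         epsilon=1.0, delta=1e-5)
```

Here the clipping norm plays the role of the variable query sensitivity mentioned in the abstract; a tighter accounting method, as proposed in the paper, would permit a smaller noise scale for the same target privacy level.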

Metadata
Available format(s)
PDF
Category
Foundations
Publication info
Preprint. MINOR revision.
Keywords
federated learning, differential privacy, FedSGD, privacy accounting, privacy-utility tradeoff
Contact author(s)
guenlue @ tu-berlin de
History
2021-02-10: received
Short URL
https://ia.cr/2021/142
License
Creative Commons Attribution
CC BY