Paper 2024/1665

DMM: Distributed Matrix Mechanism for Differentially-Private Federated Learning using Packed Secret Sharing

Alexander Bienstock, J.P. Morgan AI Research & J.P. Morgan AlgoCRYPT CoE
Ujjwal Kumar, J.P. Morgan
Antigoni Polychroniadou, J.P. Morgan AI Research & J.P. Morgan AlgoCRYPT CoE
Abstract

Federated Learning (FL) has gained significant traction recently, both in industry and academia. In FL, a machine learning model is trained using data from various end-users arranged in committees across several rounds. Since such data can often be sensitive, a primary challenge in FL is providing privacy while still retaining utility of the model. Differential Privacy (DP) has become the main measure of privacy in the FL setting. DP comes in two flavors: central and local. In the former, a centralized server is trusted to receive the users' raw gradients from a training step and then perturb their aggregation with some noise before releasing the next version of the model. In the latter (more private) setting, noise is applied on users' local devices, and only the aggregation of users' noisy gradients is revealed, even to the server. Great strides have been made in improving the privacy-utility trade-off in the central DP setting by utilizing the so-called \emph{matrix mechanism}. However, progress has mostly stalled in the local DP setting. In this work, we introduce the \emph{distributed matrix mechanism} to achieve the best of both worlds: local DP and the better privacy-utility trade-off of the matrix mechanism. We accomplish this by proposing a cryptographic protocol that securely transfers sensitive values across rounds, making use of \emph{packed secret sharing}. This protocol accommodates the dynamic participation of users per training round required by FL, including users that may drop out of the computation. Our experiments show that our mechanism significantly improves the privacy-utility trade-off of FL models compared to previous local DP mechanisms, with little added overhead.
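The matrix-mechanism idea referenced in the abstract can be illustrated with a small numerical sketch. The example below (an illustrative toy in NumPy, not the paper's distributed protocol) considers the standard prefix-sum workload over T training rounds: to release noisy running sums of gradients, one factors the workload matrix A = B·C, adds noise scaled to the sensitivity of C, and post-processes with B. It uses the known closed-form square root of the prefix-sum matrix (a lower-triangular Toeplitz matrix with entries binom(2k, k)/4^k) and compares the resulting worst-case noise to a naive baseline that perturbs each input independently.

```python
import numpy as np

def prefix_workload(T):
    # A: lower-triangular all-ones matrix; (A x)_t is the prefix sum of x_1..x_t.
    return np.tril(np.ones((T, T)))

def sqrt_prefix(T):
    # Square root of the prefix-sum matrix: Toeplitz with symbol
    # f(k) = binom(2k, k) / 4^k, i.e. the coefficients of (1 - x)^{-1/2}.
    f = np.zeros(T)
    f[0] = 1.0
    for k in range(1, T):
        f[k] = f[k - 1] * (2 * k - 1) / (2 * k)  # binomial recursion
    S = np.zeros((T, T))
    for i in range(T):
        for j in range(i + 1):
            S[i, j] = f[i - j]
    return S

def max_std(B, C, sigma=1.0):
    # Mechanism: release B(Cx + z) with z ~ N(0, sigma^2 * sens(C)^2 * I),
    # where sens(C) is the max column L2 norm (one user's contribution).
    sens = np.linalg.norm(C, axis=0).max()
    row_norms = np.linalg.norm(B, axis=1)
    # Std of the noise on output i is sigma * sens(C) * ||row_i(B)||.
    return (sigma * sens * row_norms).max()

T = 64
A = prefix_workload(T)
S = sqrt_prefix(T)

naive_in = max_std(A, np.eye(T))  # perturb inputs: error grows like sqrt(T)
mm = max_std(S, S)                # matrix mechanism: error grows like log(T)
```

For T = 64, `naive_in` is exactly sqrt(64) = 8, while the square-root factorization brings the worst-case noise standard deviation down to roughly 2.3, reflecting the log-versus-square-root gap that the matrix mechanism exploits in the central DP setting and that this paper transfers to the local/distributed setting.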

Metadata
Available format(s)
PDF
Category
Applications
Publication info
Preprint.
Keywords
Federated Learning, Differential Privacy, Packed Secret Sharing, Matrix Mechanism
Contact author(s)
abienstock @ cs nyu edu
ujjwal x2 kumar @ chase com
antigonipoly @ gmail com
History
2024-10-18: approved
2024-10-15: received
Short URL
https://ia.cr/2024/1665
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2024/1665,
      author = {Alexander Bienstock and Ujjwal Kumar and Antigoni Polychroniadou},
      title = {{DMM}: Distributed Matrix Mechanism for Differentially-Private Federated Learning using Packed Secret Sharing},
      howpublished = {Cryptology {ePrint} Archive, Paper 2024/1665},
      year = {2024},
      url = {https://eprint.iacr.org/2024/1665}
}