Paper 2020/167
Turbo-Aggregate: Breaking the Quadratic Aggregation Barrier in Secure Federated Learning
Jinhyun So, Basak Guler, and A. Salman Avestimehr
Abstract
Federated learning is gaining significant interest as it enables model training over a large volume of data that is distributed across many users, while protecting the privacy of the individual users. However, a major bottleneck in scaling federated learning to a large number of users is the overhead of secure model aggregation: the overhead of state-of-the-art protocols for secure model aggregation grows quadratically with the number of users. We propose a new scheme, named Turbo-Aggregate, that in a network with $N$ users achieves a secure aggregation overhead of $O(N \log N)$, as opposed to $O(N^2)$, while tolerating up to a 50% user dropout rate. Turbo-Aggregate employs a multi-group circular strategy for efficient model aggregation, and leverages additive secret sharing and novel coding techniques for injecting aggregation redundancy in order to handle user dropouts while guaranteeing user privacy.
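As background for the aggregation overhead discussed in the abstract, the sketch below is a minimal illustration (assumed for exposition, not the Turbo-Aggregate protocol itself) of secure aggregation via additive secret sharing: each user splits its integer-quantized model update into random shares modulo $2^{31}$, so no single party ever sees an individual update, yet the shares recombine to the true aggregate. All names and parameters here are illustrative.

```python
# Minimal sketch of additive-secret-sharing-based secure aggregation.
# NOTE: this is an illustrative toy, not the Turbo-Aggregate protocol.
import numpy as np

rng = np.random.default_rng(0)

def make_shares(update, n_shares, modulus=2**31):
    """Split an integer-quantized update into additive shares mod `modulus`."""
    shares = [rng.integers(0, modulus, size=update.shape) for _ in range(n_shares - 1)]
    # Last share is chosen so that all shares sum to the original update.
    shares.append((update - sum(shares)) % modulus)
    return shares

# Toy setup: 4 users, each holding a quantized model update of length 5.
n_users, dim, modulus = 4, 5, 2**31
updates = [rng.integers(0, 100, size=dim) for _ in range(n_users)]

# Each user shares its update among all users; user j collects one share
# from every user and reveals only the sum of the shares it received.
received = [[] for _ in range(n_users)]
for u in updates:
    for j, share in enumerate(make_shares(u, n_users, modulus)):
        received[j].append(share)

partial_sums = [sum(r) % modulus for r in received]
aggregate = sum(partial_sums) % modulus

# The server recovers the sum of all updates without seeing any single one.
assert np.array_equal(aggregate, sum(updates) % modulus)
print("aggregate:", aggregate)
```

Note that in this naive scheme every user sends a share to every other user, so the total communication grows as $O(N^2)$; the multi-group circular strategy described in the abstract is what lets Turbo-Aggregate bring this overhead down to $O(N \log N)$.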
Metadata
- Available format(s)
- PDF
- Category
- Cryptographic protocols
- Publication info
- Published elsewhere. Minor revision. arXiv:2002.04156
- Keywords
- Federated Learning, secure aggregation, privacy-preserving machine learning
- Contact author(s)
- jinhyuns @ usc edu
- History
- 2020-05-24: last of 3 revisions
- 2020-02-13: received
- Short URL
- https://ia.cr/2020/167
- License
- CC BY
BibTeX
@misc{cryptoeprint:2020/167,
      author = {Jinhyun So and Basak Guler and A. Salman Avestimehr},
      title = {Turbo-Aggregate: Breaking the Quadratic Aggregation Barrier in Secure Federated Learning},
      howpublished = {Cryptology {ePrint} Archive, Paper 2020/167},
      year = {2020},
      url = {https://eprint.iacr.org/2020/167}
}