Paper 2024/942

Let Them Drop: Scalable and Efficient Federated Learning Solutions Agnostic to Client Stragglers

Riccardo Taiello, Inria Sophia Antipolis - Méditerranée, EURECOM, Université Côte d’Azur
Melek Önen, EURECOM
Clémentine Gritti, EURECOM, INSA Lyon
Marco Lorenzi, Inria Sophia Antipolis - Méditerranée, Université Côte d’Azur
Abstract

Secure Aggregation (SA) is a crucial component of modern Federated Learning (FL) systems: it enables the collaborative training of a global machine learning model while protecting the privacy of each client's local dataset. Many existing SA protocols in the FL literature operate synchronously, which leads to notable runtime slowdowns in the presence of stragglers (i.e., late-arriving clients). A common approach to this challenge is to treat stragglers as client failures and rely on SA solutions that are robust against dropouts. While this approach does work, it degrades the performance of the protocol, since the protocol's cost strongly depends on the dropout ratio, and this ratio increases significantly once stragglers are counted as dropouts. Another approach explored in the literature is to introduce asynchronicity into the FL system; very few SA solutions exist in this setting, and they currently suffer from high overhead. In this paper, similar to related work, we propose to handle stragglers as client failures, but we design SA solutions whose cost does not depend on the dropout ratio, so that an unavoidable increase in this metric does not affect the performance of the solution. We first introduce Eagle, a synchronous SA scheme whose cost depends only on the online clients' inputs rather than on client failures. This approach offers better computation and communication costs than existing solutions in realistic settings where the number of stragglers is high. We then propose Owl, the first SA solution suitable for the asynchronous setting, which again considers online clients' contributions only.
We implement both solutions and show that: (i) in a synchronous FL setting with realistic dropout rates (taking potential stragglers into account), Eagle outperforms the best existing SA solution, namely Flamingo, by a factor of 4; (ii) in the asynchronous setting, Owl outperforms the state-of-the-art solution, LightSecAgg.
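To see why the cost of classic dropout-robust SA depends on the dropout ratio, the toy sketch below illustrates pairwise additive masking (in the style of SecAgg-type protocols, not the paper's Eagle or Owl designs): each pair of clients shares a seed, masks cancel when everyone stays online, and the server must recover one residual mask per (dropped client, survivor) pair, so recovery work grows with the number of dropouts. All names, the toy PRG, and the fixed seeds are illustrative assumptions.

```python
import random

MOD = 2**16  # toy modulus for masked arithmetic

def prg(seed):
    # Toy PRG: a deterministic pseudorandom value derived from a shared seed
    # (illustrative only; a real protocol uses a cryptographic PRG).
    return random.Random(seed).randrange(MOD)

clients = [1, 2, 3, 4]
inputs = {1: 10, 2: 20, 3: 30, 4: 40}

# Each unordered pair (i, j), i < j, shares a seed; the smaller id adds
# the mask and the larger id subtracts it, so masks cancel in the sum.
seed = {(i, j): 1000 * i + j for i in clients for j in clients if i < j}

def masked_input(i, expected_peers):
    m = inputs[i]
    for j in expected_peers:
        if j == i:
            continue
        a, b = min(i, j), max(i, j)
        s = prg(seed[(a, b)])
        m = (m + s) % MOD if i == a else (m - s) % MOD
    return m

# Case 1: all clients stay online -- pairwise masks cancel exactly.
agg = sum(masked_input(i, clients) for i in clients) % MOD
assert agg == sum(inputs.values()) % MOD

# Case 2: client 4 drops after masking. The survivors' sums still carry
# the masks they applied for client 4; the server must recover each of
# those seeds, so this recovery step scales with the dropout set.
online = [1, 2, 3]
agg = sum(masked_input(i, clients) for i in online) % MOD
residual = 0
for j in online:
    a, b = min(j, 4), max(j, 4)
    s = prg(seed[(a, b)])
    # Mask that online client j applied for its pair with dropped client 4.
    residual = (residual + s) % MOD if j == a else (residual - s) % MOD
agg = (agg - residual) % MOD
assert agg == sum(inputs[i] for i in online) % MOD
```

The sketch shows the dependency the paper removes: here, per-dropout mask recovery makes cost grow with the dropout ratio, whereas Eagle and Owl are designed so that only the online clients' contributions matter.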

Metadata
Available format(s)
PDF
Category
Applications
Publication info
Preprint.
Keywords
Security and privacy, Privacy-preserving protocols, Federated Learning
Contact author(s)
riccardo taiello @ inria fr
melek onen @ eurecom fr
clementine gritti @ eurecom fr
marco lorenzi @ inria fr
History
2024-06-13: approved
2024-06-12: received
Short URL
https://ia.cr/2024/942
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2024/942,
      author = {Riccardo Taiello and Melek Önen and Clémentine Gritti and Marco Lorenzi},
      title = {Let Them Drop: Scalable and Efficient Federated Learning Solutions Agnostic to Client Stragglers},
      howpublished = {Cryptology ePrint Archive, Paper 2024/942},
      year = {2024},
      note = {\url{https://eprint.iacr.org/2024/942}},
      url = {https://eprint.iacr.org/2024/942}
}