Paper 2023/467
Secure Floating-Point Training
Abstract
Secure two-party computation (2PC) of floating-point arithmetic has been improving in performance, and recent work runs deep learning algorithms with it while matching the numerical precision of commonly used machine learning (ML) frameworks such as PyTorch. We find that existing 2PC libraries for floating-point support generic computations but lack specialized support for ML training. Hence, their latency and communication costs for compound operations (e.g., dot products) are high. We provide novel specialized 2PC protocols for compound operations and prove their precision using numerical analysis. Our implementation BEACON outperforms state-of-the-art libraries for 2PC of floating-point by more than $6\times$.
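To see why fused compound operations can help precision (independent of the cryptography), here is a minimal NumPy sketch of the underlying numerical point. It is not BEACON's protocol or API: the real protocols operate on secret-shared values, and `float16` here merely stands in for a low-precision secure float format. Composing generic per-operation protocols rounds after every multiply and add, whereas a specialized dot-product protocol can accumulate at higher precision and round once at the end.

```python
import numpy as np

def dot_per_op_rounding(x16, y16):
    # Emulates composing generic 2PC float ops: every multiply and add
    # returns a result rounded back to the low-precision format (float16).
    acc = np.float16(0.0)
    for xi, yi in zip(x16, y16):
        prod = xi * yi        # float16 multiply, rounded
        acc = acc + prod      # float16 add, rounded
    return acc

def dot_fused(x16, y16):
    # Emulates a specialized compound protocol: accumulate at higher
    # precision (float64 here) and round only once at the end.
    acc = np.float64(0.0)
    for xi, yi in zip(x16, y16):
        acc += np.float64(xi) * np.float64(yi)
    return np.float16(acc)

rng = np.random.default_rng(0)
x16 = rng.standard_normal(10_000).astype(np.float16)
y16 = rng.standard_normal(10_000).astype(np.float16)

# Exact dot product of the (already low-precision) inputs, for comparison.
reference = np.dot(x16.astype(np.float64), y16.astype(np.float64))
print("per-op rounding error:", abs(float(dot_per_op_rounding(x16, y16)) - reference))
print("fused rounding error :", abs(float(dot_fused(x16, y16)) - reference))
```

On typical inputs the per-operation variant accumulates noticeably more rounding error; a fused protocol avoids this while also saving the latency and communication of the intermediate roundings.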
Metadata
- Available format(s)
- Category
- Cryptographic protocols
- Publication info
- Published elsewhere. USENIX Security 2023
- Keywords
- secure two-party computation, floating-point, privacy-preserving machine learning, secure training
- Contact author(s)
- deevashwer @ berkeley edu
- t-anweshb @ microsoft com
- divya gupta @ microsoft com
- rahsha @ microsoft com
- dawnsong @ gmail com
- History
- 2023-04-01: approved
- 2023-03-31: received
- Short URL
- https://ia.cr/2023/467
- License
- CC BY
BibTeX
@misc{cryptoeprint:2023/467,
      author = {Deevashwer Rathee and Anwesh Bhattacharya and Divya Gupta and Rahul Sharma and Dawn Song},
      title = {Secure Floating-Point Training},
      howpublished = {Cryptology {ePrint} Archive, Paper 2023/467},
      year = {2023},
      url = {https://eprint.iacr.org/2023/467}
}