Paper 2023/073
FssNN: Communication-Efficient Secure Neural Network Training via Function Secret Sharing
Abstract
Privacy-preserving neural network training enables multiple parties to jointly train neural network models without revealing their sensitive data. However, its practicality is greatly limited by low efficiency, caused by massive communication costs and a deep dependence on a trusted dealer. In this paper, we propose a communication-efficient secure two-party neural network framework, FssNN, to enable practical secure neural network training and inference. In FssNN, we reduce the communication costs of computing neural network operators in both the online and offline phases by combining additive secret sharing (SS) and function secret sharing (FSS), and we eliminate the dependence on the trusted dealer via a distributed key generation scheme. First, by integrating correction words and designing a more compact key generation algorithm, we propose a key-reduced distributed comparison function (DCF, an FSS scheme for comparison functions) with the smallest key sizes to date, enabling efficient computation of non-linear layer functions in the offline phase. Second, by leveraging the proposed DCF and combining SS and FSS, we construct online-efficient computation protocols for neural network operators such as the Hadamard product, ReLU and DReLU, and reduce the online communication costs to about $1/2$ of those of the state-of-the-art solution. Finally, by utilizing MPC-friendly pseudorandom generators, we propose a distributed DCF key generation scheme that replaces the trusted dealer and supports a larger input domain than the state-of-the-art solution. Using FssNN, we perform extensive secure training and inference evaluations on various neural network models. Compared with the state-of-the-art solution AriaNN (PoPETs'22), we reduce the communication costs of secure training and inference by approximately $25.4\%$ and $26.4\%$ respectively, while keeping the accuracy of privacy-preserving training and inference close to that of plaintext training and inference.
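To make the SS side of the framework concrete, here is a minimal toy sketch (not the paper's protocol) of two-party additive secret sharing over a ring, with one Beaver-triple multiplication standing in for a single element of a secure Hadamard product. The ring size, function names, and the locally generated triple (which in FssNN would come from the offline phase rather than a local dealer) are all illustrative assumptions; the paper's FSS-based ReLU/DReLU protocols are not modeled here.

```python
import random

P = 2**64  # toy ring Z_{2^64}; the actual ring/domain size is an assumption

def share(x):
    """Split x into two additive shares: x = x0 + x1 mod P."""
    x0 = random.randrange(P)
    return x0, (x - x0) % P

def reconstruct(x0, x1):
    return (x0 + x1) % P

def beaver_triple():
    """Offline phase stand-in: produce shares of a, b, c with c = a*b.
    In FssNN this correlated randomness comes from a distributed
    key-generation/offline protocol, not a local dealer as here."""
    a, b = random.randrange(P), random.randrange(P)
    return share(a), share(b), share((a * b) % P)

def secure_mul(x_sh, y_sh):
    """Online phase: one element of a Hadamard product on shared inputs."""
    (a0, a1), (b0, b1), (c0, c1) = beaver_triple()
    # Each party opens its masked share; e = x - a and f = y - b are public.
    e = reconstruct((x_sh[0] - a0) % P, (x_sh[1] - a1) % P)
    f = reconstruct((y_sh[0] - b0) % P, (y_sh[1] - b1) % P)
    # Each party computes its share of x*y locally; only party 0 adds e*f.
    z0 = (c0 + e * b0 + f * a0 + e * f) % P
    z1 = (c1 + e * b1 + f * a1) % P
    return z0, z1

x_sh, y_sh = share(6), share(7)
assert reconstruct(*secure_mul(x_sh, y_sh)) == 42
```

Only the two masked openings travel over the network in the online phase; all other arithmetic is local, which is why the communication cost of linear and bilinear layers stays small.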
Note: This is an updated version with more comparisons with other works and more experiment results.
Metadata
- Category
- Cryptographic protocols
- Publication info
- Preprint.
- Keywords
- Privacy-preserving neural network
- Secure multi-party computation
- Additive secret sharing
- Function secret sharing
- Contact author(s)
- stuyangpeng @ stu hit edu cn
- zoeljiang @ hit edu cn
- History
- 2023-10-12: last of 2 revisions
- 2023-01-22: received
- Short URL
- https://ia.cr/2023/073
- License
- CC BY
BibTeX
@misc{cryptoeprint:2023/073,
      author = {Peng Yang and Zoe L. Jiang and Shiqi Gao and Jiehang Zhuang and Hongxiao Wang and Junbin Fang and Siuming Yiu and Yulin Wu and Xuan Wang},
      title = {FssNN: Communication-Efficient Secure Neural Network Training via Function Secret Sharing},
      howpublished = {Cryptology ePrint Archive, Paper 2023/073},
      year = {2023},
      note = {\url{https://eprint.iacr.org/2023/073}},
      url = {https://eprint.iacr.org/2023/073}
}