Paper 2023/073
FssNN: Communication-Efficient Secure Neural Network Training via Function Secret Sharing
Abstract
This paper proposes FssNN, a communication-efficient secure two-party computation framework for evaluating privacy-preserving neural networks via function secret sharing (FSS) in the semi-honest adversary setting. In FssNN, two parties holding secret-shared input data perform secure linear computations using additive secret sharing and non-linear computations using FSS, and obtain secret shares of the model parameters without disclosing their input data. To decrease communication cost, we split the protocol into an online and an offline phase: input-independent correlated randomness is generated in the offline phase, while only lightweight ``non-cryptographic'' computations are executed in the online phase. Specifically, we propose $\mathsf{BitXA}$ to reduce online communication in linear computation, and $\mathsf{DCF}$ (distributed comparison function) to reduce the key size of the FSS scheme used in the offline phase for non-linear computation. To further support neural network training, we enlarge the input size of the neural network to $2^{32}$ via an ``MPC-friendly'' PRG. We implement the framework in Python and evaluate the end-to-end system for private training between two parties on standard neural networks. On the MNIST dataset, FssNN achieves an accuracy of 98.0%, with a communication cost of 27.52 GB and a runtime of 0.23 h per epoch in the LAN setting. This shows that our work advances the state-of-the-art secure computation protocols for neural networks.
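The abstract describes an online/offline split over additive secret sharing in the ring $\mathbb{Z}_{2^{32}}$. As a rough, self-contained illustration of that split (this is not the paper's $\mathsf{BitXA}$ protocol or its DCF-based FSS gates; the sharing and Beaver-triple multiplication below are standard textbook building blocks used here only for exposition), the following Python sketch shows how correlated randomness produced in an offline phase lets the online phase run with only cheap local arithmetic and two openings.

```python
# Illustrative sketch only: plain additive secret sharing over Z_{2^32}
# with a Beaver-triple-style offline/online split. This is NOT FssNN's
# BitXA or DCF-based FSS construction; it only conveys the phase split.
import secrets

RING = 1 << 32  # ring size 2^32, matching the input domain in the abstract

def share(x):
    """Split x into two additive shares: x = x0 + x1 (mod 2^32)."""
    x0 = secrets.randbelow(RING)
    x1 = (x - x0) % RING
    return x0, x1

def reconstruct(x0, x1):
    return (x0 + x1) % RING

# --- Offline phase: input-independent correlated randomness (a Beaver triple) ---
a, b = secrets.randbelow(RING), secrets.randbelow(RING)
c = (a * b) % RING
a_sh, b_sh, c_sh = share(a), share(b), share(c)

# --- Online phase: lightweight "non-cryptographic" computation on shares ---
def mul_online(x_sh, y_sh):
    """Multiply secret-shared x and y using the precomputed triple (a, b, c)."""
    # Each party locally masks its shares ...
    e_sh = [(x_sh[i] - a_sh[i]) % RING for i in range(2)]
    f_sh = [(y_sh[i] - b_sh[i]) % RING for i in range(2)]
    # ... and the masked values e = x - a, f = y - b are opened; they are
    # uniformly random and reveal nothing about x or y.
    e, f = reconstruct(*e_sh), reconstruct(*f_sh)
    # Each party computes its share of x*y locally:
    # x*y = c + e*b + f*a + e*f (mod 2^32).
    z_sh = []
    for i in range(2):
        z_i = (c_sh[i] + e * b_sh[i] + f * a_sh[i]) % RING
        if i == 0:
            z_i = (z_i + e * f) % RING  # the public term e*f is added by one party
        z_sh.append(z_i)
    return z_sh

# Example: 7 * 6 = 42 recovered from the output shares
x_sh, y_sh = share(7), share(6)
z_sh = mul_online(x_sh, y_sh)
assert reconstruct(*z_sh) == 42
```

In FssNN itself, the non-linear gates needed for training are instead realized with FSS keys (DCF) generated in the same offline phase; the sketch only conveys the division of labor between the two phases described in the abstract.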
Metadata
- Available format(s)
- PDF
- Category
- Cryptographic protocols
- Publication info
- Preprint.
- Keywords
- Privacy-preserving neural network
- Secure multi-party computation
- Additive secret sharing
- Function secret sharing
- Contact author(s)
-
stuyangpeng @ stu hit edu cn
zoeljiang @ hit edu cn
200111514 @ stu hit edu cn
22S051007 @ stu hit edu cn
hxwang @ cs hku hk
tjunbinfang @ jnu edu cn
smyiu @ cs hku hk
wuyulin @ hit edu cn
- History
- 2023-01-23: approved
- 2023-01-22: received
- Short URL
- https://ia.cr/2023/073
- License
- CC BY
BibTeX
@misc{cryptoeprint:2023/073,
      author = {Peng Yang and Zoe L. Jiang and Shiqi Gao and Jiehang Zhuang and Hongxiao Wang and Junbin Fang and Siuming Yiu and Yulin Wu},
      title = {FssNN: Communication-Efficient Secure Neural Network Training via Function Secret Sharing},
      howpublished = {Cryptology ePrint Archive, Paper 2023/073},
      year = {2023},
      note = {\url{https://eprint.iacr.org/2023/073}},
      url = {https://eprint.iacr.org/2023/073}
}