Paper 2024/1833

Private Neural Network Training with Packed Secret Sharing

Hengcheng Zhou, Shanghai Jiao Tong University
Abstract

We present a novel approach for training neural networks that leverages the packed Shamir secret sharing scheme. For specific training protocols based on the Shamir scheme, we demonstrate how to realize the conversion between packed sharing and Shamir sharing without additional communication overhead. We begin by introducing a method to locally convert between Shamir sharings whose secrets are stored at different slots. Building on this conversion, we achieve free conversion from packed sharing to Shamir sharing. We then show how to embed the conversion from Shamir sharing to packed sharing into the truncation step used during training, again without incurring additional communication costs. With free conversion between packed sharing and Shamir sharing, we illustrate how to use the packed scheme to parallelize certain computational steps in neural network training. On this basis, we propose training protocols with information-theoretic security among general $n$ parties in the semi-honest model. The experimental results demonstrate that, compared to previous work in this domain, applying the packed scheme effectively improves training efficiency. Specifically, when packing $4$ secrets into a single sharing, we observe a reduction of more than $20\%$ in communication overhead and an improvement of over $10\%$ in training speed in the WAN setting.
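To give intuition for the underlying primitive, the following is a minimal sketch of packed Shamir secret sharing: $k$ secrets are placed at $k$ distinct evaluation points of a single random polynomial of degree $k+t-1$, and shares are evaluations at the parties' points. This is an illustrative toy in Python, not the paper's protocol; the field size, evaluation points, and function names are chosen here purely for demonstration.

```python
# Toy packed Shamir secret sharing (illustrative sketch only; not the paper's
# protocol). k secrets are embedded at points -1, -2, ..., -k of a random
# polynomial of degree k + t - 1 over a prime field; party i holds the
# evaluation at point i. Any k + t shares reconstruct all k secrets, while
# any t shares reveal nothing about them.
import random

P = 2**61 - 1  # prime modulus, chosen here for illustration


def _lagrange_interpolate(points, x):
    """Evaluate the unique polynomial through `points` at `x` (mod P)."""
    total = 0
    for i, (xi, yi) in enumerate(points):
        num, den = 1, 1
        for j, (xj, _) in enumerate(points):
            if i != j:
                num = num * (x - xj) % P
                den = den * (xi - xj) % P
        total = (total + yi * num * pow(den, -1, P)) % P
    return total


def packed_share(secrets, n, t):
    """Pack len(secrets) secrets into one sharing among n parties,
    private against t corruptions (polynomial degree k + t - 1)."""
    k = len(secrets)
    # Secrets sit at points -1, ..., -k; t extra random points fix the
    # remaining degrees of freedom.
    points = [(-(i + 1) % P, s % P) for i, s in enumerate(secrets)]
    points += [(P - k - (i + 1), random.randrange(P)) for i in range(t)]
    # Party i's share is the evaluation at point i.
    return [_lagrange_interpolate(points, x) for x in range(1, n + 1)]


def packed_reconstruct(share_points, k):
    """Recover the k packed secrets from k + t (index, share) pairs."""
    return [_lagrange_interpolate(share_points, -(i + 1) % P) for i in range(k)]


if __name__ == "__main__":
    n, t, k = 8, 2, 4
    secrets = [11, 22, 33, 44]
    shares = packed_share(secrets, n, t)
    subset = list(zip(range(1, n + 1), shares))[: k + t]
    assert packed_reconstruct(subset, k) == secrets
```

Because one sharing now carries $k$ secrets, linear operations and openings are amortized over $k$ values, which is the source of the communication savings reported in the abstract; the paper's contribution lies in converting between this packed form and ordinary Shamir sharing at no extra communication cost.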

Metadata
Available format(s)
PDF
Category
Applications
Publication info
Published elsewhere. COCOON 2024
Keywords
Secure multi-party computation, Packed Shamir secret sharing scheme, Neural network training
Contact author(s)
zhc12345 @ sjtu edu cn
History
2024-11-08: revised
2024-11-08: received
Short URL
https://ia.cr/2024/1833
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2024/1833,
      author = {Hengcheng Zhou},
      title = {Private Neural Network Training with Packed Secret Sharing},
      howpublished = {Cryptology {ePrint} Archive, Paper 2024/1833},
      year = {2024},
      url = {https://eprint.iacr.org/2024/1833}
}