Paper 2023/1917
Regularized PolyKervNets: Optimizing Expressiveness and Efficiency for Private Inference in Deep Neural Networks
Abstract
Private computation of nonlinear functions, such as Rectified Linear Units (ReLUs) and max-pooling operations, in deep neural networks (DNNs) poses significant challenges in terms of storage, bandwidth, and time consumption. To address these challenges, there has been growing interest in privacy-preserving techniques that replace traditional ReLUs with polynomial activation functions and kernelized convolutions. However, these alternatives typically trade model accuracy for faster private inference (PI), and when applied to much deeper networks they encounter training instabilities, leading to issues such as exploding gradients (resulting in NaNs) or suboptimal approximations. In this study, we focus on PolyKervNets, a technique known to offer improved dynamic approximations in smaller networks but which still faces instabilities in larger and more complex networks. Our primary objective is to empirically explore optimization-based training recipes that enhance the performance of PolyKervNets in larger networks. By doing so, we aim to eliminate the need for traditional nonlinear activation functions, thereby advancing the state of the art in privacy-preserving deep neural network architectures.
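To make the object of study concrete: a PolyKerv layer replaces the usual convolution-plus-ReLU pair with a single polynomial-kernel ("kervolutional") convolution of the form k(x, w) = (⟨x, w⟩ + c)^p, so that the whole network is a polynomial and PI protocols never have to evaluate a non-polynomial function. Below is a minimal PyTorch sketch under that assumption; the class and parameter names (`PolyKerv2d`, `balance`, `power`) are illustrative, not the paper's actual implementation.

```python
import torch
import torch.nn as nn


class PolyKerv2d(nn.Module):
    """Sketch of a polynomial-kernel convolution, k(x, w) = (<x, w> + c)^p.

    Replaces a conv + ReLU pair with one polynomial response, so private
    inference avoids non-polynomial nonlinearities entirely. Names and
    defaults here are illustrative assumptions, not the paper's code.
    """

    def __init__(self, in_ch, out_ch, kernel_size, stride=1, padding=0,
                 balance=1.0, power=2):
        super().__init__()
        # A plain linear convolution computes the inner products <x, w>.
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size,
                              stride=stride, padding=padding, bias=False)
        # Learnable additive constant c of the polynomial kernel.
        self.balance = nn.Parameter(torch.tensor(float(balance)))
        # Polynomial degree p; degree 2 is the typical low-degree choice.
        self.power = power

    def forward(self, x):
        # (conv(x) + c) ** p: raising activations to the p-th power is
        # exactly what compounds across layers and produces the exploding
        # gradients (NaNs) in deeper networks described in the abstract.
        return (self.conv(x) + self.balance) ** self.power
```

Read this way, the "regularized" in the title refers to optimization-side controls during training that keep the stacked polynomial responses from blowing up with depth; the abstract presents these as empirical training recipes rather than a single fixed formula.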
Metadata
- Category: Applications
- Publication info: Preprint
- Keywords: Privacy Preserving Machine Learning, Polynomial Approximations, Deep Neural Networks, Homomorphic Encryption, MPC
- Contact author(s): toluwani aremu @ mbzuai ac ae
- History: 2023-12-14 received; 2023-12-19 revised
- Short URL: https://ia.cr/2023/1917
- License: CC BY
BibTeX
@misc{cryptoeprint:2023/1917,
      author = {Toluwani Aremu},
      title = {Regularized {PolyKervNets}: Optimizing Expressiveness and Efficiency for Private Inference in Deep Neural Networks},
      howpublished = {Cryptology {ePrint} Archive, Paper 2023/1917},
      year = {2023},
      url = {https://eprint.iacr.org/2023/1917}
}