Paper 2020/961

Enable Dynamic Parameters Combination to Boost Linear Convolutional Neural Network for Sensitive Data Inference

Qizheng Wang, Wenping Ma, Jie Li, and Ge Liu

Abstract

As cloud computing matures, Machine Learning as a Service (MLaaS) has received increasing attention. In many scenarios, sensitive data also needs MLaaS, yet it must not be exposed to others, which creates a dilemma. To resolve this dilemma, many works have proposed privacy-preserving machine learning frameworks. Compared with plaintext tasks, ciphertext inference incurs higher computation and communication overhead. Beyond the difficulties of ciphertext arithmetic itself, the nonlinear activation functions in machine learning models are unfriendly to Homomorphic Encryption (HE) and Secure Multi-Party Computation (MPC). Nonlinear activation functions effectively improve network performance, so the high overhead they bring seems unavoidable. To address this problem, this paper re-examines, from another perspective, the mechanism of the nonlinear activation function in forward propagation and, based on this observation, proposes a dynamic parameters combination scheme, called DPC, as an alternative. DPC allows the nonlinear and linear operations in a neural network to be decoupled. This work further exploits this property to design an HE-based framework and an MPC-based framework in which the nonlinear operations are completed locally by the user through pre-computation, greatly improving the efficiency of privacy-preserving data prediction. The evaluation shows that linear neural networks with DPC achieve high accuracy. Without other optimizations, the HE-based framework proposed in this work runs 2x faster than CryptoNets, relying only on the advantage of DPC. The MPC-based framework proposed in this work achieves efficiency similar to plaintext prediction and has advantages over other work in both communication and computational complexity.

Note: This work reinterprets the mechanism of the nonlinear activation function in the forward propagation of a neural network and proposes a dual-network model structure that decouples linear and nonlinear operations. This model is well suited to outsourced prediction on sensitive data and can be flexibly combined with secure multi-party computation protocols and homomorphic encryption schemes. Based on MPC and HE respectively, we propose two frameworks for privacy-protected data prediction. The MPC-based framework uses only basic additive secret sharing; with a few tricks, the efficiency of the oblivious inference task reaches the same order of magnitude as plaintext inference.
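Since the MPC-based framework relies only on basic additive secret sharing, the core primitive is worth illustrating. The sketch below is not the paper's protocol; it is a minimal two-party additive secret sharing example over the ring of integers mod 2^32 (an assumed modulus chosen for illustration), showing why linear layers can be evaluated share-wise without any interaction between the parties:

```python
import secrets

MOD = 2**32  # ring size; illustrative choice, not taken from the paper

def share(x):
    """Split integer x into two additive shares that sum to x mod MOD."""
    s0 = secrets.randbelow(MOD)        # uniformly random share for party 0
    s1 = (x - s0) % MOD                # complementary share for party 1
    return s0, s1

def reconstruct(s0, s1):
    """Recombine the two shares to recover the secret."""
    return (s0 + s1) % MOD

def add_shares(a, b):
    """Each party adds its own shares locally; no communication needed."""
    return tuple((ai + bi) % MOD for ai, bi in zip(a, b))

def scale_share(c, a):
    """Multiplication by a public constant is also local."""
    return tuple((c * ai) % MOD for ai in a)

x, y = 7, 12
sx, sy = share(x), share(y)
assert reconstruct(*add_shares(sx, sy)) == (x + y) % MOD   # shared addition
assert reconstruct(*scale_share(3, sx)) == (3 * x) % MOD   # public scaling
```

Because a linear layer is built entirely from additions and multiplications by public (or pre-shared) weights, it composes from these local operations; it is the nonlinear activations that would normally force extra interaction, which is the cost the DPC design moves into the user's local pre-computation.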

Metadata
Available format(s)
PDF
Category
Applications
Publication info
Preprint. MINOR revision.
Keywords
Cloud Computing, Machine Learning, Privacy Protection, Activation Function
Contact author(s)
574370699 @ qq.com
History
2020-08-11: received
Short URL
https://ia.cr/2020/961
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2020/961,
      author = {Qizheng Wang and Wenping Ma and Jie Li and Ge Liu},
      title = {Enable Dynamic Parameters Combination to Boost Linear Convolutional Neural Network for Sensitive Data Inference},
      howpublished = {Cryptology {ePrint} Archive, Paper 2020/961},
      year = {2020},
      url = {https://eprint.iacr.org/2020/961}
}