Cryptology ePrint Archive: Report 2021/1688

Low-Complexity Deep Convolutional Neural Networks on Fully Homomorphic Encryption Using Multiplexed Parallel Convolutions

Eunsang Lee and Joon-Woo Lee and Junghyun Lee and Young-Sik Kim and Yongjune Kim and Jong-Seon No and Woosuk Choi

Abstract: Recently, the standard ResNet-20 network was successfully implemented on the residue number system variant Cheon-Kim-Kim-Song (RNS-CKKS) scheme using bootstrapping, but that implementation lacks practicality due to its high latency and low security level. To improve the performance, we first minimize the total bootstrapping runtime using a multiplexed parallel convolution that compactly collects the sparse output data of multiple channels. We also propose \emph{imaginary-removing bootstrapping}, which prevents deep neural networks from catastrophically diverging during approximate ReLU operations. In addition, we optimize level consumption and use lighter and tighter parameters. Simulation results show 4.67$\times$ lower inference latency and 134$\times$ less amortized runtime (runtime per image) for ResNet-20 compared to the state-of-the-art previous work, while achieving the standard 128-bit security level. Furthermore, we successfully implement ResNet-110 with high accuracy on the RNS-CKKS scheme for the first time.
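The multiplexing idea mentioned in the abstract can be illustrated in plain Python. This is a hypothetical sketch, not the paper's code: it ignores ciphertexts and homomorphic rotations entirely and only shows the packing pattern itself, i.e., how several per-channel vectors that are each valid at only every `stride`-th slot (as after a strided convolution) can be interleaved into one dense vector so that a single ciphertext, and thus a single bootstrapping, serves multiple channels.

```python
# Hypothetical illustration of multiplexed packing (not the paper's
# actual rotation-based implementation on RNS-CKKS ciphertexts).

def multiplex_pack(channels, stride):
    """Interleave `stride` sparse vectors, each holding valid data only
    at every `stride`-th slot, into one compact vector of the same length."""
    # The gaps left by the stride exactly fit the other channels.
    assert len(channels) == stride
    n = len(channels[0])
    packed = [0] * n
    for c, ch in enumerate(channels):
        for i in range(0, n, stride):
            # Channel c's valid slot i lands at offset c within each group.
            packed[i + c] = ch[i]
    return packed

# Two channels with valid data in even slots only -> one dense vector.
print(multiplex_pack([[1, 0, 2, 0], [3, 0, 4, 0]], stride=2))
```

On encrypted data the same rearrangement would be realized with slot rotations and masking rather than indexing, but the payoff is the same: dense slot utilization, so fewer ciphertexts need to be bootstrapped.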

Category / Keywords: applications / Artificial intelligence, Cheon-Kim-Kim-Song (CKKS), Convolution, Fully homomorphic encryption (FHE), Privacy-preserving machine learning (PPML), Residue number system variant Cheon-Kim-Kim-Song (RNS-CKKS), ResNet model

Date: received 23 Dec 2021, last revised 10 Feb 2022

Contact author: eslee3209 at ccl snu ac kr, shaeunsang at snu ac kr, jsno at snu ac kr, iamyskim at Chosun ac kr, yjk at dgist ac kr, joonwoo42 at snu ac kr, jhlee at ccl snu ac kr, woosuk0 choi at samsung com

Available format(s): PDF | BibTeX Citation

Version: 20220210:055942 (All versions of this report)
