Cryptology ePrint Archive: Report 2021/1688

Low-Complexity Deep Convolutional Neural Networks on Fully Homomorphic Encryption Using Multiplexed Convolutions

Eunsang Lee and Joon-Woo Lee and Junghyun Lee and Young-Sik Kim and Yongjune Kim and Jong-Seon No and Woosuk Choi

Abstract: Privacy-preserving machine learning on fully homomorphic encryption (FHE) is one of the most influential applications of FHE. Recently, Lee et al. [16] implemented the standard ResNet-20 model for the CIFAR-10 dataset with the residue number system variant Cheon-Kim-Kim-Song (RNS-CKKS) scheme, one of the most promising FHE schemes, for the first time. However, their implementation requires a large number of key-switching operations, the heaviest operation in the RNS-CKKS scheme, and should therefore be improved. To reduce the number of key-switching operations, neural networks should be evaluated on the RNS-CKKS scheme while utilizing the full slots of an RNS-CKKS ciphertext as much as possible. In particular, since the packing density drops to 1/4 whenever a convolution of stride two is performed, a convolution that maintains the packing density of the data is required. Moreover, when bootstrapping is needed, it is desirable to use sparse-slot bootstrapping, which requires fewer key-switching operations than full-slot bootstrapping. In this paper, we propose a new packing method that multiplexes the tensors of multiple channels into one tensor. We then propose a new convolution method that outputs a multiplexed tensor for a multiplexed input tensor, which makes it possible to maintain a high packing density throughout the entire ResNet network despite strided convolutions. In addition, we propose a method that performs convolutions for multiple output channels in parallel using repeatedly packed input data, which reduces the running time of convolution. Further, we fine-tune the parameters to reach the standard 128-bit security level and to further reduce the number of bootstrapping operations. As a result, the number of key-switching operations is reduced to 1/107 of that of Lee et al.'s implementation of the ResNet-20 model on the RNS-CKKS scheme. The proposed method classifies one CIFAR-10 image in about 37 minutes with only one thread, compared to 3 hours with 64 threads for Lee et al.'s implementation. Furthermore, we implement the ResNet-32/44/56/110 models on the RNS-CKKS scheme for the first time, with running time linear in the number of layers, which is generally difficult to expect in leveled homomorphic encryption. Finally, we successfully classify the CIFAR-100 dataset on the RNS-CKKS scheme for the first time using the standard ResNet-32 model, obtaining a running time of 3,942 s and an accuracy of 69.4%, close to the 69.5% accuracy of the backbone network.
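To make the multiplexed-packing idea concrete, the following is a minimal plaintext-side sketch in Python/NumPy of how k*k downsampled channels could be interleaved ("multiplexed") into a single grid so that slot usage stays at 100% after a stride-k convolution. The function names and the exact interleaving layout are illustrative assumptions for exposition only; the paper's actual method operates on encrypted RNS-CKKS ciphertexts via rotations and is not reproduced here.

```python
# Plaintext-side illustration (an assumed layout, not the authors' code) of
# multiplexed packing: after a stride-2 convolution, each channel shrinks to
# (H/2, W/2), so four such channels can be interleaved into one (H, W) grid,
# keeping the ciphertext slots fully used instead of 1/4 used.

import numpy as np


def multiplex_pack(channels, k=2):
    """Interleave k*k channels of shape (H//k, W//k) into one (H, W) grid."""
    assert len(channels) == k * k
    h, w = channels[0].shape
    grid = np.zeros((k * h, k * w), dtype=channels[0].dtype)
    for c, ch in enumerate(channels):
        di, dj = divmod(c, k)      # sub-position of channel c inside each k x k cell
        grid[di::k, dj::k] = ch    # channel c occupies every k-th row and column
    return grid


def multiplex_unpack(grid, k=2):
    """Inverse of multiplex_pack: recover the k*k individual channels."""
    return [grid[di::k, dj::k] for di in range(k) for dj in range(k)]


if __name__ == "__main__":
    # Four 16x16 feature maps, e.g., outputs of a stride-2 convolution on 32x32 inputs.
    chans = [np.random.randn(16, 16) for _ in range(4)]
    packed = multiplex_pack(chans)  # one 32x32 grid, flattened into ciphertext slots
    assert all(np.array_equal(a, b) for a, b in zip(chans, multiplex_unpack(packed)))
    print("packed shape:", packed.shape, "-> slot usage stays at 100%")
```

In the homomorphic setting, a convolution on such a multiplexed tensor must again output a multiplexed tensor, which is what allows the packing density to be maintained across all strided layers of the ResNet models described in the abstract.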

Category / Keywords: applications / Artificial intelligence, Cheon-Kim-Kim-Song (CKKS), Convolution, Fully homomorphic encryption (FHE), Privacy-preserving machine learning (PPML), Residue number system variant Cheon-Kim-Kim-Song (RNS-CKKS), ResNet model

Date: received 23 Dec 2021, last revised 31 Dec 2021

Contact author: eslee3209 at ccl snu ac kr, shaeunsang at snu ac kr

Available format(s): PDF | BibTeX Citation

Version: 20211231:004721 (All versions of this report)

Short URL: ia.cr/2021/1688
