Paper 2017/035

Privacy-Preserving Classification on Deep Neural Network

Hervé Chabanne, Amaury de Wargny, Jonathan Milgram, Constance Morel, and Emmanuel Prouff

Abstract

Neural Networks (NN) are today increasingly used in Machine Learning, where they have become deeper and deeper to accurately model or classify high-level abstractions of data. Their development, however, also gives rise to important data privacy risks. This observation motivated Microsoft researchers to propose a framework, called Cryptonets. The core idea is to combine simplifications of the NN with Fully Homomorphic Encryption (FHE) techniques to obtain both confidentiality of the manipulated data and efficiency of the processing. While efficiency and accuracy are demonstrated when the number of non-linear layers is small (e.g., $2$), Cryptonets unfortunately becomes ineffective for deeper NNs, which leaves the problem of privacy-preserving matching open in these contexts. This work successfully addresses this problem by combining the original ideas of the Cryptonets solution with the batch normalization principle introduced at ICML 2015 by Ioffe and Szegedy. We experimentally validate the soundness of our approach with a neural network with $6$ non-linear layers. When applied to the MNIST database, it competes with the accuracy of the best non-secure versions, thus significantly improving over Cryptonets.
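
The core idea lends itself to a small illustration. Below is a minimal, non-authoritative sketch in Python/NumPy: the ReLU activation is replaced by a low-degree polynomial so that only additions and multiplications remain (the operations an FHE scheme can evaluate), and batch normalization keeps the activation inputs in a narrow range where that approximation stays accurate. The degree ($2$) and fitting interval ($[-4, 4]$) below are hypothetical choices for illustration; the paper's actual polynomials and intervals may differ.

```python
import numpy as np

# Hypothetical choices for illustration: degree-2 least-squares fit of ReLU on [-4, 4].
xs = np.linspace(-4.0, 4.0, 1000)
coeffs = np.polyfit(xs, np.maximum(xs, 0.0), deg=2)

def poly_relu(x):
    """FHE-friendly stand-in for ReLU: only additions and multiplications."""
    return np.polyval(coeffs, x)

def batch_norm(x, eps=1e-5):
    """Normalize activations so they stay in the range where poly_relu is accurate."""
    return (x - x.mean(axis=0)) / np.sqrt(x.var(axis=0) + eps)

# Toy forward pass (in the clear): dense layer -> batch norm -> polynomial activation.
rng = np.random.default_rng(0)
x = rng.normal(size=(32, 16))        # a batch of 32 inputs
w = rng.normal(size=(16, 8)) * 0.1   # weights of one dense layer
h = batch_norm(x @ w)
print(poly_relu(h).shape)            # (32, 8)
```

Stacking several such normalized polynomial layers is what the approach relies on to reach deeper networks without the approximation error blowing up.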

Note: Presented at Real World Cryptography 2017

Metadata
Available format(s): PDF
Category: Applications
Publication info: Preprint. MINOR revision.
Keywords: Machine Learning, FHE
Contact author(s): emmanuel prouff @ safrangroup com
History: 2017-03-24: last of 3 revisions; 2017-01-13: received
Short URL: https://ia.cr/2017/035
License: Creative Commons Attribution (CC BY)

BibTeX

@misc{cryptoeprint:2017/035,
      author = {Hervé Chabanne and Amaury de Wargny and Jonathan Milgram and Constance Morel and Emmanuel Prouff},
      title = {Privacy-Preserving Classification on Deep Neural Network},
      howpublished = {Cryptology {ePrint} Archive, Paper 2017/035},
      year = {2017},
      url = {https://eprint.iacr.org/2017/035}
}