Paper 2021/324

Private AI: Machine Learning on Encrypted Data

Kristin E. Lauter


As the world adopts Artificial Intelligence (AI), the privacy risks are many. AI can improve our lives, but may leak or misuse our private data. Private AI is based on Homomorphic Encryption (HE), an encryption paradigm that allows the cloud to operate on private data in encrypted form, without ever decrypting it, enabling private training and private prediction with AI algorithms. The 2016 ICML CryptoNets paper [26] demonstrated for the first time the evaluation of neural-network predictions on homomorphically encrypted data, and opened new research directions combining machine learning and cryptography. The security of Homomorphic Encryption is based on hard mathematical problems involving lattices, a candidate for post-quantum cryptography. This paper gives an overview of my Invited Plenary Lecture at the International Congress of Industrial and Applied Mathematics (ICIAM), explaining Homomorphic Encryption, Private AI, and real-world applications.
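To illustrate the core idea of operating on encrypted data without decrypting it, here is a toy example using the Paillier cryptosystem, which is additively homomorphic. This is a sketch for intuition only: Paillier is not one of the lattice-based schemes the paper discusses, and the demonstration-sized primes below are completely insecure.

```python
import math
import random

def keygen(p=2357, q=2551):
    """Generate a Paillier key pair. The default primes are tiny demo
    values; real deployments use primes of ~1024 bits or more."""
    n = p * q
    lam = math.lcm(p - 1, q - 1)
    g = n + 1
    # mu = (L(g^lam mod n^2))^{-1} mod n, where L(x) = (x - 1) // n
    x = pow(g, lam, n * n)
    mu = pow((x - 1) // n, -1, n)
    return (n, g), (lam, mu)

def encrypt(pk, m):
    """Encrypt m under public key pk: c = g^m * r^n mod n^2."""
    n, g = pk
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pk, sk, c):
    """Decrypt c with secret key sk: m = L(c^lam mod n^2) * mu mod n."""
    n, _ = pk
    lam, mu = sk
    x = pow(c, lam, n * n)
    return ((x - 1) // n * mu) % n

pk, sk = keygen()
c1, c2 = encrypt(pk, 3), encrypt(pk, 4)
# Multiplying ciphertexts adds the underlying plaintexts -- the server
# computes on encrypted values without ever seeing 3, 4, or 7.
c_sum = (c1 * c2) % (pk[0] ** 2)
assert decrypt(pk, sk, c_sum) == 7
```

Paillier supports only addition of encrypted values (and multiplication by known constants); the lattice-based schemes behind CryptoNets additionally support multiplication of two ciphertexts, which is what makes encrypted neural-network evaluation possible.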

Publication info: Published elsewhere. Minor revision; to appear in the Proceedings of ICIAM.
Contact author(s): kristinelauter@gmail.com
2021-03-11: received
License: Creative Commons Attribution


@misc{cryptoeprint:2021/324,
      author = {Kristin E. Lauter},
      title = {Private AI: Machine Learning on Encrypted Data},
      howpublished = {Cryptology ePrint Archive, Paper 2021/324},
      year = {2021},
      note = {\url{https://eprint.iacr.org/2021/324}},
      url = {https://eprint.iacr.org/2021/324}
}