
Paper 2021/324

Private AI: Machine Learning on Encrypted Data

Kristin E. Lauter

Abstract

As the world adopts Artificial Intelligence (AI), the privacy risks are many. AI can improve our lives, but may leak or misuse our private data. Private AI is based on Homomorphic Encryption (HE), a new encryption paradigm that allows the cloud to operate on private data in encrypted form, without ever decrypting it, enabling private training and private prediction with AI algorithms. The 2016 ICML CryptoNets paper [26] demonstrated for the first time the evaluation of neural network predictions on homomorphically encrypted data, and opened new research directions combining machine learning and cryptography. The security of Homomorphic Encryption is based on hard mathematical problems involving lattices, a candidate for post-quantum cryptography. This paper gives an overview of my Invited Plenary Lecture at the International Congress of Industrial and Applied Mathematics (ICIAM), explaining Homomorphic Encryption, Private AI, and real-world applications.
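To illustrate the core idea of operating on encrypted data without decrypting it, the following is a minimal sketch using the Paillier cryptosystem, a classical additively homomorphic scheme. This is not one of the lattice-based schemes the paper discusses (such as those underlying CryptoNets), and the parameters here are toy-sized and completely insecure; it only demonstrates the homomorphic property, where combining two ciphertexts yields an encryption of the sum of their plaintexts.

```python
import math
import random

# Toy Paillier cryptosystem (illustration only -- tiny, insecure parameters).
# Paillier is additively homomorphic: multiplying ciphertexts modulo n^2
# corresponds to adding the underlying plaintexts modulo n.
p, q = 17, 19                 # toy primes; real deployments use large primes
n = p * q                     # public modulus
n2 = n * n
g = n + 1                     # standard generator choice g = n + 1
lam = math.lcm(p - 1, q - 1)  # private key lambda = lcm(p-1, q-1)

def L(x):
    """The Paillier L-function: L(x) = (x - 1) / n."""
    return (x - 1) // n

# mu = (L(g^lambda mod n^2))^{-1} mod n, the second private-key component.
mu = pow(L(pow(g, lam, n2)), -1, n)

def encrypt(m):
    """Encrypt plaintext m (0 <= m < n) with fresh randomness r."""
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """Recover the plaintext from ciphertext c."""
    return (L(pow(c, lam, n2)) * mu) % n

# Homomorphic addition: the "cloud" multiplies ciphertexts and never
# sees the plaintexts 20 and 22, yet the owner of the key decrypts 42.
c1, c2 = encrypt(20), encrypt(22)
print(decrypt((c1 * c2) % n2))  # -> 42
```

Fully homomorphic schemes go further, supporting both encrypted addition and encrypted multiplication, which is what makes evaluating neural network predictions on ciphertexts possible.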

Metadata
Available format(s)
PDF
Publication info
Published elsewhere. Minor revision. To appear in the Proceedings of ICIAM.
Contact author(s)
kristinelauter @ gmail com
History
2021-03-11: received
Short URL
https://ia.cr/2021/324
License
Creative Commons Attribution
CC BY