Paper 2023/701
Differential Privacy for Free? Harnessing the Noise in Approximate Homomorphic Encryption
Abstract
Homomorphic Encryption (HE) is a form of cryptography that allows computation directly on encrypted data, enabling computation on sensitive data to be outsourced securely. Many popular HE schemes rely on noise for their security. Differential Privacy, on the other hand, seeks to guarantee the privacy of data subjects by obscuring any one individual's contribution to an output, and many mechanisms for achieving it involve adding appropriately calibrated noise. In this work, we investigate the extent to which the noise native to Homomorphic Encryption can provide Differential Privacy "for free". We identify the dependence of HE noise on the underlying data as a critical barrier to privacy, and derive new results on the Differential Privacy guarantees attainable under this constraint. We apply these ideas to a proof-of-concept HE application, ridge regression training using gradient descent, and achieve privacy budgets of $\varepsilon \approx 2$ after 50 iterations.
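The training procedure described in the abstract can be sketched as iterative noisy gradient descent for ridge regression. The snippet below is a minimal illustration, not the paper's method: it injects explicit Gaussian noise into each update as a stand-in for the data-dependent noise native to approximate HE, and the function name, step size, and noise scale are all assumptions made for the example.

```python
import numpy as np

def noisy_ridge_gd(X, y, lam=0.1, lr=0.1, iters=50, sigma=0.01, seed=0):
    """Ridge regression via gradient descent with per-iteration noise.

    Illustrative sketch only: here the noise is explicit Gaussian noise,
    whereas the paper analyses the noise inherent to approximate HE.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(iters):
        # Gradient of the ridge objective (1/2n)||Xw - y||^2 + (lam/2)||w||^2
        grad = (X.T @ (X @ w - y)) / n + lam * w
        # Noisy update: each iteration leaks a perturbed gradient step
        w -= lr * (grad + rng.normal(0.0, sigma, size=d))
    return w
```

With 50 iterations, as in the abstract's experiment, the iterate converges close to the ridge solution while every released update carries noise; the privacy analysis then accounts for the cumulative leakage across all iterations.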
Metadata
- Available format(s)
- Category: Applications
- Publication info: Preprint
- Keywords: Differential Privacy, Homomorphic Encryption, Machine Learning
- Contact author(s): tabitha l ogilvie @ gmail com
- History
  - 2023-05-16: received
  - 2023-06-06: revised
- Short URL: https://ia.cr/2023/701
- License: CC BY
BibTeX
@misc{cryptoeprint:2023/701,
      author = {Tabitha Ogilvie},
      title = {Differential Privacy for Free? Harnessing the Noise in Approximate Homomorphic Encryption},
      howpublished = {Cryptology {ePrint} Archive, Paper 2023/701},
      year = {2023},
      url = {https://eprint.iacr.org/2023/701}
}