Paper 2012/466
Computational Entropy and Information Leakage
Benjamin Fuller and Leonid Reyzin
Abstract
We investigate how information leakage reduces the computational entropy of a random variable X. Recall that HILL and metric computational entropy are parameterized by quality (how distinguishable X is from a variable Z that has true entropy) and quantity (how much true entropy there is in Z). We prove an intuitively natural result: conditioning on an event of probability p reduces the quality of metric entropy by a factor of p and the quantity of metric entropy by log 1/p (note that this means that the reduction in quantity and quality is the same, because the quantity of entropy is measured on a logarithmic scale). Our result improves previous bounds of Dziembowski and Pietrzak (FOCS 2008), where the loss in the *quantity* of entropy was related to its original quality. The use of metric entropy simplifies the analogous result of Reingold et al. (FOCS 2008) for HILL entropy. Further, we simplify dealing with information leakage by investigating conditional metric entropy. We show that, conditioned on leakage of λ bits, metric entropy is reduced by a factor of 2^λ in quality and by λ in quantity.
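In symbols, the two bounds read roughly as follows. This is a minimal LaTeX sketch using the common (ε, s) parameterization of metric entropy (distinguisher advantage ε, circuit size s); the entropy symbols, the threshold k, and the circuit-size loss s' are notational assumptions for illustration, not the paper's verbatim statement.

% Conditioning on an event E with Pr[E] = p: quality degrades by a
% factor of p, quantity drops by log(1/p).
H^{\mathrm{Metric}}_{\varepsilon,\, s}(X) \ge k
    \implies
H^{\mathrm{Metric}}_{\varepsilon/p,\, s'}(X \mid E) \ge k - \log(1/p)

% Leakage of a \lambda-bit value Y (conditional metric entropy):
% quality degrades by a factor of 2^\lambda, quantity drops by \lambda.
H^{\mathrm{Metric}}_{\varepsilon,\, s}(X) \ge k
    \implies
\widetilde{H}^{\mathrm{Metric}}_{\varepsilon \cdot 2^{\lambda},\, s'}(X \mid Y) \ge k - \lambda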
Note: Most of the results of this paper have been incorporated into http://eprint.iacr.org/2012/005 (for an application to deterministic encryption), but this paper has additional results as detailed on the title page.
Metadata
- Category: Foundations
- Publication info: Published elsewhere. Unknown where it was published
- Keywords: entropy, information leakage
- Contact author(s): reyzin @ cs bu edu
- History: 2012-08-18: received
- Short URL: https://ia.cr/2012/466
- License: CC BY
BibTeX
@misc{cryptoeprint:2012/466,
      author = {Benjamin Fuller and Leonid Reyzin},
      title = {Computational Entropy and Information Leakage},
      howpublished = {Cryptology {ePrint} Archive, Paper 2012/466},
      year = {2012},
      url = {https://eprint.iacr.org/2012/466}
}