Paper 2012/466

Computational Entropy and Information Leakage

Benjamin Fuller and Leonid Reyzin


We investigate how information leakage reduces computational entropy of a random variable X. Recall that HILL and metric computational entropy are parameterized by quality (how distinguishable is X from a variable Z that has true entropy) and quantity (how much true entropy is there in Z). We prove an intuitively natural result: conditioning on an event of probability p reduces the quality of metric entropy by a factor of p and the quantity of metric entropy by log 1/p (note that this means that the reduction in quantity and quality is the same, because the quantity of entropy is measured on a logarithmic scale). Our result improves previous bounds of Dziembowski and Pietrzak (FOCS 2008), where the loss in the \emph{quantity} of entropy was related to its original quality. The use of metric entropy simplifies the analogous result of Reingold et al. (FOCS 2008) for HILL entropy. Further, we simplify dealing with information leakage by investigating conditional metric entropy. We show that, conditioned on leakage of \lambda bits, metric entropy gets reduced by a factor of 2^\lambda in quality and by \lambda bits in quantity.
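Stated schematically, the two results in the abstract read roughly as follows. The notation here is an assumed sketch rather than the paper's own: H^{Metric}_{\epsilon,s} denotes metric entropy against circuits of size s with distinguishing advantage \epsilon, and s' is a (possibly smaller) circuit-size parameter.

```latex
% Sketch only; parameters assumed, not taken verbatim from the paper.
% Conditioning on an event A with Pr[A] = p costs a factor p in quality
% and log(1/p) bits in quantity:
H^{\mathrm{Metric}}_{\epsilon,\, s}(X) \ge k
  \;\Longrightarrow\;
H^{\mathrm{Metric}}_{\epsilon/p,\, s'}(X \mid A) \ge k - \log\tfrac{1}{p}

% Leakage of a \lambda-bit value Y costs a factor 2^{\lambda} in quality
% and \lambda bits in quantity:
H^{\mathrm{Metric}}_{\epsilon,\, s}(X) \ge k
  \;\Longrightarrow\;
H^{\mathrm{Metric}}_{\epsilon \cdot 2^{\lambda},\, s'}(X \mid Y) \ge k - \lambda
```

Taking p = 2^{-\lambda} in the first statement matches the second: the quality factor 1/p becomes 2^{\lambda} and the quantity loss log(1/p) becomes \lambda, which is the sense in which the two losses "are the same" on a logarithmic scale.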

Note: Most of the results of this paper have been incorporated into a later work (for an application to deterministic encryption), but this paper has additional results, as detailed on the title page.

Publication info
Published elsewhere. Unknown where it was published
Keywords: entropy, information leakage
Contact author(s)
reyzin @ cs bu edu
2012-08-18: received
Creative Commons Attribution


@misc{cryptoeprint:2012/466,
      author = {Benjamin Fuller and Leonid Reyzin},
      title = {Computational Entropy and Information Leakage},
      howpublished = {Cryptology ePrint Archive, Paper 2012/466},
      year = {2012},
      note = {\url{}},
      url = {}
}