**Computational Entropy and Information Leakage**

*Benjamin Fuller and Leonid Reyzin*

**Abstract:** We investigate how information leakage reduces the computational entropy of a random variable X. Recall that HILL and metric computational entropy are parameterized by quality (how distinguishable X is from a variable Z that has true entropy) and quantity (how much true entropy there is in Z).

We prove an intuitively natural result: conditioning on an event of probability p reduces the quality of metric entropy by a factor of p and the quantity of metric entropy by log 1/p (note that this means the reduction in quantity and quality is the same, because the quantity of entropy is measured on a logarithmic scale). Our result improves the previous bounds of Dziembowski and Pietrzak (FOCS 2008), where the loss in the *quantity* of entropy was related to its original quality. The use of metric entropy simplifies the analogous result of Reingold et al. (FOCS 2008) for HILL entropy.
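In symbols, the conditioning result can be sketched roughly as follows (a hedged paraphrase; the exact circuit-size loss s' and the precise variant of metric entropy are as specified in the paper):

```latex
% Sketch: if X has metric entropy at least k against size-s distinguishers
% with advantage \varepsilon, then conditioning on an event E of
% probability p costs a factor p in quality and log(1/p) in quantity:
H^{\mathrm{Metric}}_{\varepsilon,\, s}(X) \ge k
\;\Longrightarrow\;
H^{\mathrm{Metric}}_{\varepsilon/p,\, s'}(X \mid E) \ge k - \log\tfrac{1}{p},
% for circuit size s' close to s.
```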

Further, we simplify dealing with information leakage by investigating conditional metric entropy. We show that, conditioned on leakage of λ bits, metric entropy is reduced by a factor of 2^λ in quality and by λ in quantity.
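The leakage statement can similarly be sketched (again a hedged paraphrase; s' and the exact conditional-entropy notion, e.g. average-case versus worst-case, are as defined in the paper):

```latex
% Sketch: leaking a \lambda-bit value Y costs a factor 2^\lambda in
% quality (distinguishing advantage) and \lambda bits in quantity:
H^{\mathrm{Metric}}_{\varepsilon,\, s}(X) \ge k
\;\Longrightarrow\;
H^{\mathrm{Metric}}_{\varepsilon \cdot 2^{\lambda},\, s'}(X \mid Y) \ge k - \lambda,
\qquad Y \in \{0,1\}^{\lambda}.
```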

**Category / Keywords: **foundations / entropy, information leakage

**Date: **received 14 Aug 2012, last revised 20 Aug 2012

**Contact author: **reyzin at cs bu edu

**Available format(s): **PDF | BibTeX Citation

**Note: **Most of the results of this paper have been incorporated into http://eprint.iacr.org/2012/005 (for an application to deterministic encryption), but this paper has additional results as detailed on the title page.

**Version: **20120820:155632


[ Cryptology ePrint archive ]