**A Counterexample to the Chain Rule for Conditional HILL Entropy**

*Stephan Krenn and Krzysztof Pietrzak and Akshay Wadia and Daniel Wichs*

**Abstract: **Most entropy notions $H(\cdot)$, such as Shannon entropy and min-entropy, satisfy a chain rule stating that for random variables $X$, $Z$, and $A$ we have $H(X|Z,A)\ge H(X|Z)-|A|$. That is, conditioning on $A$ can decrease the entropy of $X$ by at most the bitlength $|A|$ of $A$.
Such chain rules are known to hold for some computational entropy notions like
Yao's and unpredictability-entropy. For HILL entropy, the computational analogue of
min-entropy, the chain rule is of special interest and has found many applications, including leakage-resilient cryptography, deterministic encryption and memory delegation.
These applications rely on restricted special cases of the chain rule. Whether the chain rule for conditional HILL entropy holds in general was an open problem, to which we give a strong negative answer: we construct joint distributions $(X,Z,A)$, where $A$ is a distribution over a *single* bit, such that the HILL entropy $H_\infty(X|Z)$ is large but $H_\infty(X|Z,A)$ is essentially zero.
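For the information-theoretic (average) conditional min-entropy, defined as $\tilde H_\infty(X|Z)=-\log \mathbb{E}_{z\leftarrow Z}\big[\max_x \Pr[X=x\mid Z=z]\big]$, the chain rule stated above has a short standard proof; a sketch (this is the argument that, as the counterexample shows, has no computational HILL analogue):

```latex
% Chain rule for average conditional min-entropy (sketch).
% The sum over a ranges over the 2^{|A|} possible values of A.
\begin{align*}
2^{-\tilde H_\infty(X\mid Z,A)}
  &= \mathbb{E}_{z}\Big[\textstyle\sum_{a}\Pr[A=a\mid Z=z]\,
        \max_{x}\Pr[X=x\mid Z=z,A=a]\Big]\\
  &= \mathbb{E}_{z}\Big[\textstyle\sum_{a}\max_{x}\Pr[X=x,A=a\mid Z=z]\Big]\\
  &\le \mathbb{E}_{z}\Big[\textstyle\sum_{a}\max_{x}\Pr[X=x\mid Z=z]\Big]
   = 2^{|A|}\cdot 2^{-\tilde H_\infty(X\mid Z)},
\end{align*}
% Taking -log of both sides yields the chain rule:
%   \tilde H_\infty(X|Z,A) >= \tilde H_\infty(X|Z) - |A|.
```

Taking logarithms gives $\tilde H_\infty(X\mid Z,A)\ge \tilde H_\infty(X\mid Z)-|A|$.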

Our counterexample makes only the minimal assumption that ${\bf NP}\nsubseteq{\bf P/poly}$. Under the stronger assumption that injective one-way functions exist, we can make all the distributions efficiently samplable.

Finally, we show that more sophisticated cryptographic objects, such as lossy functions, can be used to sample a distribution constituting a counterexample to the chain rule with only a single invocation of the underlying object.

**Category / Keywords: **foundations / Computational Entropy, HILL Entropy, Chain Rule, Lossy Functions, Deniable Encryption

**Original Publication (with major differences): **IACR-TCC-2013

**Date: **received 30 Aug 2014

**Contact author: **krzpie at gmail com

**Available format(s): **PDF | BibTeX Citation

**Version: **20140831:131423 (All versions of this report)

**Short URL: **ia.cr/2014/678
