Paper 2014/967

A Comprehensive Comparison of Shannon Entropy and Smooth Renyi Entropy

Maciej Skorski

Abstract

We provide a new result linking two crucial entropy notions: Shannon entropy $\mathrm{H}_1$ and collision entropy $\mathrm{H}_2$. Our formula gives the \emph{worst possible} amount of collision entropy in a probability distribution when its Shannon entropy is fixed. Our results, and the techniques used in the proof, immediately imply many quantitatively tight separations between Shannon and smooth Renyi entropy, which were previously known only as qualitative statements or one-sided bounds. In particular, we precisely calculate the number of bits that can be extracted from a Shannon entropy source, and calculate how far from uniform a distribution with a given amount of Shannon entropy can be. To illustrate our results we provide clear numerical examples. In the typical situation, when the gap between the Shannon entropy of a distribution and its length is bigger than $1$, the length of the extracted sequence is very small, even if we allow the randomness quality to be poor. In the case of almost full entropy, where the gap is close to $0$, the $\ell_2$-distance to uniform is roughly of the same order as the gap. Therefore, it is actually not possible to certify the strong quality of supposed true randomness, efficiently and at an extremely high confidence level, by means of Shannon entropy estimators such as Maurer's Universal Test. Our approach involves convex optimization techniques, applied to characterize worst-case distributions, and the use of the Lambert $W$ function, by which we resolve equations coming from the Shannon entropy constraints. We believe it may be of independent interest and useful in studying Shannon entropy under constraints elsewhere.
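The gap between the two entropy notions is easy to see numerically. The following is a minimal sketch (not taken from the paper) that computes Shannon entropy $\mathrm{H}_1$ and collision entropy $\mathrm{H}_2$ for an illustrative biased distribution on $n$-bit strings: one heavy outcome carrying mass $\epsilon$, with the remaining mass spread uniformly. Distributions of this shape are a standard way to separate the two notions; the specific parameters here are my own choices for illustration, not the paper's worst-case characterization.

```python
import math

def shannon_entropy(p):
    """H_1(p) = -sum p_x * log2(p_x), skipping zero-probability outcomes."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def collision_entropy(p):
    """H_2(p) = -log2 of the collision probability sum p_x^2."""
    return -math.log2(sum(x * x for x in p))

# Illustrative distribution on a 10-bit domain: mass eps on one heavy
# outcome, the rest uniform over the remaining 2**10 - 1 outcomes.
n, eps = 10, 0.5
rest = (1 - eps) / (2**n - 1)
p = [eps] + [rest] * (2**n - 1)

h1 = shannon_entropy(p)
h2 = collision_entropy(p)
print(f"H_1 = {h1:.3f} bits, H_2 = {h2:.3f} bits")
# Here H_1 is close to 6 bits while H_2 is only about 2 bits,
# so Shannon entropy alone badly overstates the extractable randomness.
```

Since extractable randomness is governed by (smooth) Renyi-type entropies rather than $\mathrm{H}_1$, examples like this one show why a high Shannon entropy estimate does not by itself certify randomness quality.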

Note: Some typos corrected

Metadata
Available format(s)
PDF
Category
Foundations
Publication info
Published elsewhere. Minor revision. IMACC 2015
Keywords
Entropy Estimators, Smooth Entropy, Randomness Extractors
Contact author(s)
maciej skorski @ gmail com
History
2015-10-08: last of 6 revisions
2014-11-28: received
Short URL
https://ia.cr/2014/967
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2014/967,
      author = {Maciej Skorski},
      title = {A Comprehensive Comparison of Shannon Entropy and Smooth Renyi Entropy},
      howpublished = {Cryptology ePrint Archive, Paper 2014/967},
      year = {2014},
      note = {\url{https://eprint.iacr.org/2014/967}},
      url = {https://eprint.iacr.org/2014/967}
}