Paper 2019/766
Complexity of Estimating Renyi Entropy of Markov Chains
Maciej Obremski and Maciej Skorski
Abstract
Estimating the entropy of random processes is one of the fundamental problems of machine learning and property testing, with applications ranging from DNA testing and predicting human behaviour to modeling neural activity and cryptography. We investigate the problem of Renyi entropy estimation for sources that form Markov chains. Kamath and Verdú (ISIT'16) showed that good mixing properties are essential for this task. We show that even with very good mixing time, estimating min-entropy requires $\Omega(K^2)$ samples and estimating collision entropy requires $\Omega(K^{3/2})$ samples, where $K$ is the alphabet size. Our results hold in both asymptotic and non-asymptotic regimes. We obtain them by applying Le Cam's method to two Markov chains that differ by an appropriately chosen sparse perturbation; the discrepancy between the chains is estimated with the help of perturbation theory. Our techniques may be of independent interest.
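For reference, here is a minimal sketch of the quantities and the lower-bound technique involved (standard definitions and a generic form of Le Cam's two-point argument; the paper's exact statements may differ). For a distribution $p$ over an alphabet of size $K$, the Renyi entropy of order $\alpha$ is
$$H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_{x} p(x)^{\alpha},$$
with collision entropy $H_2(p) = -\log \sum_x p(x)^2$ at $\alpha = 2$ and min-entropy $H_\infty(p) = -\log \max_x p(x)$ obtained as the limit $\alpha \to \infty$. Le Cam's two-point method converts indistinguishability into a sample-complexity bound: if two chains $P$ and $Q$ satisfy $|H_\alpha(P) - H_\alpha(Q)| \ge 2\delta$ while the total variation distance between their length-$n$ trajectory distributions is at most $1/3$, then the two error probabilities of any estimator sum to at least $1 - 1/3 = 2/3$, so for every estimator there is at least one of the two chains under which it errs by at least $\delta$ with probability at least $1/3$; hence $n$ samples do not suffice for additive error $\delta$.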
Metadata
- Category
- Foundations
- Publication info
- Published elsewhere. Minor revision. ISIT 2020
- Keywords
- information theory, Renyi entropy, min-entropy, lower bounds
- Contact author(s)
- obremski math @ gmail com
- History
- 2020-05-21: revised
- 2019-07-02: received
- Short URL
- https://ia.cr/2019/766
- License
- CC BY
BibTeX
@misc{cryptoeprint:2019/766,
  author = {Maciej Obremski and Maciej Skorski},
  title = {Complexity of Estimating Renyi Entropy of Markov Chains},
  howpublished = {Cryptology {ePrint} Archive, Paper 2019/766},
  year = {2019},
  url = {https://eprint.iacr.org/2019/766}
}