
Paper 2019/766

Complexity of Estimating Rényi Entropy of Markov Chains

Maciej Obremski and Maciej Skorski

Abstract

Estimating the entropy of random processes is one of the fundamental problems of machine learning and property testing. It has numerous applications, ranging from DNA testing and the predictability of human behaviour to modeling neural activity and cryptography. We investigate the problem of Rényi entropy estimation for sources that form Markov chains. Kamath and Verdú (ISIT'16) showed that good mixing properties are essential for this task. We show that even with very good mixing time, estimating the min-entropy requires $\Omega(K^2)$ samples, while estimating the collision entropy requires $\Omega(K^{3/2})$ samples, where $K$ is the size of the alphabet. Our results hold in both the asymptotic and non-asymptotic regimes. We obtain them by applying Le Cam's method to two Markov chains that differ by an appropriately chosen sparse perturbation; the discrepancy between these chains is estimated with the help of perturbation theory. Our techniques might be of independent interest.
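For context, here are the standard definitions behind these quantities (generic notation, which may differ from the paper's). For a distribution $p$ over a finite alphabet, the Rényi entropy of order $\alpha$ is
\[
  H_\alpha(p) \;=\; \frac{1}{1-\alpha} \log \sum_{x} p(x)^\alpha , \qquad \alpha > 0,\ \alpha \neq 1 .
\]
The collision entropy is the case $\alpha = 2$,
\[
  H_2(p) \;=\; -\log \sum_{x} p(x)^2 ,
\]
and the min-entropy is the limit $\alpha \to \infty$,
\[
  H_\infty(p) \;=\; -\log \max_{x} p(x) .
\]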

Metadata
Available format(s)
PDF
Category
Foundations
Publication info
Preprint. MINOR revision.
Keywords
information theory, Rényi entropy, min-entropy, lower bounds
Contact author(s)
obremski math @ gmail com
History
2020-05-21: revised
2019-07-02: received
Short URL
https://ia.cr/2019/766
License
Creative Commons Attribution
CC BY