Paper 2019/766
Complexity of Estimating Renyi Entropy of Markov Chains
Maciej Obremski and Maciej Skorski
Abstract
Estimating the entropy of random processes is one of the fundamental problems of
machine learning and property testing. It has numerous applications, ranging from
DNA testing and predicting human behaviour to modeling neural activity and
cryptography. We investigate the problem of Renyi entropy estimation
for sources that form Markov chains.
Kamath and Verdú (ISIT’16) showed that good mixing properties are essential for this task. We show that even with very good mixing time, estimation of min-entropy requires […]
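For context, these are the standard definitions of the quantities involved (not taken from the paper itself): for a distribution $p$ over a finite alphabet, the Renyi entropy of order $\alpha$ and the min-entropy are

$$H_\alpha(p) = \frac{1}{1-\alpha} \log \sum_{x} p(x)^\alpha \quad (\alpha > 0,\ \alpha \neq 1), \qquad H_\infty(p) = -\log \max_{x} p(x),$$

where $H_\infty$ is the limit of $H_\alpha$ as $\alpha \to \infty$. The paper asks how many samples from a Markov chain are needed to estimate such quantities.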
Metadata
- Available format(s): PDF
- Category: Foundations
- Publication info: Published elsewhere. Minor revision. ISIT 2020
- Keywords: information theory, Renyi entropy, min-entropy, lower bounds
- Contact author(s): obremski math @ gmail com
- History: 2020-05-21: revised; 2019-07-02: received
- Short URL: https://ia.cr/2019/766
- License: CC BY
BibTeX
@misc{cryptoeprint:2019/766,
  author = {Maciej Obremski and Maciej Skorski},
  title = {Complexity of Estimating Renyi Entropy of Markov Chains},
  howpublished = {Cryptology {ePrint} Archive, Paper 2019/766},
  year = {2019},
  url = {https://eprint.iacr.org/2019/766}
}