Paper 2019/264

Unifying computational entropies via Kullback-Leibler divergence

Rohit Agrawal, Yi-Hsiu Chen, Thibaut Horel, and Salil Vadhan

Abstract

We introduce hardness in relative entropy, a new notion of hardness for search problems which on the one hand is satisfied by all one-way functions and on the other hand implies both next-block pseudoentropy and inaccessible entropy, two forms of computational entropy used in recent constructions of pseudorandom generators and statistically hiding commitment schemes, respectively. Thus, hardness in relative entropy unifies the latter two notions of computational entropy and sheds light on the apparent "duality" between them. Additionally, it yields a more modular and illuminating proof that one-way functions imply next-block inaccessible entropy, similar in structure to the proof that one-way functions imply next-block pseudoentropy (Vadhan and Zheng, STOC '12).
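For reference, the Kullback-Leibler divergence (relative entropy) named in the title is the standard information-theoretic quantity, given here in its discrete form as background only; the paper's "hardness in relative entropy" is a computational refinement whose precise definition appears in the full text:

\[
\mathrm{KL}(P \,\|\, Q) \;=\; \sum_{x} P(x) \log \frac{P(x)}{Q(x)}.
\]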

Metadata
Available format(s)
PDF
Category
Foundations
Publication info
A minor revision of an IACR publication in CRYPTO 2019
DOI
10.1007/978-3-030-26951-7_28
Keywords
one-way functions, pseudo-randomness, bit commitment, information theory, pseudoentropy, inaccessible entropy
Contact author(s)
thorel @ seas harvard edu
History
2019-08-20: revised
2019-03-06: received
Short URL
https://ia.cr/2019/264
License
Creative Commons Attribution
CC BY

BibTeX

@misc{cryptoeprint:2019/264,
      author = {Rohit Agrawal and Yi-Hsiu Chen and Thibaut Horel and Salil Vadhan},
      title = {Unifying computational entropies via Kullback-Leibler divergence},
      howpublished = {Cryptology {ePrint} Archive, Paper 2019/264},
      year = {2019},
      doi = {10.1007/978-3-030-26951-7_28},
      url = {https://eprint.iacr.org/2019/264}
}