In this paper, we develop a set of tools for estimating entropy, built on mechanisms that attempt to predict the next sample in a sequence from all previous samples. These mechanisms are called predictors. We develop a framework for using predictors to estimate entropy and test them experimentally against both simulated and real noise sources. For comparison, we subject the entropy estimators defined in the August 2012 draft of NIST Special Publication 800-90B to the same tests and compare their performance.
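As a rough illustration of the general idea (not the estimators defined in the paper itself), the sketch below implements a single hypothetical predictor: it always guesses the most common value seen so far, tracks how often that guess is correct, and converts an upper confidence bound on the prediction accuracy into a per-sample min-entropy estimate via -log2(p). The function name `mcv_predictor_estimate`, the `bits_per_sample` parameter, and the 99% normal-approximation confidence bound are assumptions made for this sketch only.

```python
import math
from collections import Counter


def mcv_predictor_estimate(samples, bits_per_sample):
    """Sketch of a most-common-value predictor (illustrative only, not the
    paper's estimators): before each sample arrives, predict the value seen
    most often so far, then score whether the guess was correct."""
    counts = Counter()
    correct = 0
    prediction = None
    for s in samples:
        # Score the prediction made before this sample was revealed.
        if prediction is not None and prediction == s:
            correct += 1
        # Update the model and form the next prediction.
        counts[s] += 1
        prediction = counts.most_common(1)[0][0]

    n = len(samples) - 1  # the first sample has no prediction to score
    if n <= 0:
        return float(bits_per_sample)

    # Upper 99% confidence bound on the predictor's accuracy (assumed
    # normal approximation for this sketch).
    p_hat = correct / n
    p_upper = min(1.0, p_hat + 2.576 * math.sqrt(p_hat * (1.0 - p_hat) / n))

    # A predictor that guesses correctly with probability p implies at most
    # -log2(p) bits of min-entropy per sample; cap at the sample width.
    p_upper = max(p_upper, 2.0 ** -bits_per_sample)
    return min(float(bits_per_sample), -math.log2(p_upper))
```

In a full framework along these lines, several such predictors would be run over the same data and the smallest (most pessimistic) per-sample estimate would be reported, since any predictor that does well exposes structure an attacker could exploit.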
Category / Keywords: Entropy estimation, Min-entropy, Random number generation
Original Publication (in the same form): IACR-CHES-2015
Date: received 16 Jun 2015
Contact author: meltemsturan at gmail com
Version: 20150621:165408
Short URL: ia.cr/2015/600