Paper 2014/243
Key Derivation From Noisy Sources With More Errors Than Entropy
Ran Canetti and Benjamin Fuller and Omer Paneth and Leonid Reyzin
Abstract
Fuzzy extractors convert a noisy source of entropy into a consistent uniformly-distributed key. In the process of eliminating noise, they lose some of the entropy of the original source---in the worst case, as much as the logarithm of the number of correctable error patterns. We call what is left after this worst-case loss the minimum usable entropy. Unfortunately, this quantity is negative for some sources that are important in practice. Most known approaches for building fuzzy extractors work in the worst case and cannot be used when the minimum usable entropy is negative. We construct the first fuzzy extractors that work for a large class of distributions that have negative minimum usable entropy. Their security is computational. They correct Hamming errors over a large alphabet. In order to avoid the worst-case loss, they necessarily restrict distributions for which they work. Our first construction requires high individual entropy of a constant fraction of symbols, but permits symbols to be dependent. Our second construction requires a constant fraction of symbols to have a constant amount of entropy conditioned on prior symbols. The constructions can be implemented efficiently based on number-theoretic assumptions or assumptions on cryptographic hash functions.
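The worst-case loss described above can be made concrete for binary Hamming errors: the number of correctable error patterns within distance t of an n-bit string is the volume of the Hamming ball, so a source with min-entropy m has minimum usable entropy m minus the base-2 logarithm of that volume. The following Python sketch (the parameter values are illustrative, not taken from the paper) shows how this quantity can be negative even for a source with substantial entropy:

```python
from math import comb, log2

def log_ball_volume(n: int, t: int) -> float:
    """log2 of the number of n-bit error patterns of Hamming weight <= t,
    i.e., the log-volume of the radius-t Hamming ball."""
    return log2(sum(comb(n, i) for i in range(t + 1)))

def min_usable_entropy(m: float, n: int, t: int) -> float:
    """Worst-case entropy remaining after correcting up to t bit errors
    in an n-bit reading of a source with min-entropy m."""
    return m - log_ball_volume(n, t)

# Illustrative parameters: 1000-bit readings, up to 100 bit errors.
# The ball volume is about 2^465, so 400 bits of min-entropy is not
# enough for worst-case key derivation, while 600 bits is.
print(min_usable_entropy(400, 1000, 100))  # negative
print(min_usable_entropy(600, 1000, 100))  # positive
```

Constructions that work in the worst case must pay this log-volume penalty; the paper's point is that restricting the class of source distributions lets one avoid it.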
Note: In submission.
Metadata
- Category: Foundations
- Publication info: Preprint. MINOR revision.
- Keywords: fuzzy extractors, key derivation, error-correcting codes, computational entropy, point obfuscation
- Contact author(s): bfuller @ cs bu edu
- History: 2020-08-26: last of 5 revisions; 2014-04-18: received
- Short URL: https://ia.cr/2014/243
- License: CC BY