**Key Derivation Without Entropy Waste**

*Yevgeniy Dodis, Krzysztof Pietrzak, and Daniel Wichs*

**Abstract:** We revisit the classical problem of converting an imperfect source of randomness into a usable cryptographic key. Assume that we have some cryptographic application $P$ that expects a uniformly random $m$-bit key $R$ and ensures that the best attack (in some complexity class) against $P(R)$ has success probability at most $\delta$. Our goal is to design a key-derivation function (KDF) $h$ that converts any random source $X$ of min-entropy $k$ into a sufficiently "good" key $h(X)$, guaranteeing that $P(h(X))$ has comparable security $\delta'$, which is "close" to $\delta$.

Seeded randomness extractors provide a generic way to solve this problem for \emph{all} applications $P$, with resulting security $\delta' = O(\delta)$, provided that we start with entropy $k \ge m + 2\log(1/\delta) - O(1)$. By a result of Radhakrishnan and Ta-Shma, this bound on $k$ (called the "RT-bound") is also known to be tight in general. Unfortunately, in many situations the loss of $2\log(1/\delta)$ bits of entropy is unacceptable. This motivates the study of KDFs with less entropy waste, obtained by placing some restrictions on the source $X$ or the application $P$.
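To make the RT-bound concrete, the following minimal sketch (not from the paper; the function name is illustrative) computes the min-entropy a generic seeded extractor demands for a target key length and security level, showing how quickly the $2\log(1/\delta)$ term dominates:

```python
import math

def rt_entropy_requirement(m_bits, delta):
    """Min-entropy k that a generic seeded extractor needs, per the
    RT-bound (ignoring the O(1) term): k >= m + 2*log2(1/delta)."""
    return m_bits + 2 * math.log2(1 / delta)

# Deriving a 128-bit key with target security delta = 2^-64 already
# wastes 2*64 = 128 extra bits of entropy -- as much as the key itself.
print(rt_entropy_requirement(128, 2 ** -64))  # 256.0
```

For high-security parameters (say $\delta = 2^{-80}$), the waste grows to 160 bits, which is exactly the loss the restricted-application KDFs below aim to avoid.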

In this work we obtain the following new positive and negative results in this regard:

- Efficient samplability of the source $X$ does not help beat the RT-bound for general applications. This resolves the SRT (samplable RT) conjecture of Dachman-Soled et al.~\cite{DGKM12} in the affirmative, and also shows that the existence of computationally-secure extractors beating the RT-bound implies the existence of one-way functions.

- We continue in the line of work initiated by Barak et al. \cite{BDK+11} and construct new information-theoretic KDFs which beat the RT-bound for large but restricted classes of applications. Specifically, we design efficient KDFs that work for all unpredictability applications $P$ (e.g., signatures, MACs, one-way functions, etc.) and can either: (1) extract \emph{all} of the entropy $k = m$ with a very modest security loss $\delta'=O(\delta\cdot \log (1/\delta))$, or alternatively, (2) achieve essentially optimal security $\delta' = O(\delta)$ with a very modest entropy loss $k \ge m+\log\log(1/\delta)$. In comparison, the best prior results from \cite{BDK+11} for this class of applications would only guarantee $\delta'=O(\sqrt{\delta})$ when $k=m$, and would need $k\ge m+\log(1/\delta)$ to get $\delta'=O(\delta)$.

- The weaker bounds of \cite{BDK+11} hold for a larger class of so-called "square-friendly" applications (which includes all unpredictability applications, as well as some important indistinguishability applications). Unfortunately, we show that these weaker bounds are tight for this larger class of applications.

- We abstract out a clean, information-theoretic notion of $(k,\delta,\delta')$-unpredictability extractors, which guarantee ``induced'' security $\delta'$ for any $\delta$-secure unpredictability application $P$, and characterize the parameters achievable for such unpredictability extractors. Of independent interest, we also relate this notion to the previously-known notion of (min-entropy) condensers, and improve the state-of-the-art parameters for such condensers.
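The entropy-loss gaps quoted in the bullets above can be made concrete with a small arithmetic sketch (not from the paper; the scheme labels and function name are illustrative), comparing the extra entropy beyond $m$ that each bound requires to reach $\delta' = O(\delta)$:

```python
import math

def entropy_loss(delta, scheme):
    """Extra min-entropy beyond the key length m (ignoring O(1) terms):
    'rt'        -- generic seeded extractors: 2*log2(1/delta)
    'bdk'       -- Barak et al. [BDK+11] square-friendly bound: log2(1/delta)
    'this_work' -- the new KDFs for unpredictability apps: log2(log2(1/delta))
    """
    log_inv = math.log2(1 / delta)
    if scheme == "rt":
        return 2 * log_inv
    if scheme == "bdk":
        return log_inv
    if scheme == "this_work":
        return math.log2(log_inv)
    raise ValueError(f"unknown scheme: {scheme}")

delta = 2 ** -80
print(entropy_loss(delta, "rt"))         # 160.0 bits wasted
print(entropy_loss(delta, "bdk"))        # 80.0 bits wasted
print(entropy_loss(delta, "this_work"))  # ~6.32 bits wasted
```

At $\delta = 2^{-80}$, the $\log\log(1/\delta)$ loss of the new construction is roughly 6 bits, versus 80 and 160 bits for the earlier bounds, which is what "without entropy waste" refers to.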

**Category / Keywords:** foundations / Information Theory, Extractors, Condensers, Key Derivation

**Date:** received 29 Oct 2013

**Contact author:** wichs at ccs neu edu

**Available format(s):** PDF | BibTeX Citation

**Version:** 20131103:171951


[ Cryptology ePrint archive ]