**Collision Resistant Hashing from Learning Parity with Noise**

*Yu Yu and Jiang Zhang and Jian Weng and Chun Guo and Xiangxue Li*

**Abstract: **The Learning Parity with Noise (LPN) problem has recently found many cryptographic applications, such as authentication protocols and pseudorandom generators/functions, and even asymmetric tasks, including public-key encryption (PKE) schemes and oblivious transfer (OT) protocols. It has, however, remained a long-standing open problem whether LPN implies collision resistant hash (CRH) functions. Based on the recent work of Applebaum et al. (ITCS 2017), we introduce a general framework for constructing CRH from LPN for various parameter choices. We show that, to mention just a few notable ones, under any of the following hardness assumptions (for the two most common variants of LPN):

1) constant-noise LPN is $2^{n^{0.5+\epsilon}}$-hard for any constant $\epsilon>0$;

2) constant-noise LPN is $2^{\Omega(n/\log n)}$-hard given $q=poly(n)$ samples;

3) low-noise LPN (of noise rate $1/\sqrt{n}$) is $2^{\Omega(\sqrt{n}/\log n)}$-hard given $q=poly(n)$ samples;

there exist CRH functions with constant (or even poly-logarithmic) shrinkage, which can be implemented using polynomial-size depth-3 circuits with NOT, (unbounded fan-in) AND and XOR gates. Our technical route, LPN → bSVP → CRH, is reminiscent of the known reductions for the large-modulus analogue, i.e., LWE → SIS → CRH, where the binary Shortest Vector Problem (bSVP), recently introduced by Applebaum et al. (ITCS 2017), enables CRH in a manner similar to Ajtai's CRH functions based on the Short Integer Solution (SIS) problem.

Furthermore, under certain additional (arguably minimal) idealized assumptions, such as small-domain random functions or a block cipher (keyed by a public random string) that behaves like a random permutation, we obtain more efficient and polynomially shrinking CRH functions from $2^{n^{0.5+\epsilon}}$-hard constant-noise LPN or $2^{n^{0.25+\epsilon}}$-hard low-noise LPN. In particular, the construction of hash functions follows a conceptually simple approach: it divides its input into many equal-length blocks, evaluates random functions (or block ciphers) on them independently and in parallel, and then produces their XOR sum as output.
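The blockwise divide-evaluate-XOR idea can be sketched in a few lines. This is a toy illustration only, not the paper's construction: the independent random functions $f_i$ are modeled here by SHA-256 salted with the (public) block index, and all parameters (`block_size`, `out_len`) are hypothetical choices for demonstration.

```python
import hashlib

def xor_hash(data: bytes, block_size: int = 8, out_len: int = 32) -> bytes:
    """Toy sketch of the blockwise XOR construction: split the input into
    equal-length blocks, apply an independent 'random function' to each
    block (modeled by SHA-256 keyed with the public block index), and
    XOR all the results together into a single short digest."""
    assert len(data) % block_size == 0, "input must split into equal-length blocks"
    acc = bytes(out_len)  # running XOR accumulator, initially all zeros
    for i in range(0, len(data), block_size):
        block = data[i:i + block_size]
        # f_i modeled as SHA-256 salted with the block index (a public string);
        # in the paper this role is played by a random function or block cipher
        digest = hashlib.sha256(i.to_bytes(4, "big") + block).digest()[:out_len]
        acc = bytes(a ^ b for a, b in zip(acc, digest))
    return acc
```

Note that each block is processed independently, so the per-block evaluations can run fully in parallel, matching the parallelism the abstract highlights.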

**Category / Keywords: **foundations / Learning Parity with Noise, Collision Resistant Hashing, binary Shortest Vector Problem

**Date: **received 27 Dec 2017, last revised 21 Feb 2018

**Contact author: **yuyuathk at gmail com

**Available format(s): **PDF

**Version: **20180221:115310

**Short URL: **ia.cr/2017/1260


[ Cryptology ePrint archive ]