Paper 2022/437
Continuous LWE is as Hard as LWE & Applications to Learning Gaussian Mixtures
Abstract
We show direct and conceptually simple reductions between the classical learning with errors (LWE) problem and its continuous analog, CLWE (Bruna, Regev, Song and Tang, STOC 2021). This allows us to bring to bear the powerful machinery of LWE-based cryptography to the applications of CLWE. For example, we obtain the hardness of CLWE under the classical worst-case hardness of the gap shortest vector problem. Previously, this was known only under quantum worst-case hardness of lattice problems. More broadly, with our reductions between the two problems, any future developments in LWE will also apply to CLWE and its downstream applications. As a concrete application, we show an improved hardness result for density estimation for mixtures of Gaussians. In this computational problem, given sample access to a mixture of Gaussians, the goal is to output a function that estimates the density function of the mixture. Under the (plausible and widely believed) exponential hardness of the classical LWE problem, we show that Gaussian mixture density estimation in $\mathbb{R}^n$ with roughly $\log n$ Gaussian components given $\mathsf{poly}(n)$ samples requires time quasi-polynomial in $n$. Under the (conservative) polynomial hardness of LWE, we show hardness of density estimation for $n^{\epsilon}$ Gaussians for any constant $\epsilon > 0$, which improves on Bruna, Regev, Song and Tang (STOC 2021), who show hardness for at least $\sqrt{n}$ Gaussians under polynomial (quantum) hardness assumptions. Our key technical tool is a reduction from classical LWE to LWE with $k$-sparse secrets, where the multiplicative increase in the noise is only $O(\sqrt{k})$, independent of the ambient dimension $n$.
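To make the two sample distributions the abstract relates concrete, here is a minimal, hedged sketch (not the paper's reduction) of what discrete LWE samples and continuous CLWE samples look like; the parameters `n`, `q`, `gamma`, and `beta` are illustrative toy choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, q, m = 16, 257, 8      # dimension, modulus, number of samples (toy values)
gamma, beta = 2.0, 0.01   # CLWE scaling and noise width (illustrative choices)

# --- Discrete LWE samples: (a, b = <a, s> + e mod q) ---
s = rng.integers(0, q, size=n)                        # uniform secret in Z_q^n
A = rng.integers(0, q, size=(m, n))                   # uniform public vectors
e = np.rint(rng.normal(0, 1.0, size=m)).astype(int)   # small rounded-Gaussian noise
b = (A @ s + e) % q

# --- Continuous LWE samples: (y, z = gamma*<y, w> + e' mod 1) ---
w = rng.normal(size=n)
w /= np.linalg.norm(w)                                # secret unit direction
Y = rng.normal(size=(m, n))                           # standard Gaussian samples
z = (gamma * (Y @ w) + rng.normal(0, beta, size=m)) % 1.0
```

In both cases the sample marginals look featureless on their own (uniform `b` mod `q`, near-uniform `z` mod 1), and the hidden structure is the secret `s` or the direction `w`; the paper's reductions translate hardness between these two forms.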
Note: Fixed bugs in Lemma 9 and Section 6.
Metadata
- Category
- Foundations
- Publication info
- Published elsewhere. arXiv
- Contact author(s)
- agupte @ mit edu
- nvafa @ mit edu
- vinodv @ mit edu
- History
- 2022-11-02: last of 2 revisions
- 2022-04-06: received
- Short URL
- https://ia.cr/2022/437
- License
- CC BY
BibTeX
@misc{cryptoeprint:2022/437,
  author = {Aparna Gupte and Neekon Vafa and Vinod Vaikuntanathan},
  title = {Continuous {LWE} is as Hard as {LWE} \& Applications to Learning Gaussian Mixtures},
  howpublished = {Cryptology {ePrint} Archive, Paper 2022/437},
  year = {2022},
  url = {https://eprint.iacr.org/2022/437}
}