Near-Optimal Time-Sparsity Trade-Offs for Solving Noisy Linear Equations

Abstract

We present a polynomial-time reduction from solving noisy linear equations over $\mathbb{Z}/q\mathbb{Z}$ in dimension $\Theta(k\log n/\mathsf{poly}(\log k,\log q,\log\log n))$ with a uniformly random coefficient matrix to noisy linear equations over $\mathbb{Z}/q\mathbb{Z}$ in dimension $n$ where each row of the coefficient matrix has uniformly random support of size $k$. This allows us to deduce the hardness of sparse problems from their dense counterparts. In particular, we derive hardness results in the following canonical settings. 1) Assuming the $\ell$-dimensional (dense) LWE over a polynomial-size field takes time $2^{\Omega(\ell)}$, $k$-sparse LWE in dimension $n$ takes time $n^{\Omega(k/(\log k \cdot (\log k + \log\log n)))}$. 2) Assuming the $\ell$-dimensional (dense) LPN over $\mathbb{F}_2$ takes time $2^{\Omega(\ell/\log \ell)}$, $k$-sparse LPN in dimension $n$ takes time $n^{\Omega(k/(\log k \cdot (\log k + \log\log n)^2))}$. These running-time lower bounds are nearly tight, as both sparse problems can be solved in time $n^{O(k)}$ given sufficiently many samples. We further give a reduction from $k$-sparse LWE to noisy tensor completion. Concretely, composing the two reductions implies that order-$k$ rank-$2^{k-1}$ noisy tensor completion in $\mathbb{R}^{n^{\otimes k}}$ takes time $n^{\Omega(k/(\log k \cdot (\log k + \log\log n)))}$, assuming the exponential hardness of standard worst-case lattice problems.
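To make the sparse sample model concrete, here is a minimal sketch of drawing $k$-sparse noisy linear equations over $\mathbb{Z}/q\mathbb{Z}$: each row has a uniformly random support of size $k$, as in the abstract. The function name, the nonzero-coefficient convention, and the simple Bernoulli-style noise are illustrative assumptions, not the paper's exact error distribution.

```python
import random

def sparse_lwe_sample(n, k, q, secret, noise_rate=0.05):
    """Draw one k-sparse noisy linear equation over Z/qZ.

    The support of the coefficient row is a uniformly random size-k
    subset of [n]. Coefficients are drawn nonzero so the support has
    size exactly k (an illustrative choice); the noise model here is
    a placeholder, not the paper's error distribution.
    """
    support = random.sample(range(n), k)                 # uniform size-k support
    coeffs = {i: random.randrange(1, q) for i in support}
    b = sum(c * secret[i] for i, c in coeffs.items()) % q
    if random.random() < noise_rate:                     # corrupt a fraction of samples
        b = (b + random.randrange(1, q)) % q
    return coeffs, b

# Usage: draw m samples for a random secret.
n, k, q, m = 100, 5, 97, 10
secret = [random.randrange(q) for _ in range(n)]
samples = [sparse_lwe_sample(n, k, q, secret) for _ in range(m)]
```

For $q = 2$ the same sketch yields $k$-sparse LPN samples; the dense problems referenced in the abstract correspond to rows supported on all $n$ coordinates.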
