List-Decodable Subspace Recovery: Dimension Independent Error in Polynomial Time
In list-decodable subspace recovery, the input is a collection of $n$ points, $\alpha n$ of which (for some $\alpha \ll 1/2$) are drawn i.i.d. from a distribution $\mathcal{D}$ with an isotropic rank-$r$ covariance $\Pi_*$ (the \emph{inliers}); the rest are arbitrary, potentially adversarial outliers. The goal is to recover an $O(1/\alpha)$-size list of candidate covariances that contains a $\hat{\Pi}$ close to $\Pi_*$. Two recent independent works (Raghavendra-Yau, Bakshi-Kothari (2020)) gave algorithms for this problem that work whenever $\mathcal{D}$ satisfies certifiable anti-concentration. The running time of both these algorithms, however, is $d^{\Omega(1/\alpha^4)}$, and their error bounds on $\|\hat{\Pi} - \Pi_*\|_F$ grow with the rank $r$ (which can be $\Omega(d)$).

In this work, we improve on these results on all three fronts: we obtain \emph{dimension-independent} error, in faster (fixed-polynomial) running time, under less restrictive distributional assumptions. Specifically, we give a $\mathrm{poly}(1/\alpha)\, d^{O(1)}$-time algorithm that outputs a list containing a $\hat{\Pi}$ satisfying $\|\hat{\Pi} - \Pi_*\|_F \leq O(1/\alpha)$. Our result only needs $\mathcal{D}$ to have \emph{certifiably hypercontractive} degree-2 polynomials, a condition satisfied by a much broader family of distributions than certifiable anti-concentration. As a result, in addition to Gaussians, our algorithm applies to the uniform distribution on the hypercube and $q$-ary cubes, and to arbitrary product distributions with subgaussian marginals. Prior work (Raghavendra and Yau, 2020) had identified such distributions as potential hard examples, as they do not exhibit strong enough anti-concentration. When $\mathcal{D}$ does satisfy certifiable anti-concentration, we obtain a stronger error guarantee of $\|\hat{\Pi} - \Pi_*\|_F \leq \eta$ for any $\eta > 0$, in $d^{O(\mathrm{poly}(1/\alpha) + \log(1/\eta))}$ time.
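For concreteness, here is a minimal sketch of the problem setup in the notation reconstructed above (the exact normalization of $\Pi_*$ is an illustrative assumption):
\[
\{x_1, \ldots, x_n\} \subseteq \mathbb{R}^d, \qquad \exists\, I \subseteq [n],\ |I| = \alpha n,\ \{x_i\}_{i \in I} \overset{\text{i.i.d.}}{\sim} \mathcal{D}, \qquad \mathbb{E}_{x \sim \mathcal{D}}\!\left[x x^\top\right] = \Pi_*,
\]
with $\Pi_*$ proportional to a rank-$r$ projection matrix; the algorithm must output a list $L$ with $|L| \leq O(1/\alpha)$ and $\min_{\hat{\Pi} \in L} \|\hat{\Pi} - \Pi_*\|_F$ small. Certifiable hypercontractivity of degree-2 polynomials is commonly formalized (up to normalization conventions, which vary across the sum-of-squares literature) by requiring that for every even $t$ there is a degree-$O(t)$ sum-of-squares proof, in the coefficients of a degree-2 polynomial $p$, of
\[
\mathbb{E}_{x \sim \mathcal{D}}\!\left[p(x)^{2t}\right] \leq (Ct)^{2t} \left(\mathbb{E}_{x \sim \mathcal{D}}\!\left[p(x)^{2}\right]\right)^{t}.
\]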