Online Lewis Weight Sampling

The seminal work of Cohen and Peng introduced Lewis weight sampling to the theoretical computer science community, yielding fast row sampling algorithms for approximating $d$-dimensional subspaces of $\ell_p$ up to $(1+\epsilon)$ error. Several works have extended this important primitive to other settings, including the online coreset and sliding window models. However, these results are only for $p\in\{1,2\}$, and results for $p=1$ require a suboptimal $\tilde O(d^2)$ samples. In this work, we design the first nearly optimal $\ell_p$ subspace embeddings for all $p\in(0,\infty)$ in the online coreset and sliding window models. In both models, our algorithms store $\tilde O(d^{1\vee(p/2)})$ rows. This answers a substantial generalization of the main open question of [BDMMUWZ2020], and gives the first results for all $p\neq 2$.

Towards our result, we give the first analysis of "one-shot" Lewis weight sampling, in which rows are sampled proportionally to their Lewis weights, achieving sample complexity $\tilde O(d^{p/2}/\epsilon^2)$ for $p>2$. Previously, this scheme was only known to have sample complexity $\tilde O(d^{p/2}/\epsilon^5)$, whereas $\tilde O(d^{p/2}/\epsilon^2)$ is known if a more sophisticated recursive sampling scheme is used. The recursive sampling cannot be implemented online, thus necessitating an analysis of one-shot Lewis weight sampling. Our analysis uses a novel connection to online numerical linear algebra.

As an application, we obtain the first one-pass streaming coreset algorithms for $(1+\epsilon)$ approximation of important generalized linear models, such as logistic regression and $p$-probit regression. Our upper bounds are parameterized by a complexity parameter $\mu$ introduced by [MSSW2018], and we show the first lower bounds establishing that a linear dependence on $\mu$ is necessary.
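To make the sampling scheme concrete, the following is a minimal NumPy sketch of one-shot Lewis weight sampling, not code from the paper. It approximates Lewis weights with the standard fixed-point iteration $w_i \leftarrow \big(a_i^\top (A^\top W^{1-2/p} A)^{-1} a_i\big)^{p/2}$ (known to converge geometrically for $p < 4$; larger $p$ needs different machinery), then draws rows i.i.d. proportionally to the weights with the usual $\ell_p$ rescaling. Function names, iteration count, and tolerances are illustrative choices, not from the source.

```python
import numpy as np

def lewis_weights(A, p, iters=100):
    """Approximate the l_p Lewis weights of A's rows via the fixed-point
    iteration w_i <- (a_i^T (A^T W^{1-2/p} A)^{-1} a_i)^{p/2},
    where W = diag(w). Assumes 0 < p < 4 and A has full column rank."""
    n, d = A.shape
    w = np.ones(n)
    for _ in range(iters):
        rw = w ** (1.0 - 2.0 / p)           # diagonal entries of W^{1-2/p}
        M = A.T @ (A * rw[:, None])         # M = A^T W^{1-2/p} A
        Minv = np.linalg.inv(M)
        # quadratic forms a_i^T M^{-1} a_i for every row i at once
        tau = np.einsum('ij,jk,ik->i', A, Minv, A)
        w = tau ** (p / 2.0)
    return w

def one_shot_sample(A, p, m, seed=None):
    """One-shot Lewis weight sampling: draw m rows i.i.d. with probabilities
    proportional to their Lewis weights, rescaling each sampled row so that
    ||S A x||_p^p is an unbiased estimator of ||A x||_p^p."""
    rng = np.random.default_rng(seed)
    w = lewis_weights(A, p)
    prob = w / w.sum()
    idx = rng.choice(len(A), size=m, p=prob)
    scale = (1.0 / (m * prob[idx])) ** (1.0 / p)
    return A[idx] * scale[:, None], idx
```

A useful sanity check on the iteration is that at the fixed point the Lewis weights sum to exactly $d$, since $\sum_i w_i = \mathrm{tr}\big(M^{-1} A^\top W^{1-2/p} A\big) = \mathrm{tr}(I_d)$; for $p = 2$ the iteration returns the ordinary leverage scores in one step.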