The Complexity of Dynamic Least-Squares Regression

We settle the complexity of dynamic least-squares regression (LSR), where rows and labels $(\mathbf{A}^{(t)}, \mathbf{b}^{(t)})$ can be adaptively inserted and/or deleted, and the goal is to efficiently maintain an $\epsilon$-approximate solution to $\min_{\mathbf{x}} \|\mathbf{A}^{(t)} \mathbf{x} - \mathbf{b}^{(t)}\|_2$ for all $t \in [T]$. We prove sharp separations ($d^{2-o(1)}$ vs. $\widetilde{O}(d)$) between the amortized update time of: (i) fully vs. partially dynamic $0.01$-LSR; (ii) high- vs. low-accuracy LSR in the partially dynamic (insertion-only) setting.

Our lower bounds follow from a gap-amplification reduction -- reminiscent of iterative refinement -- from the exact version of the Online Matrix-Vector Conjecture (OMv) [HKNS15] to constant-approximate OMv over the reals, where the $i$-th online product $\mathbf{H}\mathbf{v}^{(i)}$ only needs to be computed to $0.1$-relative error. All previous fine-grained reductions from OMv to its approximate versions show hardness only for inverse-polynomial approximation $\epsilon = 1/\operatorname{poly}(n)$ (additive or multiplicative). This result is of independent interest in fine-grained complexity and for the investigation of the OMv Conjecture, which remains widely open.
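To make the dynamic problem concrete, here is a minimal Python sketch of the insertion-only setting: it maintains the exact (ridge-regularized) minimizer under row insertions via Sherman-Morrison updates to the inverse Gram matrix. This is the textbook $O(d^2)$-per-update baseline, not the paper's algorithm -- the paper's upper bound achieves amortized $\widetilde{O}(d)$ updates for constant accuracy; the class name `InsertOnlyLSR` and the small ridge term `lam` (which keeps the inverse well-defined before $d$ rows arrive) are illustrative choices.

```python
import numpy as np

class InsertOnlyLSR:
    """Illustrative insertion-only least-squares maintainer (hypothetical
    name, not from the paper). Keeps M = (lam*I + A^T A)^{-1} and v = A^T b
    under row insertions via Sherman-Morrison, so each update costs O(d^2)
    and the exact ridge-regularized minimizer is x = M v. The paper's
    constant-accuracy upper bound is much faster; this sketch only
    illustrates the problem being maintained."""

    def __init__(self, d, lam=1e-6):
        self.M = np.eye(d) / lam      # inverse of the initial Gram matrix lam*I
        self.v = np.zeros(d)          # running A^T b

    def insert(self, a, beta):
        """Insert a new row a (shape (d,)) with label beta."""
        Ma = self.M @ a
        self.M -= np.outer(Ma, Ma) / (1.0 + a @ Ma)   # Sherman-Morrison update
        self.v += beta * a

    def solve(self):
        """Return the current minimizer of ||A x - b||^2 + lam*||x||^2."""
        return self.M @ self.v

# Usage: stream the rows of a random consistent system and recover x_true.
rng = np.random.default_rng(0)
d = 5
A = rng.standard_normal((200, d))
x_true = rng.standard_normal(d)
b = A @ x_true
lsr = InsertOnlyLSR(d)
for a, beta in zip(A, b):
    lsr.insert(a, beta)
print(np.allclose(lsr.solve(), x_true, atol=1e-3))   # True, up to ridge bias
```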
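For reference, a compact statement of the exact OMv problem and the constant-approximate real-valued variant that the reduction targets; reading "$0.1$-relative error" as $\ell_2$ relative error is an assumption here, so consult the paper for the precise error model.

```latex
% Exact OMv [HKNS15]: H may be preprocessed in poly(n) time; each answer
% must be produced before the next query arrives. The conjecture asserts
% that no algorithm achieves O(n^{3-\delta}) total time for any \delta > 0.
\textbf{OMv:} \quad \mathbf{H} \in \{0,1\}^{n \times n}, \qquad
\text{output } \mathbf{H}\mathbf{v}^{(i)} \text{ exactly before seeing } \mathbf{v}^{(i+1)},
\quad i = 1, \dots, n.

% Constant-approximate OMv over the reals: any u^{(i)} within 0.1 relative
% error (assumed to be measured in the l2 norm here) is acceptable.
\textbf{Approximate OMv:} \quad
\text{output } \mathbf{u}^{(i)} \text{ with }
\big\| \mathbf{u}^{(i)} - \mathbf{H}\mathbf{v}^{(i)} \big\|_2
\le 0.1 \, \big\| \mathbf{H}\mathbf{v}^{(i)} \big\|_2 .
```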