Lasso and Ridge are important minimization problems in machine learning and statistics. They are versions of linear regression with squared loss where the vector of coefficients is constrained in ℓ1-norm (for Lasso) or in ℓ2-norm (for Ridge). We study the complexity of quantum algorithms for finding ε-minimizers of these minimization problems. We show that for Lasso we can get a quadratic quantum speedup in terms of the dimension d by speeding up the cost-per-iteration of the Frank-Wolfe algorithm, while for Ridge the best quantum algorithms are linear in d, as are the best classical algorithms. As a byproduct of our quantum lower bound for Lasso, we also prove the first classical lower bound for Lasso that is tight up to polylog factors.
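For context, the classical Frank-Wolfe method mentioned above can be sketched as follows for the ℓ1-constrained least-squares problem. This is an illustrative implementation of the standard (classical) iteration only, not the quantum algorithm from the paper; the function name, step-size schedule, and iteration count are our own choices.

```python
import numpy as np

def frank_wolfe_lasso(A, b, n_iters=500):
    """Frank-Wolfe for: minimize ||Ax - b||_2^2 subject to ||x||_1 <= 1.

    Each iteration solves a linear minimization over the l1-ball, whose
    minimizer is always a signed standard basis vector (a vertex of the ball).
    """
    n, d = A.shape
    x = np.zeros(d)
    for t in range(n_iters):
        grad = 2 * A.T @ (A @ x - b)
        # Linear minimization oracle: the vertex of the l1-ball most
        # anti-aligned with the gradient.
        i = np.argmax(np.abs(grad))
        s = np.zeros(d)
        s[i] = -np.sign(grad[i])
        gamma = 2 / (t + 2)  # standard O(1/t) step-size schedule
        x = (1 - gamma) * x + gamma * s
    return x
```

Every iterate is a convex combination of ℓ1-ball vertices, so it stays feasible and is sparse after few iterations; the per-iteration bottleneck is the gradient computation, which is the step the paper's quantum algorithm accelerates.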