ReSQueing Parallel and Private Stochastic Convex Optimization

We introduce a new tool for stochastic convex optimization (SCO): a Reweighted Stochastic Query (ReSQue) estimator for the gradient of a function convolved with a (Gaussian) probability density. Combining ReSQue with recent advances in ball oracle acceleration [CJJJLST20, ACJJS21], we develop algorithms achieving state-of-the-art complexities for SCO in parallel and private settings. For a SCO objective constrained to the unit ball in $\mathbb{R}^d$, we obtain the following results (up to polylogarithmic factors).

We give a parallel algorithm obtaining optimization error $\epsilon_{\text{opt}}$ with $d^{1/3}\epsilon_{\text{opt}}^{-2/3}$ gradient oracle query depth and $d^{1/3}\epsilon_{\text{opt}}^{-2/3} + \epsilon_{\text{opt}}^{-2}$ gradient queries in total, assuming access to a bounded-variance stochastic gradient estimator. For $\epsilon_{\text{opt}} \in [d^{-1}, d^{-1/4}]$, our algorithm matches the state-of-the-art oracle depth of [BJLLS19] while maintaining the optimal total work of stochastic gradient descent.

Given $n$ samples of Lipschitz loss functions, prior works [BFTT19, BFGT20, AFKT21, KLL21] established that if $n \gtrsim d \epsilon_{\text{dp}}^{-2}$, $(\epsilon_{\text{dp}}, \delta)$-differential privacy is attained at no asymptotic cost to the SCO utility. However, these prior works all required a superlinear number of gradient queries. We close this gap for sufficiently large $n \gtrsim d^{2} \epsilon_{\text{dp}}^{-3}$, by using ReSQue to design an algorithm with near-linear gradient query complexity in this regime.
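The ReSQue estimator described above can be read as an importance-reweighted Gaussian smoothing estimator: stochastic gradients are queried at points drawn around a fixed reference center, then reweighted by a ratio of Gaussian densities so that the same samples serve many nearby query points. The following NumPy sketch illustrates this reading only; the function name, the reference point `x_bar`, and the radius `rho` are illustrative assumptions, not the paper's notation or implementation.

```python
import numpy as np

def resque_gradient_estimate(stochastic_grad, x, x_bar, rho, rng):
    """Hedged sketch of a reweighted stochastic query (importance-sampled
    Gaussian smoothing) gradient estimate.

    Targets the gradient of the smoothed function
        f_rho(x) = E_{xi ~ N(0, rho^2 I)}[ f(x + xi) ],
    by sampling the query location around a reference center x_bar and
    reweighting the stochastic gradient by the ratio of Gaussian densities
    gamma_rho(z - x) / gamma_rho(z - x_bar).
    """
    d = x.shape[0]
    xi = rng.normal(scale=rho, size=d)   # perturbation centered at the reference point
    z = x_bar + xi                       # shared query location
    # Log of the density ratio; the Gaussian normalizing constants cancel.
    log_w = (np.sum(xi ** 2) - np.sum((z - x) ** 2)) / (2.0 * rho ** 2)
    return np.exp(log_w) * stochastic_grad(z)

# Toy usage: f(x) = 0.5 ||x||^2 with a noisy gradient oracle. Averaging many
# estimates approaches the smoothed gradient when x stays within ~rho of x_bar.
rng = np.random.default_rng(0)
noisy_grad = lambda z: z + 0.01 * rng.normal(size=z.shape)
x_bar = np.zeros(10)
x = x_bar + 0.1 * rng.normal(size=10)
est = np.mean(
    [resque_gradient_estimate(noisy_grad, x, x_bar, rho=0.5, rng=rng) for _ in range(2000)],
    axis=0,
)
```

The point of centering all queries at `x_bar` in this sketch is that, when query points stay within roughly `rho` of the center, the reweighting factor is well behaved, which is what lets such an estimator be combined with ball-restricted (ball oracle) acceleration.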