Stochastic Variance-Reduced Newton: Accelerating Finite-Sum Minimization with Large Batches

Stochastic variance reduction has proven effective at accelerating first-order algorithms for solving convex finite-sum optimization tasks such as empirical risk minimization, and incorporating second-order information has proven helpful in further improving the performance of these first-order methods. Yet, comparatively little is known about the benefits of using variance reduction to accelerate popular stochastic second-order methods such as Subsampled Newton. To address this, we propose Stochastic Variance-Reduced Newton (SVRN), a finite-sum minimization algorithm that provably accelerates existing stochastic Newton methods from $O(\alpha\log(1/\epsilon))$ to $O\big(\frac{\log(1/\epsilon)}{\log(n)}\big)$ passes over the data, i.e., by a factor of $\widetilde{O}(\alpha)$, where $n$ is the number of sum components and $\alpha$ is the approximation factor in the Hessian estimate. Surprisingly, this acceleration gets more significant the larger the data size $n$, which is a unique property of SVRN. Our algorithm retains the key advantages of Newton-type methods, such as easily parallelizable large-batch operations and a simple unit step size. We use SVRN to accelerate Subsampled Newton and Iterative Hessian Sketch algorithms, and show that it compares favorably to popular first-order methods with variance reduction.
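The abstract does not spell out the update rule, but the general shape of a variance-reduced Newton-type step it describes (an SVRG-style gradient estimate preconditioned by a subsampled Hessian, taken with a unit step size over large batches) can be sketched as follows. This is a minimal illustration on an assumed regularized least-squares problem; the sample sizes, loop lengths, and snapshot schedule are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np

# Illustrative finite-sum objective: regularized least squares, f(x) = (1/n) sum_i f_i(x).
rng = np.random.default_rng(0)
n, d = 10_000, 50
A = rng.standard_normal((n, d))
b = A @ rng.standard_normal(d) + 0.1 * rng.standard_normal(n)
lam = 1e-3

def full_grad(x):
    return A.T @ (A @ x - b) / n + lam * x

def batch_grad(x, idx):
    Ai = A[idx]
    return Ai.T @ (Ai @ x - b[idx]) / len(idx) + lam * x

# Approximate Hessian from a row subsample (playing the role of the
# alpha-approximate Hessian estimate mentioned in the abstract).
hess_idx = rng.choice(n, size=5 * d, replace=False)
H_hat = A[hess_idx].T @ A[hess_idx] / len(hess_idx) + lam * np.eye(d)

x_bar = np.zeros(d)
for outer in range(20):                      # outer loop: refresh the snapshot gradient
    g_bar = full_grad(x_bar)                 # full gradient at the snapshot (one data pass)
    x = x_bar.copy()
    for inner in range(int(np.log(n))):      # a few large-batch inner steps per data pass
        idx = rng.choice(n, size=n // int(np.log(n)), replace=False)
        # SVRG-style variance-reduced gradient estimate at x
        g = batch_grad(x, idx) - batch_grad(x_bar, idx) + g_bar
        # Newton-type step: precondition with the subsampled Hessian, unit step size
        x = x - np.linalg.solve(H_hat, g)
    x_bar = x
```

The large inner batches make each step an easily parallelizable matrix-vector computation, which is the "large-batch" regime the title refers to.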
@article{dereziński2025_2206.02702,
  title   = {Stochastic Variance-Reduced Newton: Accelerating Finite-Sum Minimization with Large Batches},
  author  = {Michał Dereziński},
  journal = {arXiv preprint arXiv:2206.02702},
  year    = {2025}
}