
Stochastic Variance-Reduced Cubic Regularized Newton Method

Abstract

We propose a stochastic variance-reduced cubic regularized Newton method for non-convex optimization. At the core of our algorithm is a novel semi-stochastic gradient, along with a semi-stochastic Hessian, specifically designed for the cubic regularization method. We show that our algorithm is guaranteed to converge to an $(\epsilon,\sqrt{\epsilon})$-approximate local minimum within $\tilde{O}(n^{4/5}/\epsilon^{3/2})$ second-order oracle calls, which outperforms state-of-the-art cubic regularization algorithms, including subsampled cubic regularization. Our work also sheds light on the application of variance reduction techniques to high-order non-convex optimization methods. Thorough experiments on various non-convex optimization problems support our theory.
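To make the idea concrete, below is a minimal Python sketch of SVRG-style variance reduction applied to both the gradient and the Hessian, combined with a cubic-regularized Newton step. Everything here is an illustrative assumption rather than the paper's exact construction: the toy non-convex objective, the batch size and epoch lengths, the cubic penalty parameter M, and the gradient-descent inner solver for the cubic subproblem are all stand-ins chosen for brevity.

```python
import numpy as np

# Toy finite-sum non-convex problem (an assumption for illustration):
# least squares plus a smooth non-convex regularizer
# r(x) = lam * sum_j x_j^2 / (1 + x_j^2).
rng = np.random.default_rng(0)
n, d, lam = 200, 10, 0.1
A = rng.normal(size=(n, d))
y = rng.normal(size=n)

def batch_grad(x, idx):
    """Average gradient of f_i over the minibatch idx."""
    res = A[idx] @ x - y[idx]
    return A[idx].T @ res / len(idx) + lam * 2 * x / (1 + x**2) ** 2

def batch_hess(x, idx):
    """Average Hessian of f_i over the minibatch idx."""
    h_reg = lam * 2 * (1 - 3 * x**2) / (1 + x**2) ** 3
    return A[idx].T @ A[idx] / len(idx) + np.diag(h_reg)

def cubic_step(g, H, M, iters=100, lr=0.05):
    """Approximately minimize the cubic model
    m(h) = g^T h + 0.5 h^T H h + (M/6) ||h||^3
    by gradient descent (a simple stand-in for a dedicated solver)."""
    h = np.zeros_like(g)
    for _ in range(iters):
        h -= lr * (g + H @ h + 0.5 * M * np.linalg.norm(h) * h)
    return h

def svrc(x0, epochs=5, inner=20, batch=20, M=10.0):
    x = x0.copy()
    full_idx = np.arange(n)
    for _ in range(epochs):
        # Snapshot: full gradient and Hessian, recomputed once per epoch.
        x_snap = x.copy()
        g_snap = batch_grad(x_snap, full_idx)
        H_snap = batch_hess(x_snap, full_idx)
        for _ in range(inner):
            idx = rng.choice(n, size=batch, replace=False)
            # Semi-stochastic (SVRG-style) estimators: a minibatch
            # difference corrects the stale snapshot quantities.
            g = batch_grad(x, idx) - batch_grad(x_snap, idx) + g_snap
            H = batch_hess(x, idx) - batch_hess(x_snap, idx) + H_snap
            x = x + cubic_step(g, H, M)
    return x

x = svrc(rng.normal(size=d))
print("final gradient norm:", np.linalg.norm(batch_grad(x, np.arange(n))))
```

The variance-reduction structure is what makes the oracle-complexity savings possible: the expensive full gradient and Hessian are computed only once per epoch at the snapshot point, while each inner iteration pays only for a small minibatch yet retains a low-variance estimate of both quantities.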
