Finding Local Minima via Stochastic Nested Variance Reduction

Abstract

We propose two algorithms that can find local minima faster than the state-of-the-art algorithms in both finite-sum and general stochastic nonconvex optimization. At the core of the proposed algorithms is $\text{One-epoch-SNVRG}^+$ using stochastic nested variance reduction (Zhou et al., 2018a), which outperforms state-of-the-art variance reduction algorithms such as SCSG (Lei et al., 2017). In particular, for finite-sum optimization problems, the proposed $\text{SNVRG}^{+}+\text{Neon2}^{\text{finite}}$ algorithm achieves $\tilde{O}(n^{1/2}\epsilon^{-2}+n\epsilon_H^{-3}+n^{3/4}\epsilon_H^{-7/2})$ gradient complexity to converge to an $(\epsilon, \epsilon_H)$-second-order stationary point, which outperforms $\text{SVRG}+\text{Neon2}^{\text{finite}}$ (Allen-Zhu and Li, 2017), the best existing algorithm, in a wide regime. For general stochastic optimization problems, the proposed $\text{SNVRG}^{+}+\text{Neon2}^{\text{online}}$ achieves $\tilde{O}(\epsilon^{-3}+\epsilon_H^{-5}+\epsilon^{-2}\epsilon_H^{-3})$ gradient complexity, which is better than both $\text{SVRG}+\text{Neon2}^{\text{online}}$ (Allen-Zhu and Li, 2017) and Natasha2 (Allen-Zhu, 2017) in certain regimes. Furthermore, we explore the acceleration brought by third-order smoothness of the objective function.
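For context, the target condition referenced above is the standard notion of an $(\epsilon, \epsilon_H)$-second-order stationary point used in this literature; the following is the usual definition (the paper's exact formulation may differ in constants):

$$
\|\nabla f(\mathbf{x})\| \le \epsilon
\qquad\text{and}\qquad
\lambda_{\min}\big(\nabla^2 f(\mathbf{x})\big) \ge -\epsilon_H,
$$

i.e., $\mathbf{x}$ is an approximate first-order stationary point at which the Hessian has no direction of strong negative curvature, so it serves as an approximate local minimum.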
