
Riemannian stochastic quasi-Newton algorithm with variance reduction and its convergence analysis

Abstract

Stochastic variance reduction algorithms have recently become popular for minimizing the average of a large but finite number of loss functions. The present paper proposes a Riemannian stochastic quasi-Newton algorithm with variance reduction (R-SQN-VR). The key challenges of averaging, adding, and subtracting multiple gradients are addressed with the notions of retraction and vector transport. We present a global convergence analysis and a local convergence rate analysis of R-SQN-VR under some natural assumptions. The proposed algorithm is applied to the Karcher mean computation on the symmetric positive-definite manifold and to low-rank matrix completion on the Grassmann manifold. In all cases, the proposed algorithm outperforms the Riemannian stochastic gradient descent algorithm and the Riemannian stochastic variance reduction algorithm.
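To illustrate the role of retraction and vector transport mentioned above, here is a minimal sketch of a Riemannian SVRG-style variance-reduced update, not the full R-SQN-VR method from the paper: it omits the quasi-Newton Hessian approximation, and it uses the unit sphere with hypothetical helper functions (retraction, vector_transport, riemannian_grad, rsvrg_epoch) chosen purely for illustration rather than the SPD or Grassmann manifolds treated in the paper.

```python
import numpy as np

# Illustrative sphere manifold only; the paper's experiments use the SPD and
# Grassmann manifolds, which require their own retraction and transport.
def retraction(x, v):
    """Retract a tangent vector v at x back onto the unit sphere."""
    y = x + v
    return y / np.linalg.norm(y)

def vector_transport(x, y, v):
    """Transport a tangent vector v at x by projecting onto the tangent space at y."""
    return v - np.dot(y, v) * y

def riemannian_grad(egrad, x):
    """Project a Euclidean gradient onto the tangent space of the sphere at x."""
    return egrad - np.dot(x, egrad) * x

def rsvrg_epoch(x_tilde, losses_egrad, step_size, inner_iters, rng):
    """One outer epoch of a Riemannian SVRG-style update (variance reduction only;
    the quasi-Newton direction of R-SQN-VR is not modeled here)."""
    n = len(losses_egrad)
    # Full Riemannian gradient at the snapshot point x_tilde.
    full_grad = sum(riemannian_grad(g(x_tilde), x_tilde) for g in losses_egrad) / n
    x = x_tilde
    for _ in range(inner_iters):
        i = rng.integers(n)
        g_x = riemannian_grad(losses_egrad[i](x), x)
        g_tilde = riemannian_grad(losses_egrad[i](x_tilde), x_tilde)
        # Gradients live in different tangent spaces, so the snapshot terms are
        # transported to the tangent space at x before they are combined.
        correction = vector_transport(x_tilde, x, g_tilde - full_grad)
        v = g_x - correction
        # Move along the variance-reduced direction and retract onto the manifold.
        x = retraction(x, -step_size * v)
    return x
```

The transport step is where the "averaging, adding, and subtracting multiple gradients" issue shows up: the stochastic gradient at the current iterate and the correction term computed at the snapshot belong to different tangent spaces, so they can only be combined after vector transport, and the resulting step is mapped back to the manifold by a retraction.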
