Adaptive Stochastic Gradient Descent on the Grassmannian for Robust Low-Rank Subspace Recovery

Jun He
Abstract

In this paper, we present GASG21 (Grassmannian Adaptive Stochastic Gradient for $L_{2,1}$ norm minimization), an adaptive stochastic gradient algorithm for robustly recovering a low-rank subspace from a large matrix. In the presence of column-wise outlier corruption, we reformulate the classical matrix $L_{2,1}$-norm minimization problem as its stochastic programming counterpart. For each observed data vector, the low-rank subspace $\mathcal{S}$ is updated by taking a gradient step along a geodesic of the Grassmannian. To accelerate the convergence of the stochastic gradient method, we adaptively tune the constant step size by leveraging consecutive gradients. Numerical experiments on synthetic and real data demonstrate the efficiency and accuracy of the proposed GASG21 algorithm, even under heavy column-wise outlier corruption.
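The core operation described above, updating the subspace estimate with a gradient step along a Grassmannian geodesic for each incoming column, can be illustrated with a minimal sketch. Note the assumptions: this uses a GROUSE-style greedy step for the plain least-squares residual, not the paper's $L_{2,1}$ cost or its adaptive step-size rule, and the function name `geodesic_step`, the problem sizes, and the step choice are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, T = 50, 3, 2000  # ambient dimension, subspace rank, number of streamed columns

# Hypothetical setup: a true subspace and a random initial estimate.
U_true, _ = np.linalg.qr(rng.standard_normal((n, d)))
U, _ = np.linalg.qr(rng.standard_normal((n, d)))

def geodesic_step(U, v):
    """One GROUSE-style geodesic step on the Grassmannian for an observed column v.

    The rank-one update rotates one direction of span(U) toward the residual
    while keeping U exactly orthonormal.
    """
    w = U.T @ v                       # least-squares weights of v in span(U)
    p = U @ w                         # projection of v onto the current subspace
    r = v - p                         # residual, orthogonal to span(U)
    r_norm, w_norm = np.linalg.norm(r), np.linalg.norm(w)
    if r_norm < 1e-12 or w_norm < 1e-12:
        return U
    # Greedy step angle: the updated subspace contains v exactly.
    theta = np.arctan(r_norm / w_norm)
    update = (np.cos(theta) - 1.0) * p / w_norm + np.sin(theta) * r / r_norm
    return U + np.outer(update, w / w_norm)

# Stream clean columns drawn from the true subspace and track the recovery error
# (norm of the part of U_true orthogonal to the current estimate).
err0 = np.linalg.norm(U_true - U @ (U.T @ U_true))
for _ in range(T):
    v = U_true @ rng.standard_normal(d)
    U = geodesic_step(U, v)
err = np.linalg.norm(U_true - U @ (U.T @ U_true))
```

GASG21 replaces the squared residual with its un-squared $\ell_2$ norm (column-wise, yielding the $L_{2,1}$ objective, which down-weights outlier columns) and tunes the step size adaptively from consecutive gradients; the geodesic form of the update is the same.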
