Improved Global Guarantees for the Nonconvex Burer--Monteiro Factorization via Rank Overparameterization

We consider minimizing a twice-differentiable, $L$-smooth, and $\mu$-strongly convex objective $\phi$ over an $n \times n$ positive semidefinite matrix $M \succeq 0$, under the assumption that the minimizer $M^{\star}$ has low rank $r^{\star} \ll n$. Following the Burer--Monteiro approach, we instead minimize the nonconvex objective $f(X) = \phi(XX^{T})$ over a factor matrix $X$ of size $n \times r$. This substantially reduces the number of variables from $O(n^{2})$ to as few as $O(n)$ and also enforces positive semidefiniteness for free, but at the cost of giving up the convexity of the original problem. In this paper, we prove that if the search rank $r \ge r^{\star}$ is overparameterized by a \emph{constant factor} with respect to the true rank $r^{\star}$, namely as in $r > \frac{1}{4}(L/\mu - 1)^{2} r^{\star}$, then despite nonconvexity, local optimization is guaranteed to globally converge from any initial point to the global optimum. This significantly improves upon a previous rank overparameterization threshold of $r \ge n$, which we show is sharp in the absence of smoothness and strong convexity, but would increase the number of variables back up to $O(n^{2})$. Conversely, without rank overparameterization, we prove that such a global guarantee is possible if and only if $\phi$ is almost perfectly conditioned, with a condition number of $L/\mu < 3$. Therefore, we conclude that a small amount of overparameterization can lead to large improvements in theoretical guarantees for the nonconvex Burer--Monteiro factorization.
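
To make the setup concrete, below is a minimal numerical sketch, not the paper's algorithm or experiments: plain gradient descent on the factored objective $f(X) = \phi(XX^{T})$ with a search rank overparameterized beyond the stated threshold. The weighted least-squares choice of $\phi$, the constants $\mu = 1$, $L = 3$, the dimensions, and the step size are all illustrative assumptions.

```python
# A minimal sketch, assuming a simple weighted least-squares objective phi that is
# L-smooth and mu-strongly convex; not the paper's algorithm or experiments.
import numpy as np

rng = np.random.default_rng(0)
n, r_star = 30, 2          # ambient dimension and true rank (illustrative)
r = 8                      # search rank, overparameterized relative to r_star

# Ground-truth low-rank PSD minimizer M_star = U U^T, scaled to keep norms modest.
U = rng.standard_normal((n, r_star)) / np.sqrt(n)
M_star = U @ U.T

# phi(M) = 0.5 * sum_ij W_ij (M_ij - M_star_ij)^2 with symmetric weights in
# [mu, L], so phi is L-smooth and mu-strongly convex.  With mu = 1, L = 3, the
# threshold (1/4)(L/mu - 1)^2 * r_star equals 2 here, and r = 8 exceeds it.
mu, L = 1.0, 3.0
W = rng.uniform(mu, L, size=(n, n))
W = (W + W.T) / 2

def f(X):
    D = X @ X.T - M_star
    return 0.5 * np.sum(W * D * D)          # f(X) = phi(X X^T)

def f_grad(X):
    S = W * (X @ X.T - M_star)              # grad of phi at M = X X^T (symmetric)
    return 2.0 * S @ X                      # chain rule: (S + S^T) X = 2 S X

# Plain gradient descent from a random initial point.
X = rng.standard_normal((n, r)) / np.sqrt(n)
step = 0.02
print("initial f(X) =", f(X))
for _ in range(20000):
    X -= step * f_grad(X)
print("final   f(X) =", f(X))               # expect a value near zero, i.e. X X^T close to M_star
```

Even though $f$ is nonconvex in $X$, a run like this from a random initial point is expected to drive the objective toward its global minimum, which is the kind of behavior the overparameterized guarantee describes.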