Efficient Alternating Minimization with Applications to Weighted Low Rank Approximation

Abstract

Weighted low rank approximation is a fundamental problem in numerical linear algebra, and it has many applications in machine learning. Given a matrix $M \in \mathbb{R}^{n \times n}$, a weight matrix $W \in \mathbb{R}_{\geq 0}^{n \times n}$, and a parameter $k$, the goal is to output two matrices $U, V \in \mathbb{R}^{n \times k}$ such that $\| W \circ (M - U V^\top) \|_F$ is minimized, where $\circ$ denotes the Hadamard product. This problem is known to be NP-hard and even hard to approximate assuming the Exponential Time Hypothesis [GG11, RSW16]. Meanwhile, alternating minimization is a popular heuristic for weighted low rank approximation, and [LLR16] shows that, under mild assumptions, it provides provable guarantees. In this work, we develop an efficient and robust framework for alternating minimization. For weighted low rank approximation, this improves the runtime of [LLR16] from $n^2 k^2$ to $n^2 k$. At the heart of our framework is a high-accuracy multiple response regression solver together with a robust analysis of alternating minimization.
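To make the objective and the alternating minimization heuristic concrete, here is a minimal NumPy sketch (our own illustration, not the paper's fast solver): each half-step fixes one factor and exactly solves the resulting weighted least-squares problem for every row of the other factor. The function name `weighted_lra_altmin` and its parameters are hypothetical, and a full iteration of this naive version costs more than the $n^2 k$ time of the paper's method.

```python
import numpy as np

def weighted_lra_altmin(M, W, k, num_iters=50, seed=0):
    """Alternating minimization for min_{U,V} ||W o (M - U V^T)||_F.

    Illustrative sketch: each half-step solves one exact weighted
    least-squares problem per row, via k x k normal equations.
    """
    n, m = M.shape
    rng = np.random.default_rng(seed)
    U = rng.standard_normal((n, k))
    V = rng.standard_normal((m, k))
    for _ in range(num_iters):
        # Fix V; row i of U solves min_u sum_j W[i,j] * (M[i,j] - v_j^T u)^2.
        for i in range(n):
            A = V * W[i, :][:, None]     # rows of V scaled by the weights
            G = V.T @ A                  # k x k normal-equations matrix
            b = A.T @ M[i, :]
            U[i] = np.linalg.lstsq(G, b, rcond=None)[0]
        # Fix U; row j of V solves min_v sum_i W[i,j] * (M[i,j] - u_i^T v)^2.
        for j in range(m):
            A = U * W[:, j][:, None]
            G = U.T @ A
            b = A.T @ M[:, j]
            V[j] = np.linalg.lstsq(G, b, rcond=None)[0]
    return U, V

# Toy usage: recover a planted rank-k matrix under strictly positive weights.
rng = np.random.default_rng(1)
n, k = 60, 3
M = rng.standard_normal((n, k)) @ rng.standard_normal((k, n))
W = rng.uniform(0.5, 1.5, size=(n, n))
U, V = weighted_lra_altmin(M, W, k, num_iters=30)
print(np.linalg.norm(W * (M - U @ V.T)))  # typically small for this instance
```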
