Signed Graph Metric Learning via Gershgorin Disc Alignment

Given a convex and differentiable objective $Q(M)$ for a real, symmetric matrix $M$ in the positive definite (PD) cone, used to compute Mahalanobis distances, we propose a fast general metric learning framework that is entirely projection-free. We first assume that $M$ resides in a space $\mathcal{S}$ of generalized graph Laplacian matrices (graph metric matrices) corresponding to balanced signed graphs. Unlike the low-rank metric matrices common in the literature, $\mathcal{S}$ includes the important diagonal-only matrices as a special case. The key theorem that circumvents full eigen-decomposition and enables fast metric matrix optimization is Gershgorin disc alignment (GDA): given a graph metric matrix $M \in \mathcal{S}$ and a diagonal matrix $S$ with $S_{ii} = 1/v_i$, where $v$ is the first eigenvector of $M$, we prove that the Gershgorin disc left-ends of the similarity transform $B = S M S^{-1}$ are perfectly aligned at the smallest eigenvalue $\lambda_{\min}(M)$. Using this theorem, we replace the PD cone constraint in the metric learning problem with the tightest possible linear constraints per iteration, so that the alternating optimization of the diagonal / off-diagonal terms of $M$ can be solved efficiently as linear programs via Frank-Wolfe iterations. We update $v$ using Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) with warm start as entries of $M$ are optimized successively. Experiments show that our graph metric optimization is significantly faster than cone-projection methods, and produces competitive binary classification performance.
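To make the GDA theorem concrete, below is a minimal NumPy/SciPy sketch (not the authors' code) that builds a small synthetic balanced signed graph, forms a positive definite generalized Laplacian $M$ from it, computes the first eigenvector $v$ with warm-started LOBPCG, and checks that the Gershgorin disc left-ends of $B = S M S^{-1}$, with $S_{ii} = 1/v_i$, all coincide with $\lambda_{\min}(M)$. The graph size, random weights, and the $0.1$ self-loop value are illustrative assumptions, not choices from the paper.

```python
import numpy as np
from scipy.sparse.linalg import lobpcg

rng = np.random.default_rng(0)
n = 12

# Balanced signed graph: random node polarities t_i in {+1, -1}; edge (i, j)
# gets weight t_i * t_j * a_ij with a_ij > 0, so edges are positive within a
# polarity group and negative across groups (every cycle has an even number
# of negative edges).
t = rng.choice([-1.0, 1.0], size=n)
A = rng.uniform(0.5, 1.5, size=(n, n))
A = np.triu(A, 1) + np.triu(A, 1).T            # symmetric, zero diagonal
W = np.outer(t, t) * A                         # signed, balanced adjacency

# Generalized graph Laplacian (graph metric matrix): degrees from |weights|,
# plus small positive self-loops so that M is positive definite.
M = np.diag(np.abs(W).sum(axis=1)) - W + 0.1 * np.eye(n)

# First eigenpair of M. The paper updates v with warm-started LOBPCG as
# entries of M change; here we warm-start from the polarity vector t.
lam, V = lobpcg(M, t.reshape(-1, 1), largest=False, tol=1e-10, maxiter=2000)
v = V[:, 0]

# Diagonal scaling S with S_ii = 1 / v_i, and similarity transform B = S M S^{-1}.
S = np.diag(1.0 / v)
B = S @ M @ np.diag(v)                         # S^{-1} = diag(v)

# Gershgorin disc left-ends of B: B_ii - sum_{j != i} |B_ij|.
radii = np.abs(B).sum(axis=1) - np.abs(np.diag(B))
left_ends = np.diag(B) - radii

print("lambda_min(M) :", lam[0])
print("disc left-ends:", left_ends)            # all (numerically) equal lambda_min(M)
```

On this example the printed left-ends match $\lambda_{\min}(M)$ up to numerical precision. Note also that for a fixed $v$ (hence fixed $S$), each left-end expression is linear in the entries of $M$, which is roughly what allows the PD cone constraint to be replaced by per-row linear constraints in the Frank-Wolfe iterations described above.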