
arXiv:2001.10485

Fast Graph Metric Learning via Gershgorin Disc Alignment

28 January 2020
Cheng Yang
Gene Cheung
Wei Hu
Abstract

We propose a fast, general, projection-free metric learning framework, where the minimization objective $\min_{\mathbf{M} \in \mathcal{S}} Q(\mathbf{M})$ is a convex differentiable function of the metric matrix $\mathbf{M}$, and $\mathbf{M}$ resides in the set $\mathcal{S}$ of generalized graph Laplacian matrices for connected graphs with positive edge weights and node degrees. Unlike the low-rank metric matrices common in the literature, $\mathcal{S}$ includes the important positive-diagonal-only matrices as a special case in the limit. The key idea for fast optimization is to rewrite the positive-definite-cone constraint in $\mathcal{S}$ as signal-adaptive linear constraints via Gershgorin disc alignment, so that the alternating optimization of the diagonal and off-diagonal terms in $\mathbf{M}$ can be solved efficiently as linear programs via Frank-Wolfe iterations. We prove that the Gershgorin discs can be aligned perfectly using the first eigenvector $\mathbf{v}$ of $\mathbf{M}$, which we update iteratively using Locally Optimal Block Preconditioned Conjugate Gradient (LOBPCG) with warm start as the diagonal / off-diagonal terms are optimized. Experiments show that our efficiently computed graph metric matrices outperform metrics learned with competing methods on classification tasks.
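The disc-alignment idea in the abstract can be illustrated numerically: for a generalized graph Laplacian $\mathbf{M}$ with first eigenpair $(\lambda_{\min}, \mathbf{v})$, the similarity transform $\mathbf{B} = \mathbf{S}\mathbf{M}\mathbf{S}^{-1}$ with $\mathbf{S} = \mathrm{diag}(\mathbf{v})^{-1}$ preserves the spectrum while the left end of every Gershgorin disc of $\mathbf{B}$ lands exactly at $\lambda_{\min}$, turning the positive-definite-cone constraint into per-row linear constraints. A minimal NumPy sketch of this fact, on a small hypothetical graph (not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 5-node connected graph with positive edge weights.
n = 5
W = rng.uniform(0.5, 1.5, size=(n, n))
W = np.triu(W, 1)
W = W + W.T                                   # symmetric, positive weights
M = np.diag(W.sum(axis=1)) - W                # combinatorial graph Laplacian
M += np.diag(rng.uniform(0.1, 1.0, size=n))   # positive diagonal loading -> generalized Laplacian

# First (smallest) eigenpair of M; for a connected graph with positive
# weights the first eigenvector has entries of a single sign.
lam, V = np.linalg.eigh(M)
lam_min, v = lam[0], np.abs(V[:, 0])          # fix sign so v > 0 entrywise

# Similarity transform B = S M S^{-1}, S = diag(1/v), i.e. B_ij = M_ij * v_j / v_i.
# Row sums of B equal (M v)_i / v_i = lam_min, and since the off-diagonals of B
# stay nonpositive, every Gershgorin disc's left end equals that row sum.
B = M * np.outer(1.0 / v, v)

left_ends = np.diag(B) - (np.abs(B).sum(axis=1) - np.abs(np.diag(B)))
print(np.allclose(left_ends, lam_min))        # all discs aligned at lam_min
```

With the discs aligned, $\mathbf{M} \succ 0$ is certified by the $n$ linear conditions "each disc left end $\geq 0$" in the entries of $\mathbf{M}$, which is what allows the diagonal / off-diagonal subproblems to be posed as linear programs.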
