Escaping Saddle Points in Ill-Conditioned Matrix Completion with a Scalable Second Order Method

7 September 2020
C. Kümmerle
C. M. Verdun
arXiv:2009.02905
Abstract

We propose an iterative algorithm for low-rank matrix completion that can be interpreted both as an iteratively reweighted least squares (IRLS) algorithm and as a saddle-escaping smoothing Newton method applied to a non-convex rank surrogate objective. It combines the favorable data efficiency of previous IRLS approaches with scalability improved by several orders of magnitude. Our method attains a local quadratic convergence rate already for a number of samples close to the information-theoretic limit. We show in numerical experiments that, unlike many state-of-the-art approaches, our method is able to complete very ill-conditioned matrices with condition numbers of up to $10^{10}$ from few samples.
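
To make the IRLS interpretation concrete, the sketch below shows a generic iteratively reweighted least squares loop for matrix completion. It is not the authors' algorithm or code: the paper's method adds a saddle-escaping smoothing Newton step and a far more scalable weight operator, whereas this snippet uses a classical one-sided weight $W = (X X^\top + \varepsilon^2 I)^{-1/2}$ associated with a smoothed nuclear-norm surrogate. All function names, parameters, and the update rules are illustrative assumptions.

```python
import numpy as np

def irls_matrix_completion(M_obs, mask, rank, n_iters=50, eps0=1.0):
    """Generic IRLS sketch (illustrative, not the paper's MatrixIRLS).

    M_obs: (m, n) array holding the observed entries (values elsewhere ignored)
    mask:  boolean (m, n) array, True where an entry is observed
    rank:  target rank used to shrink the smoothing parameter
    """
    m, n = M_obs.shape
    X = np.where(mask, M_obs, 0.0)          # start from the zero-filled observations
    eps = eps0
    for _ in range(n_iters):
        # Weight operator from the current iterate: W = (X X^T + eps^2 I)^{-1/2},
        # the reweighting associated with a smoothed nuclear-norm surrogate.
        U, s, _ = np.linalg.svd(X, full_matrices=False)
        d = (s ** 2 + eps ** 2) ** -0.5
        W = (U * (d - 1.0 / eps)) @ U.T + np.eye(m) / eps

        # Weighted least-squares step: minimize tr(X^T W X) subject to matching
        # the observed entries; the problem decouples over the columns of X.
        X_new = np.where(mask, M_obs, 0.0)
        for j in range(n):
            obs, free = mask[:, j], ~mask[:, j]
            if free.any() and obs.any():
                W_ff = W[np.ix_(free, free)]
                W_fo = W[np.ix_(free, obs)]
                # Stationarity condition: W_ff x_free + W_fo x_obs = 0.
                X_new[free, j] = -np.linalg.solve(W_ff, W_fo @ M_obs[obs, j])
        X = X_new

        # Shrink the smoothing parameter toward the (rank+1)-th singular value,
        # tightening the surrogate toward the rank function over the iterations.
        s_new = np.linalg.svd(X, compute_uv=False)
        if rank < s_new.size:
            eps = max(min(eps, s_new[rank]), 1e-12)
    return X


# Toy usage: recover a 60 x 60 rank-3 matrix from roughly 40% of its entries.
rng = np.random.default_rng(0)
M = rng.standard_normal((60, 3)) @ rng.standard_normal((3, 60))
mask = rng.random((60, 60)) < 0.4
X_hat = irls_matrix_completion(np.where(mask, M, 0.0), mask, rank=3)
print("relative error:", np.linalg.norm(X_hat - M) / np.linalg.norm(M))
```

This plain loop already illustrates the reweighting idea, but each step costs a dense SVD and a linear solve per column, which is exactly the scalability bottleneck the paper's second-order, saddle-escaping variant is designed to overcome.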
