
Implicit regularization and solution uniqueness in over-parameterized matrix sensing

6 June 2018
Kelly Geyer, Amir Kalev
Abstract

We consider whether algorithmic choices in over-parameterized linear matrix factorization introduce implicit regularization. We focus on noiseless matrix sensing over rank-$r$ positive semi-definite (PSD) matrices in $\mathbb{R}^{n \times n}$, with a sensing mechanism that satisfies the restricted isometry property (RIP). The algorithm we study is \emph{factored gradient descent}, where we model the low-rankness and PSD constraints with the factorization $UU^\top$, where $U \in \mathbb{R}^{n \times r}$. Surprisingly, recent work argues that the choice of $r \leq n$ is not pivotal: even setting $U \in \mathbb{R}^{n \times n}$ is sufficient for factored gradient descent to find the rank-$r$ solution, which suggests that operating over the factors leads to an implicit regularization. In this note, we provide a different perspective. We show that, in the noiseless case, under certain conditions, the PSD constraint by itself is sufficient to lead to a unique rank-$r$ matrix recovery, without implicit or explicit low-rank regularization. \emph{I.e.}, under these assumptions, the set of PSD matrices consistent with the observed data is a singleton, irrespective of the algorithm used.
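The setup the abstract describes can be sketched numerically. The following is a minimal illustration, not the paper's implementation: the problem sizes, the symmetric Gaussian sensing model, the small random initialization, and the step-size heuristic are all assumptions chosen for the demo. It runs factored gradient descent with a fully over-parameterized $U \in \mathbb{R}^{n \times n}$ on noiseless measurements of a rank-$r$ PSD ground truth.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r, m = 8, 2, 150  # illustrative sizes (assumed), m measurements

# Ground-truth rank-r PSD matrix X* = U* U*^T
U_star = rng.standard_normal((n, r))
X_star = U_star @ U_star.T

# Symmetric Gaussian sensing matrices A_i; noiseless measurements y_i = <A_i, X*>
A = rng.standard_normal((m, n, n))
A = (A + A.transpose(0, 2, 1)) / 2
y = np.einsum('ijk,jk->i', A, X_star)

def factored_gd(U0, steps=3000, eta=None):
    """Gradient descent on f(U) = (1/2m) * sum_i (<A_i, U U^T> - y_i)^2."""
    U = U0.copy()
    if eta is None:
        # Common heuristic: step size inversely proportional to ||X*||_2
        # (uses the ground truth, which a real algorithm would estimate).
        eta = 1.0 / (4 * np.linalg.norm(X_star, 2))
    for _ in range(steps):
        resid = np.einsum('ijk,jk->i', A, U @ U.T) - y     # <A_i, UU^T> - y_i
        grad = (2.0 / m) * np.einsum('i,ijk->jk', resid, A) @ U
        U -= eta * grad
    return U

# Over-parameterized: U is n x n (not n x r), small random initialization
U = factored_gd(0.01 * rng.standard_normal((n, n)))
X_hat = U @ U.T  # PSD by construction, whatever the trajectory does
rel_err = np.linalg.norm(X_hat - X_star) / np.linalg.norm(X_star)
print("relative recovery error:", rel_err)
```

Note that $X_{\mathrm{hat}} = UU^\top$ is PSD for any $U$, which is the paper's point: with enough noiseless measurements, feasibility within the PSD cone alone can pin down the rank-$r$ solution, independent of the algorithm's implicit bias.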
