Nuclear norm penalization and optimal rates for noisy low rank matrix completion

29 November 2010
V. Koltchinskii
Alexandre B. Tsybakov
Karim Lounici
arXiv:1011.6256
Abstract

This paper deals with the trace regression model in which $n$ entries, or linear combinations of entries, of an unknown $m_1 \times m_2$ matrix $A_0$ corrupted by noise are observed. We propose a new nuclear norm penalized estimator of $A_0$ and establish a general sharp oracle inequality for this estimator for arbitrary values of $n, m_1, m_2$ under the condition of isometry in expectation. This method is then applied to the matrix completion problem. In this case, the estimator admits a simple explicit form, and we prove that it satisfies oracle inequalities with faster rates of convergence than in previous works. They are valid, in particular, in the high-dimensional setting $m_1 m_2 \gg n$. We show that the obtained rates are optimal up to logarithmic factors in a minimax sense and also derive, for any fixed matrix $A_0$, a non-minimax lower bound on the rate of convergence of our estimator, which coincides with the upper bound up to a constant factor. Finally, we show that our procedure provides an exact recovery of the rank of $A_0$ with probability close to 1. We also discuss the statistical learning setting where there is no underlying model determined by $A_0$ and the aim is to find the best trace regression model approximating the data.
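The abstract notes that, in the matrix completion case, the estimator admits a simple explicit form. A minimal sketch of the general idea is given below: soft-thresholding of the singular values, i.e. the proximal operator of the nuclear norm, applied to a rescaled matrix of noisy observed entries. This is an illustration only, not the paper's exact estimator; the sampling probability, noise level, and threshold `lam` are all hypothetical choices.

```python
# Sketch: nuclear-norm soft-thresholding (proximal operator of the nuclear norm).
# Illustrative only; parameters and rescaling below are assumptions, not the
# precise construction or tuning from Koltchinskii, Lounici and Tsybakov.
import numpy as np

def soft_threshold_singular_values(X, lam):
    """Shrink the singular values of X by lam, truncating at zero."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    s_shrunk = np.maximum(s - lam, 0.0)
    return U @ np.diag(s_shrunk) @ Vt

# Toy usage: observe a fraction of the entries of a low-rank matrix with noise.
rng = np.random.default_rng(0)
m1, m2, r = 50, 40, 3
A0 = rng.standard_normal((m1, r)) @ rng.standard_normal((r, m2))  # rank-r target
p = 0.3                                      # assumed uniform sampling probability
mask = rng.random((m1, m2)) < p              # observed entries
Y = np.where(mask, A0 + 0.1 * rng.standard_normal((m1, m2)), 0.0)

# Rescaling by 1/p makes the masked matrix an unbiased estimate of A0 in
# expectation; the singular values of that estimate are then shrunk.
A_hat = soft_threshold_singular_values(Y / p, lam=5.0)
print("rank of estimate:", np.linalg.matrix_rank(A_hat, tol=1e-6))
```

With a suitably chosen threshold, the shrinkage zeroes out the small singular values, which is consistent with the abstract's claim that the procedure can recover the rank of $A_0$ with high probability.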
