Fast Global Convergence for Low-rank Matrix Recovery via Riemannian Gradient Descent with Random Initialization (arXiv:2012.15467)

31 December 2020
T. Hou
Zhenzhen Li
Ziyun Zhang

Papers citing "Fast Global Convergence for Low-rank Matrix Recovery via Riemannian Gradient Descent with Random Initialization"

2 / 2 papers shown
1. Tensor-on-Tensor Regression: Riemannian Optimization, Over-parameterization, Statistical-computational Gap, and Their Interplay
   Yuetian Luo, Anru R. Zhang (17 Jun 2022)
2. Nonconvex Factorization and Manifold Formulations are Almost Equivalent in Low-rank Matrix Optimization
   Yuetian Luo, Xudong Li, Anru R. Zhang (03 Aug 2021)