
Blessing of Nonconvexity in Deep Linear Models: Depth Flattens the Optimization Landscape Around the True Solution

15 July 2022
Jianhao Ma, S. Fattahi

Papers citing "Blessing of Nonconvexity in Deep Linear Models: Depth Flattens the Optimization Landscape Around the True Solution"

7 papers shown
Mitigating Label Noise through Data Ambiguation
Julian Lienen, Eyke Hüllermeier
23 May 2023
Behind the Scenes of Gradient Descent: A Trajectory Analysis via Basis Function Decomposition
Jianhao Ma, Li-Zhen Guo, S. Fattahi
01 Oct 2022
Preconditioned Gradient Descent for Overparameterized Nonconvex Burer--Monteiro Factorization with Global Optimality Certification
G. Zhang, S. Fattahi, Richard Y. Zhang
07 Jun 2022
Robust Training under Label Noise by Over-parameterization
Sheng Liu, Zhihui Zhu, Qing Qu, Chong You
28 Feb 2022
Optimization-Based Separations for Neural Networks
Itay Safran, Jason D. Lee
04 Dec 2021
Rank Overspecified Robust Matrix Recovery: Subgradient Method and Exact Recovery
Lijun Ding, Liwei Jiang, Yudong Chen, Qing Qu, Zhihui Zhu
23 Sep 2021
Benefits of depth in neural networks
Matus Telgarsky
14 Feb 2016