Benignity of loss landscape with weight decay requires both large overparametrization and initialization

arXiv:2505.22578 · 28 May 2025
Etienne Boursier, Matthew Bowditch, Matthias Englert, R. Lazic
ArXiv (abs) · PDF · HTML

Papers citing "Benignity of loss landscape with weight decay requires both large overparametrization and initialization"

2 / 2 papers shown

1. Convergence of Shallow ReLU Networks on Weakly Interacting Data
   Léo Dana, Francis R. Bach, Loucas Pillaud-Vivien (MLT)
   24 Feb 2025

2. Penalising the biases in norm regularisation enforces sparsity
   Etienne Boursier, Nicolas Flammarion
   02 Mar 2023