Avoiding The Double Descent Phenomenon of Random Feature Models Using Hybrid Regularization

11 December 2020
Kelvin K. Kan, J. Nagy, Lars Ruthotto
Community: AI4CE
arXiv:2012.06667 (abs) · PDF · HTML · GitHub

Papers citing "Avoiding The Double Descent Phenomenon of Random Feature Models Using Hybrid Regularization"

3 / 3 papers shown
Understanding the Role of Optimization in Double Descent
Chris Yuhao Liu, Jeffrey Flanigan
06 Dec 2023
DSD$^2$: Can We Dodge Sparse Double Descent and Compress the Neural Network Worry-Free?
AAAI Conference on Artificial Intelligence (AAAI), 2023
Victor Quétu, Enzo Tartaglione
02 Mar 2023
Conditioning of Random Feature Matrices: Double Descent and Generalization Error
Zhijun Chen, Hayden Schaeffer
21 Oct 2021