ResearchTrend.AI
arXiv: 2012.06667
Avoiding The Double Descent Phenomenon of Random Feature Models Using Hybrid Regularization

11 December 2020
Kelvin K. Kan
J. Nagy
Lars Ruthotto

Papers citing "Avoiding The Double Descent Phenomenon of Random Feature Models Using Hybrid Regularization"

2 papers shown
DSD²: Can We Dodge Sparse Double Descent and Compress the Neural Network Worry-Free?
Victor Quétu
Enzo Tartaglione
02 Mar 2023
Conditioning of Random Feature Matrices: Double Descent and Generalization Error
Zhijun Chen
Hayden Schaeffer
21 Oct 2021