On the Double Descent of Random Features Models Trained with SGD

13 October 2021
Fanghui Liu
Johan A. K. Suykens
Volkan Cevher
MLT
ArXiv (abs) · PDF · HTML

Papers citing "On the Double Descent of Random Features Models Trained with SGD"

5 / 5 papers shown
MUSO: Achieving Exact Machine Unlearning in Over-Parameterized Regimes
Machine-mediated learning (ML), 2024
Ruikai Yang
Mingzhen He
Zhengbao He
Youmei Qiu
Xiaolin Huang
MU, BDL
11 Oct 2024
Learning Analysis of Kernel Ridgeless Regression with Asymmetric Kernel Learning
Fan He
Mingzhe He
Lei Shi
Xiaolin Huang
Johan A. K. Suykens
03 Jun 2024
Orthogonal Random Features: Explicit Forms and Sharp Inequalities
N. Demni
Hachem Kadri
11 Oct 2023
Gibbs-Based Information Criteria and the Over-Parameterized Regime
International Conference on Artificial Intelligence and Statistics (AISTATS), 2023
Haobo Chen
Yuheng Bu
Greg Wornell
08 Jun 2023
Double-descent curves in neural networks: a new perspective using Gaussian processes
AAAI Conference on Artificial Intelligence (AAAI), 2021
Ouns El Harzli
Bernardo Cuenca Grau
Guillermo Valle Pérez
A. Louis
14 Feb 2021