Double Double Descent: On Generalization Errors in Transfer Learning between Linear Regression Tasks

12 June 2020
Yehuda Dar, Richard G. Baraniuk
arXiv: 2006.07002

Papers citing "Double Double Descent: On Generalization Errors in Transfer Learning between Linear Regression Tasks"

4 of 4 citing papers shown:
Tight bounds for minimum l1-norm interpolation of noisy data
Guillaume Wang, Konstantin Donhauser, Fanny Yang
10 Nov 2021

A Farewell to the Bias-Variance Tradeoff? An Overview of the Theory of Overparameterized Machine Learning
Yehuda Dar, Vidya Muthukumar, Richard G. Baraniuk
06 Sep 2021

Double Descent and Other Interpolation Phenomena in GANs
Lorenzo Luzi, Yehuda Dar, Richard Baraniuk
07 Jun 2021

Phase Transitions in Transfer Learning for High-Dimensional Perceptrons
Oussama Dhifallah, Yue M. Lu
06 Jan 2021