ResearchTrend.AI

Analyzing the discrepancy principle for kernelized spectral filter learning algorithms
arXiv: 2004.08436
Journal of Machine Learning Research (JMLR), 2020
17 April 2020
Alain Celisse, Martin Wahl

Papers citing "Analyzing the discrepancy principle for kernelized spectral filter learning algorithms"

14 / 14 papers shown
EarlyStopping: Implicit Regularization for Iterative Learning Procedures in Python
Eric Ziebell, Ratmir Miftachov, Bernhard Stankewitz, Laura Hucker
20 Mar 2025

On the Saturation Effects of Spectral Algorithms in Large Dimensions
Neural Information Processing Systems (NeurIPS), 2025
Weihao Lu, Haobo Zhang, Yicheng Li, Q. Lin
01 Mar 2025

Diffusion-based Semi-supervised Spectral Algorithm for Regression on Manifolds
Weichun Xia, Jiaxin Jiang, Lei Shi
18 Oct 2024

Spectral Algorithms on Manifolds through Diffusion
Weichun Xia, Lei Shi
06 Mar 2024

Neural Network-Based Score Estimation in Diffusion Models: Optimization and Generalization
International Conference on Learning Representations (ICLR), 2024
Yinbin Han, Meisam Razaviyayn, Renyuan Xu
28 Jan 2024

Adaptive Parameter Selection for Kernel Ridge Regression
Shao-Bo Lin
10 Dec 2023

Adaptive Distributed Kernel Ridge Regression: A Feasible Distributed Learning Scheme for Data Silos
Di Wang, Xiaotong Liu, Shao-Bo Lin, Ding-Xuan Zhou
08 Sep 2023

On the Optimality of Misspecified Kernel Ridge Regression
International Conference on Machine Learning (ICML), 2023
Haobo Zhang, Yicheng Li, Weihao Lu, Qian Lin
12 May 2023

On the Optimality of Misspecified Spectral Algorithms
Journal of Machine Learning Research (JMLR), 2023
Hao Zhang, Yicheng Li, Qian Lin
27 Mar 2023

Learning Lipschitz Functions by GD-trained Shallow Overparameterized ReLU Neural Networks
Ilja Kuzborskij, Csaba Szepesvári
28 Dec 2022

A note on the prediction error of principal component regression in high dimensions
Theory of Probability and Mathematical Statistics (TPMS), 2022
L. Hucker, Martin Wahl
09 Dec 2022

Early stopping for $L^2$-boosting in high-dimensional linear models
Annals of Statistics (Ann. Stat.), 2022
Bernhard Stankewitz
14 Oct 2022

From inexact optimization to learning via gradient concentration
Computational Optimization and Applications (COA), 2021
Bernhard Stankewitz, Nicole Mücke, Lorenzo Rosasco
09 Jun 2021

Minimum discrepancy principle strategy for choosing $k$ in $k$-NN regression
Yaroslav Averyanov, Alain Celisse
20 Aug 2020