ResearchTrend.AI
© 2025 ResearchTrend.AI, All rights reserved.

Convergence rates of Kernel Conjugate Gradient for random design regression
arXiv:1607.02387, 8 July 2016
Gilles Blanchard, Nicole Krämer

Papers citing "Convergence rates of Kernel Conjugate Gradient for random design regression"

15 / 15 papers shown
  1. Comparing regularisation paths of (conjugate) gradient estimators in ridge regression (Laura Hucker, Markus Reiß, Thomas Stark; 07 Mar 2025)
  2. Convergence Analysis of Kernel Conjugate Gradient for Functional Linear Regression (Naveen Gupta, S. Sivananthan, Bharath K. Sriperumbudur; 04 Oct 2023)
  3. An extended latent factor framework for ill-posed linear regression (G. Finocchio, Tatyana Krivobokova; 17 Jul 2023)
  4. Random Smoothing Regularization in Kernel Gradient Descent Learning (Liang Ding, Tianyang Hu, Jiahan Jiang, Donghao Li, Wei Cao, Yuan Yao; 05 May 2023)
  5. Kernel-Based Distributed Q-Learning: A Scalable Reinforcement Learning Approach for Dynamic Treatment Regimes (Di Wang, Yao Wang, Shaojie Tang; 21 Feb 2023)
  6. From inexact optimization to learning via gradient concentration (Bernhard Stankewitz, Nicole Mücke, Lorenzo Rosasco; 09 Jun 2021)
  7. Analyzing the discrepancy principle for kernelized spectral filter learning algorithms (Alain Celisse, Martin Wahl; 17 Apr 2020)
  8. Distributed Learning with Dependent Samples (Zirui Sun, Shao-Bo Lin; 10 Feb 2020)
  9. The Statistical Complexity of Early-Stopped Mirror Descent (Tomas Vaskevicius, Varun Kanade, Patrick Rebeschini; 01 Feb 2020)
  10. Adaptive Stopping Rule for Kernel-based Gradient Descent Algorithms (Xiangyu Chang, Shao-Bo Lin; 09 Jan 2020)
  11. Kernel Conjugate Gradient Methods with Random Projections (Bailey Kacsmar, Douglas R. Stinson; 05 Nov 2018)
  12. Analysis of regularized Nyström subsampling for regression functions of low smoothness (Shuai Lu, Peter Mathé, S. Pereverzyev; 03 Jun 2018)
  13. Statistical Optimality of Stochastic Gradient Descent on Hard Learning Problems through Multiple Passes (Loucas Pillaud-Vivien, Alessandro Rudi, Francis R. Bach; 25 May 2018)
  14. Accelerate Stochastic Subgradient Method by Leveraging Local Growth Condition (Yi Tian Xu, Qihang Lin, Tianbao Yang; 04 Jul 2016)
  15. Optimal Rates For Regularization Of Statistical Inverse Learning Problems (Gilles Blanchard, Nicole Mücke; 14 Apr 2016)