arXiv:1608.07681
Regularization and the small-ball method II: complexity dependent error rates
Journal of Machine Learning Research (JMLR), 2016
27 August 2016
Guillaume Lecué
S. Mendelson
Papers citing "Regularization and the small-ball method II: complexity dependent error rates" (21 of 21 papers shown)
Complexity Dependent Error Rates for Physics-informed Statistical Learning via the Small-ball Method
Diego Marcondes
27 Oct 2025
Optimal kernel regression bounds under energy-bounded noise
Amon Lahr
Johannes Köhler
Anna Scampicchio
Melanie Zeilinger
28 May 2025
Early-Stopped Mirror Descent for Linear Regression over Convex Bodies
Tobias Wegel
Gil Kur
Patrick Rebeschini
05 Mar 2025
On the Importance of Gradient Norm in PAC-Bayesian Bounds
Neural Information Processing Systems (NeurIPS), 2022
Itai Gat
Yossi Adi
Alex Schwing
Tamir Hazan
12 Oct 2022
Measuring the Effect of Training Data on Deep Learning Predictions via Randomized Experiments
International Conference on Machine Learning (ICML), 2022
Jinkun Lin
Anqi Zhang
Mathias Lécuyer
Jinyang Li
Aurojit Panda
S. Sen
20 Jun 2022
Fast Rates for Noisy Interpolation Require Rethinking the Effects of Inductive Bias
International Conference on Machine Learning (ICML), 2022
Konstantin Donhauser
Nicolò Ruggeri
Stefan Stojanovic
Fanny Yang
07 Mar 2022
On the robustness of minimum norm interpolators and regularized empirical risk minimizers
Annals of Statistics (Ann. Stat.), 2020
Geoffrey Chinot
Matthias Löffler
Sara van de Geer
01 Dec 2020
Minimax Estimation of Conditional Moment Models
Neural Information Processing Systems (NeurIPS), 2020
Nishanth Dikkala
Greg Lewis
Lester W. Mackey
Vasilis Syrgkanis
12 Jun 2020
Generic Error Bounds for the Generalized Lasso with Sub-Exponential Data
Sampling Theory, Signal Processing, and Data Analysis (TSPDA), 2020
Martin Genzel
Christian Kipp
11 Apr 2020
Sum-of-squares meets square loss: Fast rates for agnostic tensor completion
Conference on Learning Theory (COLT), 2019
Dylan J. Foster
Andrej Risteski
30 May 2019
Robust high dimensional learning for Lipschitz and convex losses
Geoffrey Chinot
Guillaume Lecué
M. Lerasle
10 May 2019
Robust learning and complexity dependent bounds for regularized problems
Geoffrey Chinot
06 Feb 2019
Learning with Non-Convex Truncated Losses by SGD
Yi Tian Xu
Shenghuo Zhu
Sen Yang
Chi Zhang
Rong Jin
Tianbao Yang
21 May 2018
Structured Recovery with Heavy-tailed Measurements: A Thresholding Procedure and Optimal Rates
Xiaohan Wei
16 Apr 2018
Robust 1-Bit Compressed Sensing via Hinge Loss Minimization
Martin Genzel
A. Stollenwerk
13 Apr 2018
Robust machine learning by median-of-means: theory and practice
Guillaume Lecué
M. Lerasle
28 Nov 2017
Solving Equations of Random Convex Functions via Anchored Regression
Foundations of Computational Mathematics (FoCM), 2017
S. Bahmani
Justin Romberg
17 Feb 2017
Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions
Annals of Statistics (Ann. Stat.), 2017
Pierre Alquier
V. Cottet
Guillaume Lecué
05 Feb 2017
Learning from MOM's principles: Le Cam's approach
Stochastic Processes and their Applications (SPA), 2017
Guillaume Lecué
Matthieu Lerasle
08 Jan 2017
A Convex Program for Mixed Linear Regression with a Recovery Guarantee for Well-Separated Data
Neural Information Processing Systems (NeurIPS), 2016
Paul Hand
Babhru Joshi
19 Dec 2016
On optimality of empirical risk minimization in linear aggregation
Adrien Saumard
11 May 2016