Regularization and the small-ball method II: complexity dependent error rates

27 August 2016
Guillaume Lecué, S. Mendelson

Papers citing "Regularization and the small-ball method II: complexity dependent error rates" (7 papers)

Robust high dimensional learning for Lipschitz and convex losses
Geoffrey Chinot, Guillaume Lecué, M. Lerasle
10 May 2019

Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions
Pierre Alquier, V. Cottet, Guillaume Lecué
05 Feb 2017

Learning from MOM's principles: Le Cam's approach
Guillaume Lecué, Matthieu Lerasle
08 Jan 2017

On optimality of empirical risk minimization in linear aggregation
Adrien Saumard
11 May 2016

SLOPE is Adaptive to Unknown Sparsity and Asymptotically Minimax
Weijie Su, Emmanuel Candes
29 Mar 2015

Learning without Concentration for General Loss Functions
S. Mendelson
13 Oct 2014

Learning without Concentration
S. Mendelson
01 Jan 2014