  3. 1702.01402
Estimation bounds and sharp oracle inequalities of regularized procedures with Lipschitz loss functions

5 February 2017
Pierre Alquier
V. Cottet
Guillaume Lecué
Abstract

We obtain estimation error rates and sharp oracle inequalities for regularization procedures of the form \begin{equation*} \hat f \in \operatorname{argmin}_{f\in F}\left(\frac{1}{N}\sum_{i=1}^N\ell(f(X_i), Y_i)+\lambda \|f\|\right) \end{equation*} where $\|\cdot\|$ is any norm, $F$ is a convex class of functions, and $\ell$ is a Lipschitz loss function satisfying a Bernstein condition over $F$. We explore both the bounded and subgaussian stochastic frameworks for the distribution of the $f(X_i)$'s, with no assumption on the distribution of the $Y_i$'s. The general results rely on two main objects: a complexity function and a sparsity equation, which depend on the specific setting at hand (the loss $\ell$ and the norm $\|\cdot\|$). As a proof of concept, we obtain minimax rates of convergence for the following problems: 1) matrix completion with any Lipschitz loss function, including the hinge and logistic losses for the so-called 1-bit matrix completion instance of the problem, and quantile losses for the general case, which makes it possible to estimate any quantile of the entries of the matrix; 2) the logistic LASSO and variants such as the logistic SLOPE; 3) kernel methods, where the loss is the hinge loss and the regularization function is the RKHS norm.
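To make the general form concrete, here is a minimal NumPy sketch of one instance named in the abstract, the logistic LASSO: the empirical logistic risk plus an $\ell_1$ penalty, minimized by proximal gradient descent (ISTA). The solver choice, function names, and step-size rule are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def soft_threshold(v, t):
    """Proximal operator of t * ||.||_1 (coordinatewise soft-thresholding)."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def logistic_lasso(X, y, lam, n_iter=500):
    """ISTA sketch of the logistic LASSO instance of the estimator:
        w_hat in argmin_w (1/N) sum_i log(1 + exp(-y_i <x_i, w>)) + lam * ||w||_1
    X: (N, d) design matrix; y: labels in {-1, +1}.
    """
    N, d = X.shape
    # Step size 1/L, where L = ||X||_2^2 / (4N) bounds the Lipschitz
    # constant of the gradient of the empirical logistic risk.
    step = 4.0 * N / (np.linalg.norm(X, 2) ** 2)
    w = np.zeros(d)
    for _ in range(n_iter):
        margins = y * (X @ w)
        # Gradient of the empirical logistic risk at w.
        grad = -(X.T @ (y / (1.0 + np.exp(margins)))) / N
        # Gradient step on the smooth part, prox step on the l1 penalty.
        w = soft_threshold(w - step * grad, step * lam)
    return w

if __name__ == "__main__":
    # Toy usage: sparse ground truth, noisy sign observations.
    rng = np.random.default_rng(0)
    N, d, s = 200, 50, 5
    X = rng.standard_normal((N, d))
    w_true = np.zeros(d)
    w_true[:s] = 1.0
    y = np.sign(X @ w_true + 0.1 * rng.standard_normal(N))
    w_hat = logistic_lasso(X, y, lam=0.05)
    print("estimated support:", np.nonzero(np.abs(w_hat) > 1e-3)[0])
```

The regularization level lam above is an arbitrary illustrative value; in the paper's framework the appropriate $\lambda$ is dictated by the complexity function of the class, which drives the stated oracle inequalities and minimax rates.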

View on arXiv