Sharp oracle inequalities for Least Squares estimators in shape restricted regression

27 October 2015
Pierre C. Bellec
arXiv:1510.08029 (abs / PDF / HTML)
Abstract

The performance of Least Squares estimators is studied in shape restricted regression for convex cones that include nondecreasing sequences, convex sequences and higher order cones. We derive sharp oracle inequalities for the Least Squares estimator, i.e., oracle inequalities with leading constant 1. Two types of oracle inequalities are derived. The inequalities of the first type are adaptive in the sense that the rate becomes parametric if the true sequence can be well approximated by a sequence that satisfies some low-dimensional property. The inequalities of the second type yield a rate that corresponds to the nonparametric rate of smoothness classes under a localized Gaussian width assumption. The oracle inequalities hold in deviation with exponential probability bounds and in expectation. To obtain our results, we improve the best known bounds on the statistical dimension of the cone of convex sequences, and we derive upper bounds on the statistical dimension of higher order cones. Then we construct an estimator that aggregates two projections on opposite convex cones. In isotonic regression, the estimator adapts to the best direction of monotonicity. In convex regression, the estimator mimics the best behavior among concavity and convexity. Our estimators are fully data-driven and computationally tractable.
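To make the isotonic-regression case concrete, the sketch below illustrates the idea of adapting to the best direction of monotonicity. It is not the paper's aggregation procedure (which is constructed so that the resulting estimator satisfies sharp oracle inequalities, i.e. bounds of the schematic form ||mu_hat - mu||^2 <= min over the cone of ||u - mu||^2 plus a remainder, with leading constant 1). Instead, as a simplification, it projects the data onto the nondecreasing and nonincreasing cones using scikit-learn's IsotonicRegression and keeps the projection with the smaller residual sum of squares; the function name and the toy data are illustrative only.

```python
# Illustrative sketch, not the estimator from the paper: project onto the
# nondecreasing and nonincreasing cones (isotonic regression in both
# directions) and keep the projection with the smaller residual sum of
# squares. The paper aggregates the two projections rather than selecting.
import numpy as np
from sklearn.isotonic import IsotonicRegression


def best_direction_isotonic(y):
    """Return the isotonic fit (either direction) with the smaller RSS."""
    x = np.arange(len(y))
    fits = {}
    for increasing in (True, False):
        iso = IsotonicRegression(increasing=increasing)
        fits[increasing] = iso.fit_transform(x, y)
    # Residual sum of squares of each projection.
    rss = {k: float(np.sum((y - f) ** 2)) for k, f in fits.items()}
    best = min(rss, key=rss.get)
    return fits[best], ("nondecreasing" if best else "nonincreasing")


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    mu = np.linspace(3.0, 0.0, 100)            # truly nonincreasing signal
    y = mu + rng.normal(scale=0.5, size=100)   # noisy observations
    fit, direction = best_direction_isotonic(y)
    print(direction, float(np.sum((fit - mu) ** 2)))
```

Both projections are computable in near-linear time via the pool adjacent violators algorithm, which is why such direction-adaptive estimators remain computationally tractable.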
