
Prediction bounds for (higher order) total variation regularized least squares

24 April 2019
Francesco Ortelli
Sara van de Geer
Abstract

We establish oracle inequalities for the least squares estimator $\hat f$ with penalty on the total variation of $\hat f$ or on its higher order differences. Our main tool is an interpolating vector that leads to upper bounds for the effective sparsity. This allows one to show that the penalty on the $k^{\text{th}}$ order differences leads to an estimator $\hat f$ that can adapt to the number of jumps in the $(k-1)^{\text{th}}$ order differences. We present the details for $k = 2,\ 3$ and expose a framework for deriving the result for general $k \in \mathbb{N}$.
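For orientation, a minimal sketch of the estimator the abstract refers to, written in the usual trend-filtering form; the difference operator $D^{(k)}$, the tuning parameter $\lambda$, and the $1/n$ normalization are notational assumptions for illustration and may differ from the paper's conventions:

% Sketch (notation assumed, not taken from the abstract):
% k-th order total variation regularized least squares
\[
  \hat f \;=\; \operatorname*{arg\,min}_{f \in \mathbb{R}^n}
  \left\{ \frac{1}{n}\,\lVert y - f \rVert_2^2 \;+\; \lambda\, \lVert D^{(k)} f \rVert_1 \right\},
\]
where $D^{(k)} \in \mathbb{R}^{(n-k)\times n}$ denotes the $k^{\text{th}}$ order difference operator, so that for $k=1$ the penalty $\lVert D^{(1)} f \rVert_1 = \sum_{i=2}^{n} \lvert f_i - f_{i-1} \rvert$ is the discrete total variation of $f$, and higher $k$ penalizes jumps in higher order differences.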
