Risk Bounds for Quantile Additive Trend Filtering

International Conference on Artificial Intelligence and Statistics (AISTATS), 2023
Abstract

This paper investigates risk bounds for quantile additive trend filtering, a method of growing importance at the intersection of additive trend filtering and quantile regression. We study the constrained version of quantile trend filtering within additive models, considering both fixed and growing input dimension. In the fixed-dimension case, we obtain an error rate whose main term, $n^{-2r/(2r+1)}V^{2/(2r+1)}$, mirrors the non-quantile minimax rate for additive trend filtering when the underlying quantile function is additive, with components whose $(r-1)$th derivatives have total variation bounded by $V$. When the input dimension $d$ grows, quantile additive trend filtering incurs a polynomial factor of $d^{(2r+2)/(2r+1)}$; this aligns with the linear factor $d$ in the non-quantile variant, particularly for larger values of $r$. Additionally, we propose a practical algorithm for quantile trend filtering in additive models based on dimension-wise backfitting. We conduct experiments with design points that are either evenly spaced or sampled from the uniform distribution on $[0,1]$, using distinct component functions and noise drawn from normal and heavy-tailed distributions. Our findings confirm the estimator's convergence as $n$ increases and its advantage, particularly in heavy-tailed settings. These results deepen our understanding of additive trend filtering models in quantile settings, offering valuable insights for practical applications and future research.
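The dimension-wise backfitting idea mentioned above can be illustrated with a minimal sketch. The code below is not the paper's implementation: it solves a *penalized* surrogate (pinball loss plus an $\ell_1$ penalty on $r$-th discrete differences, rather than the constrained form with a variation budget $V$) by plain subgradient descent, cycling over coordinates and refitting each component on the partial residual. All function names, the step-size schedule, and the penalty weight `lam` are illustrative assumptions.

```python
import numpy as np

def pinball_loss(res, tau):
    # Mean quantile (pinball) loss of a residual vector.
    return np.mean(np.maximum(tau * res, (tau - 1) * res))

def diff_matrix(n, r):
    # r-th order discrete difference operator D^(r), shape (n - r, n).
    D = np.eye(n)
    for _ in range(r):
        D = np.diff(D, axis=0)
    return D

def quantile_tf_component(y, tau, r=2, lam=0.1, steps=2000, lr=0.1, theta0=None):
    # Subgradient descent on  sum_i pinball(y_i - theta_i) + lam * ||D^(r) theta||_1.
    # Tracks the best iterate visited, a standard safeguard for subgradient methods.
    n = len(y)
    D = diff_matrix(n, r)
    theta = np.zeros(n) if theta0 is None else theta0.copy()

    def obj(t):
        res = y - t
        return np.sum(np.maximum(tau * res, (tau - 1) * res)) + lam * np.abs(D @ t).sum()

    best, best_val = theta.copy(), obj(theta)
    for t in range(steps):
        res = y - theta
        # Subgradient of the pinball loss plus the l1 trend-filtering penalty.
        g = -np.where(res >= 0, tau, tau - 1) + lam * (D.T @ np.sign(D @ theta))
        theta = theta - lr / np.sqrt(t + 1) * g
        val = obj(theta)
        if val < best_val:
            best, best_val = theta.copy(), val
    return best

def backfit_quantile_atf(X, y, tau=0.5, r=2, lam=0.1, cycles=3):
    # Dimension-wise backfitting: cycle over coordinates, refitting each
    # additive component on the partial residual with the others held fixed.
    n, d = X.shape
    fits = np.zeros((n, d))
    mu = np.quantile(y, tau)            # intercept: empirical tau-quantile
    order = np.argsort(X, axis=0)       # sort observations along each coordinate
    for _ in range(cycles):
        for j in range(d):
            partial = y - mu - fits.sum(axis=1) + fits[:, j]
            idx = order[:, j]
            comp = quantile_tf_component(partial[idx], tau, r=r, lam=lam,
                                         theta0=fits[idx, j])
            fits[idx, j] = comp - comp.mean()   # center for identifiability
        mu = np.quantile(y - fits.sum(axis=1), tau)
    return mu, fits

# Usage: additive signal with two components, estimated at the median.
rng = np.random.default_rng(0)
n, d = 60, 2
X = rng.uniform(0.0, 1.0, size=(n, d))
y = np.sin(2 * np.pi * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.standard_normal(n)
mu, fits = backfit_quantile_atf(X, y, tau=0.5)
pred = mu + fits.sum(axis=1)
```

Fitting each one-dimensional subproblem on the coordinate-sorted partial residuals is what makes the univariate trend-filtering penalty meaningful; a dedicated solver (e.g. an ADMM or linear-programming formulation) would replace the subgradient loop in a serious implementation.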
