Risk Bounds for Quantile Trend Filtering

We study quantile trend filtering, a recently proposed method for nonparametric quantile regression, with the goal of generalizing the risk bounds known for the usual trend filtering estimators, which perform mean regression. We study both the penalized and the constrained versions (of order r) of univariate quantile trend filtering. Our results show that both versions attain the minimax rate up to log factors when the rth discrete derivative of the true vector of quantiles belongs to the class of bounded variation signals. Moreover, we show that if the true vector of quantiles is a discrete spline with a few polynomial pieces, then both versions attain a near parametric rate of convergence. Corresponding results for the usual trend filtering estimators are known to hold only when the errors are sub-Gaussian. In contrast, our risk bounds hold under minimal assumptions on the error variables: in particular, no moment assumptions are needed, and our results hold under heavy-tailed errors. Our proof techniques are general and can potentially be used to study other nonparametric quantile regression methods. To illustrate this generality, we also employ them to obtain new results for multivariate quantile total variation denoising and high dimensional quantile linear regression.
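For concreteness, the two estimators can be written in their standard form as follows. This is a sketch under common conventions rather than the paper's own notation: here rho_tau denotes the check (pinball) loss at quantile level tau, D^(r+1) the (r+1)th order discrete difference operator, and lambda, V the tuning parameters of the penalized and constrained versions respectively.

```latex
% Check (pinball) loss at quantile level \tau \in (0,1):
\rho_\tau(u) = u\,\bigl(\tau - \mathbf{1}\{u < 0\}\bigr)

% Penalized quantile trend filtering of order r, tuning parameter \lambda > 0:
\hat{\theta}_{\mathrm{pen}} \in \operatorname*{arg\,min}_{\theta \in \mathbb{R}^n}
  \sum_{i=1}^{n} \rho_\tau(y_i - \theta_i) \;+\; \lambda \,\bigl\| D^{(r+1)} \theta \bigr\|_1

% Constrained version, tuning parameter V > 0:
\hat{\theta}_{\mathrm{con}} \in \operatorname*{arg\,min}_{\theta \in \mathbb{R}^n:\;
  \| D^{(r+1)} \theta \|_1 \le V}
  \sum_{i=1}^{n} \rho_\tau(y_i - \theta_i)
```

The bounded variation class referenced in the abstract then corresponds to vectors theta with ||D^(r+1) theta||_1 bounded.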