Smoothed SGD for quantiles: Bahadur representation and Gaussian approximation

Comments: 9 pages (main) + 11 pages (appendix) + 2 pages (bibliography), 6 figures, 3 tables
Abstract

This paper considers the estimation of quantiles via a smoothed version of the stochastic gradient descent (SGD) algorithm. By smoothing the score function in the conventional SGD quantile algorithm, we achieve monotonicity in the quantile level, in the sense that the estimated quantile curves do not cross. We derive non-asymptotic tail probability bounds for the smoothed SGD quantile estimate, both with and without Polyak-Ruppert averaging. For the latter, we also provide a uniform Bahadur representation and a resulting Gaussian approximation result. Numerical studies demonstrate good finite-sample agreement with our theoretical results.

@article{chen2025_2505.13299,
  title={Smoothed SGD for quantiles: Bahadur representation and Gaussian approximation},
  author={Likai Chen and Georg Keilbar and Wei Biao Wu},
  journal={arXiv preprint arXiv:2505.13299},
  year={2025}
}