On the acceleration of some empirical means with application to
nonparametric regression
Let $(X_1,\ldots,X_n)$ be an i.i.d. sequence of random variables in $\mathbb{R}^d$, $d \geq 1$. We show that, for some function $\varphi : \mathbb{R}^d \rightarrow \mathbb{R}$, under regularity conditions, \begin{align*} n^{1/2} \left(n^{-1} \sum_{i=1}^n \frac{\varphi(X_i)}{\widehat f^{(i)}(X_i)}-\int \varphi(x)\,dx \right) \overset{\mathbb{P}}{\longrightarrow} 0, \end{align*} where $\widehat f^{(i)}$ is the classical leave-one-out kernel estimator of the density of $X_1$. This result is striking because it speeds up the traditional root-$n$ rate derived from the central limit theorem when $\widehat f^{(i)}$ is replaced by the true density. As a consequence, it improves on the classical Monte Carlo procedure for integral approximation. The paper mainly addresses theoretical issues related to the latter result (rates of convergence, bandwidth choice, regularity of $\varphi$), but also presents some statistical applications dealing with random-design regression. In particular, we establish the asymptotic normality of estimators of linear functionals of a regression function under the sole requirement of H\"older regularity. This leads us to a new version of the \textit{average derivative estimator} introduced by H\"ardle and Stoker in \cite{hardle1989}, which allows for \textit{dimension reduction} by estimating the \textit{index space} of a regression model.
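The estimator in the display above can be sketched numerically. The following is a minimal illustration, not the paper's implementation: it draws a univariate sample, forms the leave-one-out Gaussian-kernel density estimates $\widehat f^{(i)}(X_i)$, and averages $\varphi(X_i)/\widehat f^{(i)}(X_i)$ to approximate $\int \varphi(x)\,dx$. The integrand `phi`, the sampling density, the bandwidth choice, and all function names are illustrative assumptions.

```python
import numpy as np

def loo_kde(x, h):
    """Leave-one-out Gaussian kernel density estimates at the sample points.

    Returns the vector (f_hat^{(i)}(X_i))_i, where observation i is
    excluded from its own density estimate.
    """
    n = len(x)
    # Pairwise kernel evaluations K((X_i - X_j)/h) / sqrt(2*pi).
    d = (x[:, None] - x[None, :]) / h
    k = np.exp(-0.5 * d**2) / np.sqrt(2.0 * np.pi)
    np.fill_diagonal(k, 0.0)  # leave observation i out of its own estimate
    return k.sum(axis=1) / ((n - 1) * h)

def phi(t):
    # Illustrative integrand; its true integral over R is sqrt(pi).
    return np.exp(-t**2)

rng = np.random.default_rng(0)
n = 2000
x = rng.normal(size=n)  # X_i drawn i.i.d. from an (assumed) N(0, 1) design

# Kernel-smoothed estimator of the integral: mean of phi(X_i) / f_hat^{(i)}(X_i).
# Bandwidth n**(-1/5) is only a rule-of-thumb order, not the paper's choice.
fhat = loo_kde(x, h=n ** (-1 / 5))
est_kde = np.mean(phi(x) / fhat)

# Classical Monte Carlo benchmark using the true density f (known here only
# because the sampling law was chosen by us).
f = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)
est_mc = np.mean(phi(x) / f)

print(est_kde, est_mc, np.sqrt(np.pi))
```

Both estimators target $\sqrt{\pi} \approx 1.7725$; the point of the paper is that, under its regularity conditions, the kernel-smoothed version converges faster than the root-$n$ rate of the classical Monte Carlo benchmark.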