On the Effect of Bias Estimation on Coverage Accuracy in Nonparametric Inference

Abstract

Nonparametric methods play a central role in modern empirical work. While they provide inference procedures that are more robust to parametric misspecification bias, they may be quite sensitive to tuning parameter choices. We study the effects of bias correction on confidence interval coverage in the context of kernel density and local polynomial regression estimation, and prove that bias correction can be preferred to undersmoothing for minimizing coverage error. This result is achieved using a novel, yet simple, Studentization, which leads to a new way of constructing kernel-based bias-corrected confidence intervals. We also derive coverage error optimal bandwidths, and discuss easy-to-implement bandwidth selection procedures. In particular, we show that the MSE-optimal bandwidth delivers the fastest coverage error decay rate only at interior points when second-order (equivalent) kernels are employed, but is otherwise suboptimal both at interior and boundary points. All the results are established using valid Edgeworth expansions and illustrated with simulated data. Our findings have important consequences for empirical work as they indicate that bias-corrected confidence intervals, coupled with appropriate standard errors, have smaller coverage errors and therefore are less sensitive to tuning parameter choices. To illustrate the applicability of our results, we study inference in regression discontinuity (RD) designs, where we establish the same coverage error and robustness improvements for bias-corrected confidence intervals, and also give a simple rule-of-thumb bandwidth choice for their implementation based on correcting the MSE-optimal bandwidth. For example, for the popular local-linear RD estimator and a sample size of n = 500, shrinking the MSE-optimal bandwidth by 27% leads to bias-corrected confidence intervals with the fastest coverage error decay rate.
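As a quick check on the closing example, the arithmetic behind the 27% figure can be reconstructed if one assumes the rule-of-thumb correction takes the form of rescaling the MSE-optimal bandwidth by n^{-p/((3+p)(4+p))} for a local polynomial of order p; this exponent is our reading of the paper's rule of thumb and is not stated in the abstract itself:

\[
  h_{\mathrm{rbc}} \;=\; h_{\mathrm{mse}} \cdot n^{-\frac{p}{(3+p)(4+p)}}
  \;\overset{p=1}{=}\;
  h_{\mathrm{mse}} \cdot n^{-1/20},
  \qquad
  500^{-1/20} \;=\; e^{-\ln(500)/20} \;\approx\; e^{-0.311} \;\approx\; 0.733.
\]

That is, for the local-linear case (p = 1) at n = 500, the MSE-optimal bandwidth is multiplied by about 0.733, a shrinkage of roughly 27%, matching the figure quoted above.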
