
When is it worthwhile to jackknife? Breaking the quadratic barrier for Z-estimators

Main: 27 pages, 7 figures; Bibliography: 3 pages; Appendix: 40 pages
Abstract

Resampling methods are especially well-suited to inference with estimators that provide only "black-box" access. The jackknife is a form of resampling, widely used for bias correction and variance estimation, that is well understood under classical scaling where the sample size $n$ grows for a fixed problem. We study its behavior when applied to estimating functionals using high-dimensional $Z$-estimators, allowing both the sample size $n$ and the problem dimension $d$ to diverge. We begin by showing that the plug-in estimator based on the $Z$-estimate suffers from a quadratic breakdown: while it is $\sqrt{n}$-consistent and asymptotically normal whenever $n \gtrsim d^2$, it fails for a broad class of problems whenever $n \lesssim d^2$. We then show that, under suitable regularity conditions, applying a jackknife correction yields an estimate that is $\sqrt{n}$-consistent and asymptotically normal whenever $n \gtrsim d^{3/2}$. This provides strong motivation for using the jackknife in high-dimensional problems where the dimension is moderate relative to the sample size. We illustrate consequences of our general theory for various specific $Z$-estimators, including non-linear functionals in linear models; generalized linear models; and the inverse propensity score weighting (IPW) estimate of the average treatment effect, among others.
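To make the procedure concrete, here is a minimal sketch (not the paper's code) of the classical jackknife bias correction applied to a plug-in functional of a $Z$-estimate, using the example of a non-linear functional $f(\beta) = \|\beta\|^2$ of an OLS coefficient vector. The function names (`plug_in`, `jackknife_correct`) and the simulated regime are illustrative assumptions, not from the paper.

```python
import numpy as np

def plug_in(X, y):
    """Plug-in estimate f(beta_hat), where beta_hat is the OLS Z-estimate."""
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.dot(beta_hat, beta_hat)  # functional f(b) = ||b||^2

def jackknife_correct(X, y):
    """Jackknife bias-corrected estimate plus the jackknife variance estimate."""
    n = X.shape[0]
    full = plug_in(X, y)
    # Leave-one-out recomputations of the plug-in estimate
    loo = np.array([plug_in(np.delete(X, i, axis=0), np.delete(y, i))
                    for i in range(n)])
    corrected = n * full - (n - 1) * loo.mean()            # bias correction
    var_hat = (n - 1) / n * np.sum((loo - loo.mean())**2)  # variance estimate
    return corrected, var_hat

# Simulated example with n between d^{3/2} and d^2, the regime where the
# paper's theory says the jackknife helps but the plain plug-in can fail
rng = np.random.default_rng(0)
n, d = 600, 40  # d^{3/2} ~ 253 <= n <= d^2 = 1600
beta = rng.normal(size=d) / np.sqrt(d)
X = rng.normal(size=(n, d))
y = X @ beta + rng.normal(size=n)
corrected, var_hat = jackknife_correct(X, y)
print(f"plug-in: {plug_in(X, y):.4f}, jackknifed: {corrected:.4f}, "
      f"truth: {np.dot(beta, beta):.4f}")
```

Each leave-one-out fit requires re-solving the $Z$-estimating equation, so the naive cost is $n$ times that of the original fit; the statistical payoff, per the abstract, is $\sqrt{n}$-consistency down to $n \gtrsim d^{3/2}$ rather than $n \gtrsim d^2$.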
