Model-free Bootstrap and Conformal Prediction in Regression: Conditionality, Conjecture Testing, and Pertinent Prediction Intervals

Predictive inference under a general regression setting has been gaining increasing interest in the big-data era. In moving beyond point prediction toward the construction of prediction intervals, two main threads of development are conformal prediction and Model-free prediction. Recently, \cite{distributionalconformal} proposed a new conformal prediction approach that exploits the same uniformization procedure as the Model-free Bootstrap of \cite{Politis2015}; it is therefore of interest to compare and further investigate the performance of the two methods. In the paper at hand, we contrast the two approaches via theoretical analysis and numerical experiments, with a focus on the conditional coverage of prediction intervals. We discuss suitable scenarios for applying each algorithm, underscore the importance of conditional vs.~unconditional coverage, and show that, under mild conditions, the Model-free Bootstrap yields prediction intervals with guaranteed better conditional coverage than intervals based on quantile estimation. We also extend the concept of `pertinence' of prediction intervals from \cite{Politis2015} to the nonparametric regression setting, and give concrete examples where its importance emerges in finite-sample scenarios. Finally, we introduce the new notion of `conjecture testing', the analog of hypothesis testing as applied to the prediction problem; we also devise a modified conformal score that allows conformal prediction to handle one-sided `conjecture tests', and compare it with the Model-free Bootstrap.
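To illustrate the uniformization idea mentioned above, the following is a minimal sketch (not the authors' implementation) of a distributional conformal prediction interval in regression, where the conformal score is based on the probability integral transform of the response through an estimated conditional CDF. For simplicity, the sketch assumes homoscedastic additive noise, estimates the regression function with a k-nearest-neighbour average, and uses the empirical CDF of training residuals; the simulated data, the split sizes, and all function names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_mean(x_train, y_train, x, k=20):
    """Simple k-nearest-neighbour estimate of the regression function m(x)."""
    idx = np.argsort(np.abs(x_train - x))[:k]
    return y_train[idx].mean()

def conditional_cdf(y_grid, x, x_train, y_train, resid_sorted):
    """Estimated F(y|x): empirical CDF of residuals shifted by m_hat(x)."""
    m = knn_mean(x_train, y_train, x)
    return np.searchsorted(resid_sorted, y_grid - m, side="right") / len(resid_sorted)

# simulated data (illustrative assumption)
n = 500
x = rng.uniform(-2, 2, n)
y = np.sin(2 * x) + 0.3 * rng.standard_normal(n)

# split: first half to fit m_hat and the residual CDF, second half for calibration
train, cal = np.arange(n // 2), np.arange(n // 2, n)
m_fit = np.array([knn_mean(x[train], y[train], xi) for xi in x[train]])
resid_sorted = np.sort(y[train] - m_fit)

# distributional conformal score: distance of the PIT value from 1/2
u_cal = np.array([
    conditional_cdf(np.array([yi]), xi, x[train], y[train], resid_sorted)[0]
    for xi, yi in zip(x[cal], y[cal])
])
scores = np.abs(u_cal - 0.5)

alpha = 0.10
q = np.quantile(scores, np.ceil((1 - alpha) * (len(cal) + 1)) / len(cal))

# prediction interval at a new point: all y whose PIT value is within q of 1/2
x_new = 0.5
y_grid = np.linspace(y.min() - 1.0, y.max() + 1.0, 2000)
u_new = conditional_cdf(y_grid, x_new, x[train], y[train], resid_sorted)
inside = y_grid[np.abs(u_new - 0.5) <= q]
print("approx. 90% prediction interval at x = 0.5:", inside.min(), inside.max())
```

A one-sided conjecture test, in the sense discussed in the abstract, could be approximated in this sketch by thresholding the PIT value on one side only (e.g., u_new >= alpha) instead of symmetrically around 1/2, though the paper's own modified conformal score may differ.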