Conditional inferential models: combining information for prior-free probabilistic inference

Abstract

The inferential model (IM) framework provides valid prior-free probabilistic inference by focusing on the prediction of unobserved auxiliary variables. But efficient IM-based inference can be challenging when the auxiliary variable is high-dimensional. Here we show that features of the auxiliary variable are often fully observed and that, in such cases, simultaneous dimension reduction and information aggregation can be achieved by conditioning. The proposed conditioning strategy leads to efficient IM inference and sheds new light on Fisher's notions of sufficiency and conditional inference, as well as on Bayesian inference. A differential equation-driven selection of a conditional association is developed, and we prove a conditional IM validity theorem under suitable conditions. Some problems, however, may not admit a valid conditional IM of the standard form. For such cases, we propose a more flexible class of conditional IMs based on localization. The take-away message is that the conditional IM framework developed herein provides valid and efficient prior-free probabilistic inference in a variety of challenging problems.
