
Box-constrained monotone $L_\infty$-approximations to Lipschitz regularizations, with applications to robust testing

Abstract

Tests of fit to exact models in statistical analysis often lead to rejections even when the model is a useful approximate description of the random generator of the data. Among possible relaxations of a fixed model, the one defined by contamination neighbourhoods, namely, $\mathcal{V}_\alpha(P_0)=\{(1-\alpha)P_0+\alpha Q: Q \in \mathcal{P}\}$, where $\mathcal{P}$ is the set of all probabilities on the sample space, has received much attention, owing to its central role in Robust Statistics. For probabilities on the real line, consistent tests of fit to $\mathcal{V}_\alpha(P_0)$ can be based on $d_K(P_0,R_\alpha(P))$, the minimal Kolmogorov distance between $P_0$ and the set of trimmings of $P$, $R_\alpha(P)=\big\{\tilde P\in\mathcal{P}:\tilde P\ll P,\ \frac{d\tilde P}{dP}\leq\frac{1}{1-\alpha}\ P\text{-a.s.}\big\}$. We show that this functional admits equivalent formulations in terms of either best approximation in uniform norm by $L$-Lipschitz functions satisfying a box constraint, or best monotone approximation in uniform norm to the $L$-Lipschitz regularization, which is expressible as the average of the Pasch-Hausdorff envelopes. This representation of the solution of the variational problem allows us to obtain results showing stability of the functional $d_K(P_0,R_\alpha(P))$, as well as its directional differentiability, providing the basis for a Central Limit Theorem for that functional.
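The two building blocks named above, the $L$-Lipschitz regularization obtained by averaging the Pasch-Hausdorff envelopes and the best monotone approximation in uniform norm, can be illustrated numerically. The following is a minimal grid-based sketch, not the paper's procedure: the grid, the target function, the constant $L=1/(1-\alpha)$, and all names are illustrative assumptions.

```python
# Hypothetical illustration (not from the paper): Lipschitz regularization of a
# function on a grid via the average of its Pasch-Hausdorff envelopes, followed
# by its best nondecreasing approximation in uniform norm.
import numpy as np

def lipschitz_regularization(x, f, L):
    """Best uniform L-Lipschitz approximation of f on the grid x, as the average
    of the Pasch-Hausdorff envelopes
    inf_s {f(s) + L|t - s|} (lower) and sup_s {f(s) - L|t - s|} (upper)."""
    d = np.abs(x[:, None] - x[None, :])          # pairwise distances |t - s|
    lower = np.min(f[None, :] + L * d, axis=1)   # greatest L-Lipschitz minorant
    upper = np.max(f[None, :] - L * d, axis=1)   # least L-Lipschitz majorant
    return 0.5 * (lower + upper)

def best_monotone_approximation(g):
    """Best uniform nondecreasing approximation of g on an ordered grid, as the
    average of the running maximum and the reverse running minimum."""
    run_max = np.maximum.accumulate(g)              # sup over s <= t
    run_min = np.minimum.accumulate(g[::-1])[::-1]  # inf over s >= t
    return 0.5 * (run_max + run_min)

# Toy usage with assumed inputs: alpha = 0.1 gives L = 1/(1 - alpha), and the
# target f is an arbitrary non-monotone function chosen only for illustration.
alpha = 0.1
L = 1.0 / (1.0 - alpha)
x = np.linspace(0.0, 1.0, 501)
f = np.sin(6 * x) + 2 * x
g = lipschitz_regularization(x, f, L)    # L-Lipschitz regularization of f
h = best_monotone_approximation(g)       # best monotone approximation to g
print("uniform gap between f and the monotone Lipschitz approximant:",
      np.max(np.abs(h - f)))
```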
