Adaptive Bayesian nonparametric regression using mixtures of kernels

Bayesian Analysis (BA), 2015
Abstract

Interest in Bayesian nonparametric approaches to the sparse regression problem has recently emerged, following initial work by Abramovich et al. (2000) and Wolpert et al. (2011). The underlying probabilistic model is analogous to infinite-dimensional density mixture models, and parsimony is intrinsically induced by the choice of the mixing measure. In contrast with density estimation, the mixture components must be replaced by suitable kernel functions, chosen so that they span the function space of interest. We consider kernels arising from representations of topological groups (see Ali et al., 2000), with an emphasis on continuous wavelets and Gabor atoms. Modeling the regression function as a mixture of kernels also requires generalized mixing measures, taking real or complex values. We propose a simple construction based on completely random measures (Kingman, 1967), which allows a simple analysis of the prior distribution and yields posterior consistency results in the Gaussian regression setting. Owing to the similarities with density mixture models, efficient sampling schemes can be proposed, although they require some adaptation. For a particular class of generalized mixing measures we propose a Gibbs sampler based on an algorithm of Neal (2000). Notably, the algorithm samples from the full posterior distribution, permitting the estimation of credible bands. As an application, the algorithm is compared to classical wavelet thresholding methods in Gaussian regression on the collection of test functions from Marron et al. (1998).
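To make the model concrete, the sketch below simulates a finite truncation of the kind of kernel mixture described in the abstract: a regression function built from wavelet atoms weighted by a real-valued (signed) mixing measure, observed under Gaussian noise. All specifics here (Ricker wavelet atoms, Laplace-distributed signed weights, truncation level) are illustrative assumptions, not the paper's actual prior.

```python
import numpy as np

rng = np.random.default_rng(0)

def ricker(u):
    # Mexican-hat (Ricker) wavelet, a standard continuous wavelet atom,
    # standing in for the paper's group-representation kernels.
    return (1.0 - u**2) * np.exp(-u**2 / 2.0)

# Finite truncation of the mixture: f(x) = sum_j w_j * psi((x - mu_j) / s_j).
J = 20                                  # truncation level (assumed)
mu = rng.uniform(0.0, 1.0, size=J)      # atom locations
s = rng.uniform(0.05, 0.3, size=J)      # atom scales
w = rng.laplace(0.0, 0.1, size=J)       # signed weights: a real-valued mixing measure

def f(x):
    # Evaluate the kernel mixture at points x (vectorized over atoms).
    u = (x[:, None] - mu[None, :]) / s[None, :]
    return ricker(u) @ w

# Noisy observations in the Gaussian regression setting.
x = np.linspace(0.0, 1.0, 200)
y = f(x) + 0.1 * rng.standard_normal(x.size)
```

A Gibbs sampler in the spirit of Neal (2000) would then update the atoms (mu, s) and weights w given such data; that step is beyond this sketch.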