
Logit stick-breaking priors for Bayesian density regression

Abstract

There is increasing focus in several fields on learning how the distribution of a response variable changes with a set of predictors. Bayesian nonparametric dependent mixture models provide a flexible approach to this goal, but many representations suffer from difficult interpretation and intractable computation. Motivated by these issues, we describe a flexible class of predictor-dependent infinite Gaussian mixture models, which relies on a formal characterization of the stick-breaking construction via a continuation-ratio logistic regression within an exponential family representation. We study the theoretical properties of this construction and leverage it to analytically derive three computational methods of routine use in Bayesian inference: Markov chain Monte Carlo via Gibbs sampling, the expectation-maximization algorithm, and a variational Bayes procedure for scalable inference. The algorithms associated with these methods are available online at https://github.com/tommasorigon/DLSBP. We additionally compare the three computational strategies in an application to the Old Faithful Geyser dataset.
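For intuition, the construction described in the abstract can be sketched as follows; this is a schematic rendering, and the symbols (mixture weights pi_h, sticks nu_h, coefficients alpha_h, kernel parameters mu_h and sigma_h^2) are illustrative notation rather than the paper's exact statement.

\[
f(y \mid x) = \sum_{h=1}^{\infty} \pi_h(x)\, \mathrm{N}\bigl(y;\, \mu_h, \sigma_h^2\bigr),
\qquad
\pi_h(x) = \nu_h(x) \prod_{l=1}^{h-1} \bigl[1 - \nu_l(x)\bigr],
\qquad
\nu_h(x) = \frac{\exp(x^{\top}\alpha_h)}{1 + \exp(x^{\top}\alpha_h)}.
\]

In words, each stick-breaking proportion nu_h(x) is modeled as a logistic regression on the predictors, so the sequence of sticks forms the continuation-ratio logistic representation mentioned in the abstract, with the mixture weights pi_h(x) varying smoothly across the predictor space.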
