Tractable Bayesian Density Regression via Logit Stick-Breaking Priors

There is increasing interest in learning how the distribution of a response variable changes with a set of predictors. Bayesian nonparametric dependent mixture models provide a flexible approach to this goal. However, many existing representations are difficult to interpret and rely on intractable computational methods. Motivated by these issues, we describe a class of predictor-dependent infinite mixtures of Gaussians that relies on a formal characterization of the stick-breaking construction via a continuation-ratio logistic regression, within an exponential family representation. The proposed formulation preserves the desirable properties of popular predictor-dependent stick-breaking priors, but leverages a recent Polya-Gamma data augmentation and a formal connection with hierarchical mixtures of experts to facilitate tractable inference under a broad variety of routine-use computational methods. These include Markov chain Monte Carlo via Gibbs sampling, Expectation-Maximization algorithms, and variational Bayes approximations for scalable inference. The resulting algorithms are evaluated in a toxicology study.
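To make the construction concrete, the sketch below shows how continuation-ratio logits map to stick-breaking mixture weights: each logit eta_h passes through a logistic (sigmoid) function to give a stick proportion nu_h, and the weight of component h is nu_h times the mass left over by earlier components. The function name and the linear form of the predictor-dependent logits are illustrative assumptions, not the paper's implementation.

```python
import math

def sigmoid(x):
    # logistic function: maps a continuation-ratio logit to (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def logit_stick_breaking_weights(etas):
    """Map logits eta_1, ..., eta_{H-1} to H mixture weights.

    nu_h = sigmoid(eta_h); w_h = nu_h * prod_{l < h} (1 - nu_l);
    the final component absorbs the remaining stick so the weights sum to 1.
    (Hypothetical helper, illustrating the construction only.)
    """
    weights = []
    remaining = 1.0  # unbroken portion of the stick
    for eta in etas:
        nu = sigmoid(eta)
        weights.append(nu * remaining)
        remaining *= (1.0 - nu)
    weights.append(remaining)  # last weight takes the leftover mass
    return weights

# Predictor dependence enters through the logits, e.g. eta_h(x) = a_h + b_h * x
# (an assumed linear form for illustration), so the weights vary with x.
x = 0.5
a, b = [0.2, -0.4], [1.0, 0.3]  # illustrative coefficients
etas_x = [a_h + b_h * x for a_h, b_h in zip(a, b)]
w = logit_stick_breaking_weights(etas_x)
print(w, sum(w))
```

Because each nu_h lies in (0, 1), the weights are nonnegative and sum exactly to one for any logit values, which is what allows each stick-breaking step to be treated as a separate logistic regression amenable to Polya-Gamma augmentation.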