Tractable Bayesian Density Regression via Logit Stick-Breaking Priors

There is increasing interest in learning how the distribution of a response variable changes with a set of predictors. Bayesian nonparametric dependent mixture models provide a flexible approach to this goal. However, many formulations suffer from difficult interpretation and intractable computation. Motivated by these issues, we define a class of predictor-dependent infinite mixture models that relies on a formal representation of the stick-breaking construction via continuation-ratio logistic regression, within an exponential family representation. This formulation preserves the desirable properties of popular predictor-dependent stick-breaking priors, but leverages a recent Pólya-gamma data augmentation to facilitate tractable inference under a broader variety of routine-use computational methods. These methods include Markov chain Monte Carlo via Gibbs sampling, expectation-maximization algorithms, and a variational Bayes routine for scalable inference. The algorithms associated with these methods are tested in a toxicology study.
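To fix ideas, the predictor-dependent weights of a logit stick-breaking prior can be computed as follows. This is a minimal sketch under a truncated representation with H components and linear logistic sticks; the function name, dimensions, and coefficient parameterization are illustrative, not the paper's notation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logit_stick_breaking_weights(x, alpha):
    """Predictor-dependent mixture weights via logit stick-breaking.

    x     : (p,) predictor vector
    alpha : (H, p) logistic-regression coefficients, one row per stick;
            the last stick is forced to 1 so the truncated weights sum to 1.
    """
    nu = sigmoid(alpha @ x)          # stick proportions nu_h(x) = logistic(x' alpha_h)
    nu[-1] = 1.0                     # truncation: last stick absorbs remaining mass
    # pi_h(x) = nu_h(x) * prod_{l<h} (1 - nu_l(x))
    remaining = np.concatenate(([1.0], np.cumprod(1.0 - nu[:-1])))
    return nu * remaining

rng = np.random.default_rng(0)
alpha = rng.normal(size=(5, 3))      # H = 5 components, p = 3 predictors
x = rng.normal(size=3)
weights = logit_stick_breaking_weights(x, alpha)
```

Because each stick proportion is a logistic regression in the predictors, the Pólya-gamma augmentation applies to each conditional, which is what makes Gibbs sampling, EM, and variational Bayes tractable for this construction.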