A nested expectation-maximization algorithm for latent class regression models

Abstract

Latent class regression models characterize the joint distribution of a multivariate categorical random variable under an assumption of conditional independence given a predictor-dependent latent class variable. Although these models are popular choices in several fields, current computational procedures based on the expectation-maximization (EM) algorithm require gradient methods to facilitate the derivations for the maximization step. However, these procedures do not provide monotone loglikelihood sequences, so the resulting algorithms cannot guarantee reliable convergence. To address this issue, we propose a nested EM algorithm, which relies on a sequence of conditional expectation-maximizations for the regression coefficients associated with each predictor-dependent latent class. Leveraging the recent Pólya-gamma data augmentation for logistic regression, the conditional expectation-maximizations reduce to a set of generalized least squares minimization problems. This method is a direct consequence of an exact EM algorithm which we develop for the special case of two latent classes. We show that the proposed computational methods provide a monotone loglikelihood sequence, and discuss the improved performance in two real data applications.
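As an illustration of the building block the abstract describes, the sketch below shows how Pólya-gamma data augmentation turns the EM iteration for a single logistic regression into a weighted (generalized) least squares update with a monotone loglikelihood sequence. This is a minimal sketch of the well-known Pólya-gamma EM step for plain logistic regression, not the paper's full nested algorithm; the function name and interface are illustrative choices.

```python
import numpy as np

def polya_gamma_em_logistic(X, y, n_iter=200, tol=1e-8):
    """EM for logistic regression via Polya-gamma augmentation.

    E-step: E[omega_i | beta] = tanh(psi_i / 2) / (2 * psi_i),
            with linear predictor psi_i = x_i' beta.
    M-step: weighted least squares
            beta = (X' Omega X)^{-1} X' kappa,  kappa_i = y_i - 1/2.
    Each iteration does not decrease the observed-data loglikelihood.
    """
    n, p = X.shape
    beta = np.zeros(p)
    kappa = y - 0.5
    for _ in range(n_iter):
        psi = X @ beta
        # E[omega]; the limit as psi -> 0 is 1/4, handled explicitly
        omega = np.where(np.abs(psi) < 1e-10, 0.25,
                         np.tanh(psi / 2.0) / (2.0 * psi))
        beta_new = np.linalg.solve(X.T @ (omega[:, None] * X), X.T @ kappa)
        if np.max(np.abs(beta_new - beta)) < tol:
            beta = beta_new
            break
        beta = beta_new
    return beta
```

In the paper's setting, a conditional maximization of this form would be run for the coefficients of each predictor-dependent latent class inside the outer EM, which is what reduces the M-step to generalized least squares problems.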
