
Online Belief Propagation for Topic Modeling

Abstract

Not only can online topic modeling algorithms extract topics from big data streams with constant memory requirements, but they can also detect topic shifts as the data stream flows. Fast convergence is a desirable property for batch learning topic models such as latent Dirichlet allocation (LDA), and it can further facilitate the development of fast online topic modeling algorithms for big data streams. In this paper, we present a novel and easy-to-implement fast belief propagation (FBP) algorithm that accelerates the convergence of batch learning LDA when the number of topics is large. FBP uses a dynamic scheduling scheme for asynchronous message passing, which passes only the most important subset of topic messages at each iteration for speed. From FBP, we derive an online belief propagation (OBP) algorithm that infers the topic distributions of previously unseen documents incrementally by online gradient descent. We show that OBP converges to a local optimum of the LDA objective function within the online stochastic optimization framework. Extensive empirical studies demonstrate that OBP significantly reduces learning time and achieves much lower predictive perplexity than several state-of-the-art online algorithms for LDA, including online variational Bayes (OVB) and online Gibbs sampling (OGS).
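The core idea behind FBP's dynamic scheduling, passing only the most important subset of topic messages per iteration, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, interface, and top-k truncation heuristic are assumptions chosen to convey the idea of keeping only the largest message entries.

```python
import numpy as np

def truncate_topic_message(mu, k):
    """Keep only the k largest topic-message entries and renormalize.

    Illustrates the dynamic-scheduling intuition of FBP: at each
    iteration, only the most important topic messages are passed.
    `mu` is a length-K vector of nonnegative topic messages; the
    name and signature here are illustrative, not the paper's API.
    """
    mu = np.asarray(mu, dtype=float)
    out = np.zeros_like(mu)
    top = np.argsort(mu)[-k:]      # indices of the k largest messages
    out[top] = mu[top]
    return out / out.sum()         # renormalize to a distribution

# Example: with K = 5 topics, pass only the top-2 messages
print(truncate_topic_message([0.1, 0.4, 0.05, 0.3, 0.15], 2))
```

Restricting each update to the dominant messages reduces per-iteration cost from O(K) toward O(k), which is where the speedup matters when the number of topics K is large.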
