Conditional Matrix Flows for Gaussian Graphical Models
Studying conditional independence structure among many variables with few observations is a challenging task. Gaussian Graphical Models (GGMs) tackle this problem by encouraging sparsity in the precision matrix through an $l_q$ regularization with $q \le 1$. However, since the objective is highly non-convex for sub-$l_1$ pseudo-norms, most approaches rely on the $l_1$ norm. In this case, frequentist approaches allow one to elegantly compute the solution path as a function of the shrinkage parameter $\lambda$. Instead of optimizing the penalized likelihood, the Bayesian formulation introduces a Laplace prior on the precision matrix. However, posterior inference for different $\lambda$ values requires repeated runs of expensive Gibbs samplers. We propose a general framework for variational inference in GGMs that unifies the benefits of the frequentist and Bayesian frameworks. Specifically, we propose to approximate the posterior with a matrix-variate Normalizing Flow defined on the space of symmetric positive definite matrices. As a key improvement on previous work, we train a continuum of sparse regression models jointly for all regularization parameters and all norms, including non-convex sub-$l_1$ pseudo-norms. This is achieved by conditioning the flow on $q$ and on the shrinkage parameter $\lambda$. With one model we then have access to (i) the evolution of the posterior for any $\lambda$ and any $l_q$ (pseudo-) norm, (ii) the marginal log-likelihood for model selection, and (iii) the frequentist solution paths, recovered as the MAP through simulated annealing.
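To make the objective concrete, the sketch below writes out the standard $l_q$-penalized Gaussian log-likelihood, which doubles as the un-normalized log-posterior (the penalty corresponds to a generalized Laplace prior on the precision matrix entries). This is a minimal PyTorch illustration of that textbook objective, not code from the paper; the function name and signature are ours.

```python
import torch

def penalized_log_likelihood(theta, S, n, lam, q):
    """Un-normalized log-posterior of a GGM precision matrix `theta`:

        (n/2) * (log det theta - tr(S @ theta)) - lam * ||theta||_q^q

    where S is the empirical covariance of n observations and the l_q
    penalty (q <= 1 allowed) acts as a generalized Laplace prior.
    """
    log_lik = 0.5 * n * (torch.logdet(theta) - torch.trace(S @ theta))
    penalty = lam * theta.abs().pow(q).sum()
    return log_lik - penalty
```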
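The central object is a flow that maps a simple base distribution onto symmetric positive definite matrices while taking $(\lambda, q)$ as conditioning inputs. The sketch below is a deliberately small stand-in for the paper's matrix-variate flow, assuming a single affine layer and a Cholesky parameterization to guarantee positive definiteness; the class name and architecture are illustrative, not the authors' design.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConditionalSPDFlow(nn.Module):
    """Toy conditional flow over SPD matrices (an illustrative sketch).

    A standard-normal base vector is pushed through one affine layer whose
    shift and scale are predicted from the conditioning pair (lam, q). The
    output is reshaped into a Cholesky factor L (softplus keeps the diagonal
    positive), so theta = L @ L.T is SPD by construction.
    """

    def __init__(self, d, hidden=64):
        super().__init__()
        self.d = d
        self.n_tril = d * (d + 1) // 2
        self.net = nn.Sequential(
            nn.Linear(2, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * self.n_tril),  # -> shift and log-scale
        )

    def sample_with_log_prob(self, lam, q, n_samples=1):
        cond = torch.tensor([lam, q]).expand(n_samples, 2)
        shift, log_scale = self.net(cond).chunk(2, dim=-1)
        z = torch.randn(n_samples, self.n_tril)
        x = shift + log_scale.exp() * z                  # affine flow step
        # Base log-density minus the affine log-det Jacobian.
        log_prob = (-0.5 * (z ** 2 + math.log(2 * math.pi))).sum(-1) \
                   - log_scale.sum(-1)

        rows, cols = torch.tril_indices(self.d, self.d)
        diag = rows == cols
        L = x.new_zeros(n_samples, self.d, self.d)
        L[:, rows[~diag], cols[~diag]] = x[:, ~diag]
        L[:, rows[diag], cols[diag]] = F.softplus(x[:, diag])
        # Jacobian of the softplus applied to the diagonal entries.
        log_prob = log_prob - F.logsigmoid(x[:, diag]).sum(-1)

        theta = L @ L.transpose(-1, -2)                  # SPD sample
        # Jacobian of L -> L L^T: det = 2^d * prod_i L_ii^(d - i + 1).
        exponents = torch.arange(self.d, 0, -1, dtype=x.dtype)
        log_L_diag = torch.log(L[:, rows[diag], cols[diag]])
        log_prob = log_prob - (self.d * math.log(2.0)
                               + (exponents * log_L_diag).sum(-1))
        return theta, log_prob
```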
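Training then amounts to maximizing the ELBO while resampling $(\lambda, q)$ at every step, so a single model covers the whole continuum of regularization settings; the sampling ranges below are hypothetical. Annealing a temperature on the log-posterior term (not shown) would concentrate the variational distribution on the MAP and recover the frequentist solution path.

```python
# Hypothetical setup: synthetic data with d variables and n observations.
torch.manual_seed(0)
d, n = 10, 50
X = torch.randn(n, d)
S = (X.T @ X) / n                                      # empirical covariance

flow = ConditionalSPDFlow(d)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
for step in range(5000):
    lam = float(10 ** torch.empty(1).uniform_(-2, 2))  # random shrinkage level
    q = float(torch.empty(1).uniform_(0.1, 1.0))       # random l_q pseudo-norm
    theta, log_q = flow.sample_with_log_prob(lam, q, n_samples=8)
    log_p = torch.stack([penalized_log_likelihood(t, S, n, lam, q)
                         for t in theta])
    loss = (log_q - log_p).mean()                      # negative ELBO
    opt.zero_grad()
    loss.backward()
    opt.step()
```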