Rectified Gaussian Scale Mixtures and the Sparse Non-Negative Least Squares Problem

Abstract

In this paper we introduce a hierarchical Bayesian framework to obtain sparse and non-negative solutions to the sparse non-negative least squares (S-NNLS) problem. We introduce a new family of scale mixtures, the Rectified Gaussian Scale Mixture (R-GSM), to model the sparsity-enforcing prior distribution for the signal of interest. One advantage of the R-GSM prior is that, through proper choice of the mixing density, it encompasses a wide variety of heavy-tailed distributions, such as the rectified Laplacian and rectified Student's t distributions. Similar to the Gaussian Scale Mixture (GSM) approach, a Type II Expectation-Maximization framework is developed to estimate the hyperparameters and obtain a point estimate of the parameter of interest. For the proposed method, called rectified Sparse Bayesian Learning (R-SBL), we provide two ways to perform the Expectation step: Markov chain Monte Carlo (MCMC) simulations and a simple yet effective diagonal approximation (DA). Through numerical experiments we show that R-SBL outperforms existing S-NNLS solvers in terms of both signal and support recovery, and that the proposed DA approach offers both computational efficiency and numerical accuracy.
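As a rough illustration of the generative model the abstract describes (all notation, parameter choices, and the exponential mixing density below are assumptions for this sketch, not taken from the paper), one can draw a sparse non-negative signal from an R-GSM-style prior and attempt recovery with a generic NNLS solver as a baseline; the paper's R-SBL solver is not reproduced here:

    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(0)
    n, m, k = 50, 100, 10  # measurements, signal length, nominal sparsity

    # Hypothetical R-GSM-style draw: scales gamma_i from an exponential
    # mixing density (by analogy with the GSM case, the choice associated
    # with a rectified Laplacian marginal), then x_i = max(0, N(0, gamma_i)).
    # The rectification itself sends roughly half of the drawn coefficients
    # exactly to zero, so the effective support can be smaller than k.
    x = np.zeros(m)
    idx = rng.choice(m, size=k, replace=False)
    gamma = rng.exponential(scale=1.0, size=k)
    x[idx] = np.maximum(0.0, rng.normal(0.0, np.sqrt(gamma)))

    Phi = rng.normal(size=(n, m)) / np.sqrt(n)  # random dictionary
    y = Phi @ x + 0.01 * rng.normal(size=n)     # noisy measurements

    # Stand-in S-NNLS solve via plain NNLS; R-SBL would replace this step.
    x_hat, _ = nnls(Phi, y)
    true_support = x > 0
    found = (x_hat > 1e-6) & true_support
    print(f"recovered {found.sum()} of {true_support.sum()} true nonzeros")

The NNLS step here is only a non-negativity-constrained least squares baseline; it carries no sparsity-promoting prior, which is precisely the gap the R-GSM prior and R-SBL inference are meant to fill.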
