Sparse Signal Processing with Linear and Non-Linear Observations: A
Unified Shannon Theoretic Approach
In this work we derive fundamental limits for many linear and non-linear sparse signal processing models, including group testing, quantized compressive sensing, multivariate regression, and observations with missing features. In general, sparse signal processing problems can be characterized in terms of the following Markovian property. We are given a set of variables $X$, and there is an unknown subset of variables $S$ that are \emph{relevant} for predicting outcomes/outputs $Y$. In other words, when $Y$ is conditioned on $X_S$ it is conditionally independent of the other variables, $X_{S^c}$. Our goal is to identify the set $S$ from samples of the variables $X$ and the associated outcomes $Y$. We characterize this problem as a version of the noisy channel coding problem. Using asymptotic information-theoretic analyses, we derive mutual information formulas that provide sufficient and necessary conditions on the number of samples required to successfully recover the salient variables. These mutual information expressions unify the conditions for both linear and non-linear observations. We then compute sample complexity bounds based on the mutual information expressions for the different settings: group testing, quantized compressive sensing, multivariate regression, and observations with missing features.
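The Markovian property described above can be illustrated with a minimal group-testing sketch (illustrative code, not from the paper; the sizes `N`, `k`, `n` and the COMP-style decoder are assumptions chosen for the example): the outcome $Y$ is the Boolean OR of the relevant columns $X_S$, so it is conditionally independent of every other column, and the relevant set can be recovered from samples.

```python
import numpy as np

rng = np.random.default_rng(0)
N, k, n = 20, 3, 200                 # total variables, relevant ones, samples

S = set(rng.choice(N, size=k, replace=False).tolist())  # unknown relevant set
X = rng.integers(0, 2, size=(n, N))                     # Bernoulli(1/2) design
Y = X[:, sorted(S)].max(axis=1)                         # Y = OR of X_S (noiseless)

# Markov property: Y depends on X only through X_S, so flipping a
# non-relevant column leaves Y unchanged.
j = next(i for i in range(N) if i not in S)
X_flipped = X.copy()
X_flipped[:, j] ^= 1
assert np.array_equal(Y, X_flipped[:, sorted(S)].max(axis=1))

# COMP-style decoding: keep a variable only if it never appears in a
# sample whose outcome is zero. Relevant variables always survive;
# with enough samples the irrelevant ones are eliminated.
recovered = {i for i in range(N) if not np.any((X[:, i] == 1) & (Y == 0))}
print(recovered == S)
```

With this many samples the decoder recovers $S$ exactly with high probability, matching the abstract's theme that the number of samples governs recoverability.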