Sparse Signal Processing with Linear and Non-Linear Observations: A Unified Shannon Theoretic Approach

Information Theory Workshop (ITW), 2013
Abstract

In this work we derive fundamental limits for many linear and non-linear sparse signal processing models, including group testing, quantized compressive sensing, multivariate regression, and observations with missing features. In general, sparse signal processing problems can be characterized in terms of the following Markovian property. We are given a set of $N$ variables $X_1, X_2, \ldots, X_N$, and there is an unknown subset $S \subset \{1, 2, \ldots, N\}$ of variables that are \emph{relevant} for predicting outcomes/outputs $Y$. In other words, when $Y$ is conditioned on $\{X_k\}_{k \in S}$, it is conditionally independent of the remaining variables $\{X_k\}_{k \notin S}$. Our goal is to identify the set $S$ from samples of the variables $X$ and the associated outcomes $Y$. We characterize this problem as a version of the noisy channel coding theorem. Using asymptotic information-theoretic analyses, we derive mutual information formulas that provide sufficient and necessary conditions on the number of samples required to successfully recover the salient variables. This mutual information expression unifies conditions for both linear and non-linear observations. We then compute sample complexity bounds based on these mutual information expressions for different settings, including group testing, quantized compressive sensing, multivariate regression, and observations with missing features.
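As a concrete illustration of the Markovian property above, the following sketch simulates a noisy group-testing observation model, one of the settings the abstract mentions. The specific parameters ($N$, the subset $S$, the noise level) and the noisy-OR observation rule are assumptions chosen for illustration, not taken from the paper:

```python
import random

random.seed(0)

N = 20           # total number of variables X_1, ..., X_N
S = {3, 7, 12}   # unknown salient subset (ground truth, hypothetical)
flip_prob = 0.05 # observation noise: probability the outcome is flipped

def sample_observation():
    """Draw X uniformly at random and compute Y from X_S only.

    Y is a noisy OR of the salient variables, so conditioned on
    {X_k : k in S}, Y is independent of the remaining variables.
    """
    x = [random.randint(0, 1) for _ in range(N)]
    y_clean = int(any(x[k] for k in S))             # depends only on X_S
    y = y_clean ^ int(random.random() < flip_prob)  # flip with small prob.
    return x, y

# The support-recovery problem: infer S from many (x, y) samples.
samples = [sample_observation() for _ in range(1000)]
```

A recovery algorithm would then estimate $S$ from `samples`; the paper's mutual information bounds characterize how many such samples are needed.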
