Information Theoretic Bounds for Compressed Sensing
From MaRDI portal
Publication:5281264
DOI: 10.1109/TIT.2010.2059891 · zbMATH Open: 1366.94179 · arXiv: 0804.3439 · MaRDI QID: Q5281264
Authors: Shuchin Aeron, Venkatesh Saligrama, Manqi Zhao
Publication date: 27 July 2017
Published in: IEEE Transactions on Information Theory
Abstract: In this paper we derive information theoretic performance bounds for sensing and reconstruction of sparse phenomena from noisy projections. We consider two settings: output noise models, where the noise enters after the projection, and input noise models, where the noise enters before the projection. We consider two types of distortion for reconstruction: support errors and mean-squared errors. Our goal is to relate the number of measurements, \(m\), and the SNR to the signal sparsity \(k\), distortion level \(d\), and signal dimension \(n\). We consider support errors in a worst-case setting. We employ different variations of Fano's inequality to derive necessary conditions on the number of measurements and the SNR required for exact reconstruction. To derive sufficient conditions we develop new insights on max-likelihood analysis based on a novel superposition property. In particular, this property implies that small support errors are the dominant error events. Consequently, our ML analysis does not suffer the conservatism of the union bound and leads to a tighter analysis of max-likelihood. These results provide order-wise tight bounds. For output noise models we show that asymptotically an SNR of \(O(\log n)\) together with \(O(k \log(n/k))\) measurements is necessary and sufficient for exact support recovery. Furthermore, if a small fraction of support errors can be tolerated, a constant SNR turns out to be sufficient in the linear sparsity regime. In contrast, for input noise models we show that support recovery fails if the number of measurements scales as \(o(n \log n / \mathrm{SNR})\), implying poor compression performance for such cases. We also consider a Bayesian set-up and characterize tradeoffs between mean-squared distortion and the number of measurements using rate-distortion theory.
Full work available at URL: https://arxiv.org/abs/0804.3439
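The sufficiency side of the abstract's output-noise result can be illustrated numerically: with \(m = O(k \log(n/k))\) noisy Gaussian measurements and moderate SNR, exhaustive maximum-likelihood search over size-\(k\) supports recovers the true support. The sketch below is not the authors' code; the problem sizes, signal amplitudes, noise level, and the constant in front of \(k \log(n/k)\) are illustrative assumptions.

```python
# Minimal sketch (assumed parameters, not from the paper): exact support
# recovery by exhaustive maximum-likelihood search under the output-noise
# model y = A x + w, with m ~ k*log(n/k) measurements.
import itertools
import math
import numpy as np

rng = np.random.default_rng(0)

n, k = 12, 2                            # signal dimension and sparsity
m = math.ceil(4 * k * math.log(n / k))  # ~ O(k log(n/k)) measurements (constant 4 is an assumption)

A = rng.standard_normal((m, n))         # random Gaussian projection matrix
true_support = (3, 7)                   # arbitrary k-sparse support (assumption)
x = np.zeros(n)
x[list(true_support)] = 1.0             # unit amplitudes -> moderate SNR
y = A @ x + 0.05 * rng.standard_normal(m)  # output noise: enters after the projection

def residual(S):
    """Norm of the residual after projecting y onto the span of columns in S."""
    coef, *_ = np.linalg.lstsq(A[:, list(S)], y, rcond=None)
    return float(np.linalg.norm(y - A[:, list(S)] @ coef))

# ML decoder: the size-k support whose column span best explains y.
est_support = min(itertools.combinations(range(n), k), key=residual)
print(est_support)
```

At this noise level the ML decoder recovers the planted support; shrinking the SNR toward a constant while demanding exact (rather than partial) recovery is where, per the abstract, the \(O(\log n)\) SNR requirement bites.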
Mathematics Subject Classification:
- Information theory (general) (94A15)
- Measures of information, entropy (94A17)
- Rate-distortion theory in information and communication theory (94A34)
Cited In (18)
- Numerical characterization of support recovery in sparse regression with correlated design
- Fano's inequality for random variables
- Computational approaches to non-convex, sparsity-inducing multi-penalty regularization
- Compressed Sensing Performance Bounds Under Poisson Noise
- Non-uniform recovery guarantees for binary measurements and infinite-dimensional compressed sensing
- Information-Theoretic Limits on Sparse Signal Recovery: Dense versus Sparse Measurement Matrices
- A strong converse bound for multiple hypothesis testing, with applications to high-dimensional estimation
- Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization
- Rényi Information Dimension: Fundamental Limits of Almost Lossless Analog Compression
- Which bridge estimator is the best for variable selection?
- Rigorous restricted isometry property of low-dimensional subspaces
- Adaptive multi-penalty regularization based on a generalized Lasso path
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- Compressed Sensing with Side Information on the Feasible Region
- Sparse microwave imaging: principles and applications
- The all-or-nothing phenomenon in sparse linear regression
- Iterative algorithm for discrete structure recovery
- Asymptotic Achievability of the Cramér–Rao Bound for Noisy Compressive Sampling