Information-Theoretic Limits on Sparse Signal Recovery: Dense versus Sparse Measurement Matrices
From MaRDI portal
Publication:5281472
Abstract: We study the information-theoretic limits of exactly recovering the support of a sparse signal using noisy projections defined by various classes of measurement matrices. Our analysis is high-dimensional in nature, in which the number of observations \(n\), the ambient signal dimension \(p\), and the signal sparsity \(k\) are all allowed to tend to infinity in a general manner. This paper makes two novel contributions. First, we provide sharper necessary conditions for exact support recovery using general (non-Gaussian) dense measurement matrices. Combined with previously known sufficient conditions, this result yields sharp characterizations of when the optimal decoder can recover a signal for various scalings of the sparsity \(k\) and sample size \(n\), including the important special case of linear sparsity (\(k = \Theta(p)\)) using a linear scaling of observations (\(n = \Theta(p)\)). Our second contribution is to prove necessary conditions on the number of observations \(n\) required for asymptotically reliable recovery using a class of \(\gamma\)-sparsified measurement matrices, where the measurement sparsity \(\gamma\) corresponds to the fraction of non-zero entries per row. Our analysis allows general scaling of the quadruplet \((n, p, k, \gamma)\), and reveals three different regimes, corresponding to whether measurement sparsity has no effect, a minor effect, or a dramatic effect on the information-theoretic limits of the subset recovery problem.
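The measurement model underlying the abstract can be sketched in a few lines of NumPy: a \(k\)-sparse signal observed through noisy projections with a \(\gamma\)-sparsified measurement matrix. This is a minimal illustration only; the entry distribution and the \(1/\gamma\) variance rescaling used below are one common convention for sparsified ensembles, not necessarily the exact ensemble analyzed in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n, p, k = 128, 512, 8   # observations, ambient dimension, signal sparsity
gamma = 0.1             # measurement sparsity: fraction of non-zero entries per row

# gamma-sparsified measurement matrix: each entry is non-zero with
# probability gamma; non-zeros are drawn N(0, 1/gamma) so the average
# per-entry second moment matches the dense (gamma = 1) case
mask = rng.random((n, p)) < gamma
X = np.where(mask, rng.normal(0.0, 1.0 / np.sqrt(gamma), (n, p)), 0.0)

# k-sparse signal supported on a random subset S of {0, ..., p-1}
S = rng.choice(p, size=k, replace=False)
beta = np.zeros(p)
beta[S] = 1.0           # minimum signal magnitude on the support

# noisy projections y = X beta + w
y = X @ beta + rng.normal(0.0, 0.5, n)
```

Exact support recovery asks whether a decoder, given only `X` and `y`, can identify the set `S`; the paper's results bound how `n` must grow with `(p, k, gamma)` for this to be possible at all.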
Recommendations
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- An Information-Theoretic Study for Joint Sparsity Pattern Recovery With Different Sensing Matrices
- Approximate Sparsity Pattern Recovery: Information-Theoretic Lower Bounds
- Note on sparsity in signal recovery and in matrix identification
- Sparse signal recovery using a new class of random matrices
- An evaluation of the sparsity degree for sparse recovery with deterministic measurement matrices
- Information Theoretic Bounds for Compressed Sensing
- On the Recovery Limit of Sparse Signals Using Orthogonal Matching Pursuit
- Limits on Support Recovery of Sparse Signals via Multiple-Access Communication Techniques
- A simple Gaussian measurement bound for exact recovery of block-sparse signals
Cited in (18)
- Sparse regression: scalable algorithms and empirical performance
- Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization
- Sparse classification: a scalable discrete optimization perspective
- A note on the asymptotic distribution of lasso estimator for correlated data
- Compressive sampling and rapid reconstruction of broadband frequency hopping signals with interference
- Which bridge estimator is the best for variable selection?
- Sparse high-dimensional linear regression. Estimating squared error and a phase transition
- A novel probabilistic approach for vehicle position prediction in free, partial, and full GPS outages
- Sparse sensor placement optimization for classification
- Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
- On the Fundamental Limits of Recovering Tree Sparse Vectors From Noisy Linear Measurements
- Compressive classification: where wireless communications meets machine learning
- Measurement Bounds for Sparse Signal Ensembles via Graphical Models
- Note on sparsity in signal recovery and in matrix identification
- Feature selection for data integration with mixed multiview data
- The all-or-nothing phenomenon in sparse linear regression
- An Information-Theoretic Study for Joint Sparsity Pattern Recovery With Different Sensing Matrices
- Iterative algorithm for discrete structure recovery