Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
From MaRDI portal
Publication:3548002
DOI: 10.1109/TIT.2006.885507
zbMATH Open: 1309.94033
arXiv: math/0410542
OpenAlex: W2129638195
Wikidata: Q56813489 (Scholia: Q56813489)
MaRDI QID: Q3548002 (FDO: Q3548002)
Authors: Emmanuel J. Candès, Terence Tao
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Abstract: Suppose we are given a vector \(f \in \mathbb{R}^N\). How many linear measurements do we need to make about \(f\) to be able to recover \(f\) to within precision \(\varepsilon\) in the Euclidean (\(\ell_2\)) metric? Or more exactly, suppose we are interested in a class of such objects -- discrete digital signals, images, etc.; how many linear measurements do we need to recover objects from this class to within accuracy \(\varepsilon\)? This paper shows that if the objects of interest are sparse or compressible in the sense that the reordered entries of a signal \(f\) decay like a power-law (or if the coefficient sequence of \(f\) in a fixed basis decays like a power-law), then it is possible to reconstruct \(f\) to within very high accuracy from a small number of random measurements.
Full work available at URL: https://arxiv.org/abs/math/0410542
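The recovery result summarized in the abstract can be sketched numerically: take \(K\) Gaussian random projections of an \(s\)-sparse signal \(f \in \mathbb{R}^N\) and recover it by \(\ell_1\)-minimization (basis pursuit), here cast as a linear program via SciPy. This is a minimal illustration, not the authors' algorithm; the dimensions `N=128`, `K=50`, `s=5` are illustrative choices rather than values from the paper.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, K, s = 128, 50, 5  # ambient dimension, measurement count, sparsity (illustrative)

# Build a random s-sparse signal f in R^N.
f = np.zeros(N)
support = rng.choice(N, size=s, replace=False)
f[support] = rng.standard_normal(s)

# Gaussian random projections: K linear measurements y = A f.
A = rng.standard_normal((K, N)) / np.sqrt(K)
y = A @ f

# Basis pursuit: minimize ||x||_1 subject to Ax = y,
# rewritten as an LP via the split x = u - v with u, v >= 0.
c = np.ones(2 * N)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None), method="highs")
f_hat = res.x[:N] - res.x[N:]

print(np.max(np.abs(f_hat - f)))  # recovery error; small when K is large enough
```

With these parameters the number of measurements comfortably exceeds the \(s \log(N/s)\) scale at which \(\ell_1\) recovery of exactly sparse signals succeeds with high probability, so the reconstruction is essentially exact.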
Cited In (only showing first 100 items)
- Coorbit theory, multi-\(\alpha \)-modulation frames, and the concept of joint sparsity for medical multichannel data analysis
- Sparse approximate solution of partial differential equations
- Wavelet denoising via sparse representation
- On the volume of unit balls of finite-dimensional Lorentz spaces
- The stochastic geometry of unconstrained one-bit data compression
- Signal recovery under cumulative coherence
- Sparse signal representation by adaptive non-uniform B-spline dictionaries on a compact interval
- Sparse recovery from inaccurate saturated measurements
- On the uniqueness of sparse time-frequency representation of multiscale data
- Sparse time-frequency decomposition for multiple signals with same frequencies
- Sparse recovery with coherent tight frames via analysis Dantzig selector and analysis LASSO
- Extraction of intrawave signals using the sparse time-frequency representation method
- A Gradient-Enhanced L1 Approach for the Recovery of Sparse Trigonometric Polynomials
- On generating optimal sparse probabilistic Boolean networks with maximum entropy from a positive stationary distribution
- A theoretical result of sparse signal recovery via alternating projection method
- Robust multi-image processing with optimal sparse regularization
- Compressive sensing for multi-static scattering analysis
- Sparse approximate solution of fitting surface to scattered points by MLASSO model
- Low-rank matrix completion in a general non-orthogonal basis
- Convergence rates of learning algorithms by random projection
- Comparison of parametric sparse recovery methods for ISAR image formation
- Stability of the elastic net estimator
- Prediction of protein-protein interaction by metasample-based sparse representation
- Expander \(\ell_0\)-decoding
- TV+TV regularization with nonconvex sparseness-inducing penalty for image restoration
- Linearized alternating directions method for \(\ell_1\)-norm inequality constrained \(\ell_1\)-norm minimization
- Noisy 1-bit compressive sensing: models and algorithms
- Sampling in the analysis transform domain
- Introducing the counter mode of operation to compressed sensing based encryption
- A quasi-Newton algorithm for nonconvex, nonsmooth optimization with global convergence guarantees
- From compression to compressed sensing
- Sparse sensor placement optimization for classification
- Primal-dual first-order methods for a class of cone programming
- Generalizing CoSaMP to signals from a union of low dimensional linear subspaces
- Representation and coding of signal geometry
- A two-step iterative algorithm for sparse hyperspectral unmixing via total variation
- Enhanced total variation minimization for stable image reconstruction
- The matrix splitting based proximal fixed-point algorithms for quadratically constrained \(\ell_{1}\) minimization and Dantzig selector
- Analysis of the equivalence relationship between \(l_{0}\)-minimization and \(l_{p}\)-minimization
- The recovery guarantee for orthogonal matching pursuit method to reconstruct sparse polynomials
- Compressed sensing with structured sparsity and structured acquisition
- Smoothing projected Barzilai-Borwein method for constrained non-Lipschitz optimization
- From low- to high-dimensional moments without magic
- A hierarchical framework for recovery in compressive sensing
- Local variable selection of nonlinear nonparametric systems by first order expansion
- Deterministic construction of compressed sensing matrices based on semilattices
- Sparse probit linear mixed model
- Two new lower bounds for the spark of a matrix
- The local convexity of solving systems of quadratic equations
- Sparsity and incoherence in orthogonal matching pursuit
- Compressive sampling of ensembles of correlated signals
- Strengthening hash families and compressive sensing
- Recovering an unknown signal completely submerged in strong noise by a new stochastic resonance method
- Deterministic sampling of sparse trigonometric polynomials
- Inverse scale space decomposition
- Phase retrieval from incomplete magnitude information via total variation regularization
- Consistency of \(\ell_1\) recovery from noisy deterministic measurements
- A new bound on the block restricted isometry constant in compressed sensing
- Stable image reconstruction using transformed total variation minimization
- Greedy signal space methods for incoherence and beyond
- Sparse broadband beamformer design via proximal optimization techniques
- Compressed sensing based on trust region method
- Recovering network topologies via Taylor expansion and compressive sensing
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Sparsity and incoherence in compressive sampling
- A statistical mechanics approach to de-biasing and uncertainty estimation in Lasso for random measurements
- Sparse Legendre expansions via \(\ell_1\)-minimization
- On error correction with errors in both the channel and syndrome
- Strong convergence of a modified proximal algorithm for solving the lasso
- An implementable proximal point algorithmic framework for nuclear norm minimization
- Restricted normal cones and sparsity optimization with affine constraints
- The Gelfand widths of \(\ell_p\)-balls for \(0 < p \leq 1\)
- On verifiable sufficient conditions for sparse signal recovery via \(\ell_{1}\) minimization
- Augmented Lagrangian alternating direction method for matrix separation based on low-rank factorization
- An efficient augmented Lagrangian method with applications to total variation minimization
- Dimensionality reduction with subgaussian matrices: a unified theory
- Solution of the problem on image reconstruction in computed tomography
- A note on the complexity of \(L_p\) minimization
- Compressive sensing-based topology identification of multilayer networks
- Properties and iterative methods for the lasso and its variants
- A significance test for the lasso
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Sparse recovery by non-convex optimization - instance optimality
- Guarantees of total variation minimization for signal recovery
- Super-resolution of point sources via convex programming
- Uniform uncertainty principle and signal recovery via regularized orthogonal matching pursuit
- Sparse recovery under weak moment assumptions
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Sparsest solutions of underdetermined linear systems via \(\ell_q\)-minimization for \(0<q\leqslant 1\)
- Random projections of smooth manifolds
- Compressive sensing with local geometric features
- A null space analysis of the \(\ell_1\)-synthesis method in dictionary-based compressed sensing
- Fast Phase Retrieval from Local Correlation Measurements
- Discussion: ``A significance test for the lasso''
- Accelerating gradient projection methods for \(\ell _1\)-constrained signal recovery by steplength selection rules
- Gaussian approximations in high dimensional estimation
- Total variation wavelet inpainting
- Testing the nullspace property using semidefinite programming
- The weighted majority algorithm