Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
From MaRDI portal
Publication:3548002
DOI: 10.1109/TIT.2006.885507
zbMATH Open: 1309.94033
arXiv: math/0410542
OpenAlex: W2129638195
Wikidata: Q56813489 (Scholia: Q56813489)
MaRDI QID: Q3548002 (FDO: Q3548002)
Authors: Emmanuel J. Candès, Terence Tao
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Abstract: Suppose we are given a vector \(f \in \mathbb{R}^N\). How many linear measurements do we need to make about \(f\) to be able to recover \(f\) to within precision \(\varepsilon\) in the Euclidean (\(\ell_2\)) metric? Or more exactly, suppose we are interested in a class \(\mathcal{F}\) of such objects -- discrete digital signals, images, etc.; how many linear measurements do we need to recover objects from this class to within accuracy \(\varepsilon\)? This paper shows that if the objects of interest are sparse or compressible in the sense that the reordered entries of a signal \(f \in \mathcal{F}\) decay like a power-law (or if the coefficient sequence of \(f\) in a fixed basis decays like a power-law), then it is possible to reconstruct \(f\) to within very high accuracy from a small number of random measurements.
Full work available at URL: https://arxiv.org/abs/math/0410542
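The recovery procedure the abstract refers to is \(\ell_1\) minimization (basis pursuit): among all signals consistent with the random measurements, pick the one with smallest \(\ell_1\) norm, which is a linear program. Below is a minimal illustrative sketch, not the authors' code; the dimensions, sparsity level, random seed, and the use of SciPy's LP solver are all assumptions made for the example.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 40, 20, 3  # ambient dimension, number of measurements, sparsity (illustrative choices)

# A k-sparse ground-truth signal.
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)

# Random Gaussian projections and the observed measurements y = A x.
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

# Basis pursuit as an LP: minimize sum(t) subject to -t <= x <= t and A x = y,
# over the stacked variable [x; t].
c = np.concatenate([np.zeros(n), np.ones(n)])
I = np.eye(n)
A_ub = np.block([[I, -I], [-I, -I]])   # encodes x - t <= 0 and -x - t <= 0
b_ub = np.zeros(2 * n)
A_eq = np.hstack([A, np.zeros((m, n))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * n + [(0, None)] * n)
x_hat = res.x[:n]
```

Since the true signal is itself feasible for the LP, the minimizer's \(\ell_1\) norm never exceeds that of the truth; with Gaussian measurements and these proportions (\(m = 2k \log$-ish oversampling), the recovered `x_hat` typically matches `x_true` exactly, which is the phenomenon the paper quantifies.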
Cited In (only the first 100 items are shown)
- Compressed sensing of color images
- Discussion: ``A significance test for the lasso''
- Discussion: ``A significance test for the lasso''
- Discussion: ``A significance test for the lasso''
- Discussion: ``A significance test for the lasso''
- Additive combinatorics: with a view towards computer science and cryptography -- an exposition
- Signature codes for weighted noisy adder channel, multimedia fingerprinting and compressed sensing
- Learning semidefinite regularizers
- New analysis of manifold embeddings and signal recovery from compressive measurements
- Generalized sampling and infinite-dimensional compressed sensing
- Yang-Baxter equations in quantum information
- A least-squares method for sparse low rank approximation of multivariate functions
- Stochastic collocation algorithms using \(l_1\)-minimization for Bayesian solution of inverse problems
- On the generation of sampling schemes for magnetic resonance imaging
- Generalized Kalman smoothing: modeling and algorithms
- Sharp support recovery from noisy random measurements by \(\ell_1\)-minimization
- Sparsity enforcing edge detection method for blurred and noisy Fourier data
- Uniform recovery in infinite-dimensional compressed sensing and applications to structured binary sampling
- Sparse regression learning by aggregation and Langevin Monte-Carlo
- Fast and RIP-optimal transforms
- Near oracle performance and block analysis of signal space greedy methods
- Improved sparse Fourier approximation results: Faster implementations and stronger guarantees
- 2D sparse signal recovery via 2D orthogonal matching pursuit
- Greedy-like algorithms for the cosparse analysis model
- Instance-optimality in probability with an \(\ell _1\)-minimization decoder
- Sparse system identification for stochastic systems with general observation sequences
- Compressed sensing by inverse scale space and curvelet thresholding
- Compressed sensing with preconditioning for sparse recovery with subsampled matrices of Slepian prolate functions
- Nonlinear least squares in \(\mathbb R^{N}\)
- Improving the \( k\)-\textit{compressibility} of hyper reduced order models with moving sources: applications to welding and phase change problems
- Optimization methods for synthetic aperture radar imaging
- Fast \(\ell _{1}\) minimization by iterative thresholding for multidimensional NMR spectroscopy
- Majorizing measures and proportional subsets of bounded orthonormal systems
- Robust face recognition via block sparse Bayesian learning
- Nonmonotone Barzilai-Borwein gradient algorithm for \(\ell_1\)-regularized nonsmooth minimization in compressive sensing
- A modified Newton projection method for \(\ell _1\)-regularized least squares image deblurring
- Sparse time-frequency representation of nonlinear and nonstationary data
- A sharp nonasymptotic bound and phase diagram of \(L_{1/2}\) regularization
- Compressive optical deflectometric tomography: a constrained total-variation minimization approach
- Deterministic construction of sparse binary matrices via incremental integer optimization
- A convergent overlapping domain decomposition method for total variation minimization
- Log-concavity and strong log-concavity: a review
- Gaussian averages of interpolated bodies and applications to approximate reconstruction
- Compressed subspace matching on the continuum
- High-dimensional inference in misspecified linear models
- On the null space property of \(l_q\)-minimization for \(0 < q \leq 1\) in compressed sensing
- A Barzilai-Borwein type method for minimizing composite functions
- The null space property for sparse recovery from multiple measurement vectors
- Rank-based model selection for multiple ions quantum tomography
- Uniform recovery of fusion frame structured sparse signals
- Signature codes for noisy multiple access adder channel
- On the solution uniqueness characterization in the L1 norm and polyhedral gauge recovery
- On the Absence of Uniform Recovery in Many Real-World Applications of Compressed Sensing and the Restricted Isometry Property and Nullspace Property in Levels
- Sparse regularization for semi-supervised classification
- Dense fast random projections and Lean Walsh transforms
- Compressive imaging and characterization of sparse light deflection maps
- A modified greedy analysis pursuit algorithm for the cosparse analysis model
- Decomposable norm minimization with proximal-gradient homotopy algorithm
- Yang-Baxter equations and quantum entanglements
- Finding a low-rank basis in a matrix subspace
- Sparse decomposition by iterating Lipschitzian-type mappings
- Sparse signal recovery using a new class of random matrices
- Compressed history matching: Exploiting transform-domain sparsity for regularization of nonlinear dynamic data integration problems
- Recovering network topologies via Taylor expansion and compressive sensing
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Sparsity and incoherence in compressive sampling
- A statistical mechanics approach to de-biasing and uncertainty estimation in Lasso for random measurements
- Sparse Legendre expansions via \(\ell_1\)-minimization
- On error correction with errors in both the channel and syndrome
- Strong convergence of a modified proximal algorithm for solving the lasso
- An implementable proximal point algorithmic framework for nuclear norm minimization
- Restricted normal cones and sparsity optimization with affine constraints
- The Gelfand widths of \(\ell_p\)-balls for \(0 < p \leq 1\)
- On verifiable sufficient conditions for sparse signal recovery via \(\ell_{1}\) minimization
- Augmented Lagrangian alternating direction method for matrix separation based on low-rank factorization
- An efficient augmented Lagrangian method with applications to total variation minimization
- Dimensionality reduction with subgaussian matrices: a unified theory
- Solution of the problem on image reconstruction in computed tomography
- A note on the complexity of \(L _{p }\) minimization
- Compressive sensing-based topology identification of multilayer networks
- Properties and iterative methods for the lasso and its variants
- A significance test for the lasso
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Sparse recovery by non-convex optimization - instance optimality
- Guarantees of total variation minimization for signal recovery
- Super-resolution of point sources via convex programming
- Uniform uncertainty principle and signal recovery via regularized orthogonal matching pursuit
- Sparse recovery under weak moment assumptions
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Sparsest solutions of underdetermined linear systems via \( \ell _q\)-minimization for \(0<q\leqslant 1\)
- Random projections of smooth manifolds
- Compressive sensing with local geometric features
- A null space analysis of the \(\ell_1\)-synthesis method in dictionary-based compressed sensing
- Fast Phase Retrieval from Local Correlation Measurements
- Discussion: ``A significance test for the lasso''
- Accelerating gradient projection methods for \(\ell _1\)-constrained signal recovery by steplength selection rules
- Gaussian approximations in high dimensional estimation
- Total variation wavelet inpainting