Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
From MaRDI portal
Publication:3548002
Abstract: Suppose we are given a vector \(f\) in \(\mathbb{R}^N\). How many linear measurements do we need to make about \(f\) to be able to recover \(f\) to within precision \(\varepsilon\) in the Euclidean (\(\ell_2\)) metric? Or more exactly, suppose we are interested in a class \(\mathcal{F}\) of such objects -- discrete digital signals, images, etc.; how many linear measurements do we need to recover objects from this class to within accuracy \(\varepsilon\)? This paper shows that if the objects of interest are sparse or compressible in the sense that the reordered entries of a signal \(f\) decay like a power law (or if the coefficient sequence of \(f\) in a fixed basis decays like a power law), then it is possible to reconstruct \(f\) to within very high accuracy from a small number of random measurements.
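The recovery scheme the abstract describes can be illustrated with a minimal sketch: draw a random Gaussian measurement matrix, measure a sparse signal, and reconstruct it by \(\ell_1\)-minimization (basis pursuit) posed as a linear program. This is an illustration assuming NumPy/SciPy, not the paper's own implementation; the dimensions, sparsity level, and random seed are arbitrary choices.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 60, 30, 4  # ambient dimension, number of measurements, sparsity

# Ground-truth k-sparse signal in R^n
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)

# Random Gaussian measurement matrix and noiseless measurements y = A x
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

# Basis pursuit:  min ||x||_1  subject to  A x = y,
# linearized via the split x = u - v with u, v >= 0,
# so the objective sum(u) + sum(v) equals ||x||_1 at the optimum.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]

print(f"recovery error: {np.linalg.norm(x_hat - x_true):.2e}")
```

With \(m = 30\) random measurements of a 4-sparse signal in \(\mathbb{R}^{60}\), the linear program recovers the signal essentially exactly, matching the paper's message that far fewer than \(N\) measurements suffice for sparse objects.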
Cited in
(only the first 100 citing items are shown)
- Exact reconstruction using Beurling minimal extrapolation
- Sparse recovery under weak moment assumptions
- Global testing under sparse alternatives: ANOVA, multiple comparisons and the higher criticism
- A non-adapted sparse approximation of PDEs with stochastic inputs
- The weighted majority algorithm
- Random projections of smooth manifolds
- Stability and instance optimality for Gaussian measurements in compressed sensing
- A novel measurement matrix based on regression model for block compressed sensing
- Solution of the problem on image reconstruction in computed tomography
- Linearized Bregman iterations for compressed sensing
- Signal Reconstruction From Noisy Random Projections
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Discussion: ``A significance test for the lasso''
- The Gelfand widths of \(\ell_p\)-balls for \(0 < p \leq 1\)
- Compressive sensing-based topology identification of multilayer networks
- Testable uniqueness conditions for empirical assessment of undersampling levels in total variation-regularized X-ray CT
- A superlinearly convergent \(R\)-regularized Newton scheme for variational models with concave sparsity-promoting priors
- Recovering network topologies via Taylor expansion and compressive sensing
- R3P-Loc: a compact multi-label predictor using ridge regression and random projection for protein subcellular localization
- Discussion: ``A significance test for the lasso''
- Decoding by Linear Programming
- Accelerating gradient projection methods for \(\ell _1\)-constrained signal recovery by steplength selection rules
- Noncommutative Bennett and Rosenthal inequalities
- Exact recovery of non-uniform splines from the projection onto spaces of algebraic polynomials
- Sparsity and incoherence in compressive sampling
- scientific article; zbMATH DE number 7750674 (no title available)
- Gaussian approximations in high dimensional estimation
- One-bit compressed sensing by linear programming
- On verifiable sufficient conditions for sparse signal recovery via \(\ell_{1}\) minimization
- Suprema of chaos processes and the restricted isometry property
- Compressive sensing with local geometric features
- Total variation wavelet inpainting
- Explicit constructions of RIP matrices and related problems
- Robust sparse phase retrieval made easy
- Properties and iterative methods for the lasso and its variants
- Augmented Lagrangian alternating direction method for matrix separation based on low-rank factorization
- Best subset selection via a modern optimization lens
- Uniform uncertainty principle for Bernoulli and subgaussian ensembles
- A statistical mechanics approach to de-biasing and uncertainty estimation in Lasso for random measurements
- High-dimensional inference: confidence intervals, \(p\)-values and R-software \texttt{hdi}
- Counting faces of randomly projected polytopes when the projection radically lowers dimension
- How well can we estimate a sparse vector?
- A null space analysis of the \(\ell_1\)-synthesis method in dictionary-based compressed sensing
- Extensions of compressed sensing
- An alternating minimization method for matrix completion problems
- An efficient augmented Lagrangian method with applications to total variation minimization
- A significance test for the lasso
- Multigrid with Rough Coefficients and Multiresolution Operator Decomposition from Hierarchical Information Games
- Steiner equiangular tight frames
- Exact matrix completion via convex optimization
- Enhancing sparsity by reweighted \(\ell _{1}\) minimization
- A model of regularization parameter determination in low-dose X-ray CT reconstruction based on dictionary learning
- IMRO: A proximal quasi-Newton method for solving \(\ell_1\)-regularized least squares problems
- A note on the complexity of \(L _{p }\) minimization
- Compressed sensing and best \(k\)-term approximation
- Robustness properties of dimensionality reduction with Gaussian random matrices
- Anomaly detection in large-scale data stream networks
- On sparse reconstruction from Fourier and Gaussian measurements
- Asymptotic analysis of the role of spatial sampling for covariance parameter estimation of Gaussian processes
- Fast Phase Retrieval from Local Correlation Measurements
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Testing the nullspace property using semidefinite programming
- Sparse Legendre expansions via \(\ell_1\)-minimization
- Stable signal recovery from incomplete and inaccurate measurements
- Sparse recovery by non-convex optimization - instance optimality
- \(\ell_0\) minimization for wavelet frame based image restoration
- On error correction with errors in both the channel and syndrome
- Variations on a theorem of Candès, Romberg and Tao
- The Littlewood-Offord problem and invertibility of random matrices
- Strong convergence of a modified proximal algorithm for solving the lasso
- Guarantees of total variation minimization for signal recovery
- Super-resolution of point sources via convex programming
- Dimensionality reduction with subgaussian matrices: a unified theory
- Rejoinder: ``A significance test for the lasso''
- The residual method for regularizing ill-posed problems
- Derandomizing restricted isometries via the Legendre symbol
- An efficient algorithm for \(\ell_{0}\) minimization in wavelet frame based image restoration
- Proximity algorithms for the L1/TV image denoising model
- Primal and dual alternating direction algorithms for \(\ell _{1}\)-\(\ell _{1}\)-norm minimization problems in compressive sensing
- Uniform uncertainty principle and signal recovery via regularized orthogonal matching pursuit
- A unified approach to model selection and sparse recovery using regularized least squares
- Sparsest solutions of underdetermined linear systems via \( \ell _q\)-minimization for \(0<q\leqslant 1\)
- An implementable proximal point algorithmic framework for nuclear norm minimization
- Restricted normal cones and sparsity optimization with affine constraints
- Data-driven time-frequency analysis
- A survey of compressed sensing
- Compressed sensing
- Regularized vector field learning with sparse approximation for mismatch removal
- Compressed sensing with preconditioning for sparse recovery with subsampled matrices of Slepian prolate functions
- High-dimensional inference in misspecified linear models
- On the null space property of \(l_q\)-minimization for \(0 < q \leq 1\) in compressed sensing
- A Barzilai-Borwein type method for minimizing composite functions
- Generalized sampling and infinite-dimensional compressed sensing
- New analysis of manifold embeddings and signal recovery from compressive measurements
- Generalized Kalman smoothing: modeling and algorithms
- Fast and RIP-optimal transforms
- Compressed history matching: Exploiting transform-domain sparsity for regularization of nonlinear dynamic data integration problems
- Yang-Baxter equations in quantum information
- Discussion: ``A significance test for the lasso''
- Discussion: ``A significance test for the lasso''