Phase transitions for greedy sparse approximation algorithms
From MaRDI portal
Abstract: A major enterprise in compressed sensing and sparse approximation is the design and analysis of computationally tractable algorithms for recovering sparse solutions, exact or approximate, of underdetermined linear systems of equations. Many such algorithms have now been proven to have optimal-order uniform recovery guarantees using the ubiquitous Restricted Isometry Property (RIP). However, it is unclear when the RIP-based sufficient conditions on the algorithm are satisfied. We present a framework in which this task can be achieved, translating these conditions for Gaussian measurement matrices into requirements on the signal's sparsity level, length, and number of measurements. We illustrate this approach on three of the state-of-the-art greedy algorithms: CoSaMP, Subspace Pursuit (SP), and Iterative Hard Thresholding (IHT). Designed to allow a direct comparison of existing theory, our framework implies that, according to the best known bounds, IHT requires the fewest compressed sensing measurements and has the lowest per-iteration computational cost of the three algorithms compared here.
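The IHT iteration mentioned in the abstract is the simplest of the three greedy schemes: a gradient step followed by hard thresholding to the k largest entries. The sketch below is a minimal illustration of the standard IHT iteration of Blumensath and Davies, not the paper's exact analyzed variant; the unit step size assumes the measurement matrix has spectral norm at most 1 (here enforced by explicit normalization), and all names are illustrative.

```python
import numpy as np

def iht(A, y, k, iters=200):
    """Minimal Iterative Hard Thresholding sketch:
        x_{n+1} = H_k( x_n + A^T (y - A x_n) ),
    where H_k keeps the k largest-magnitude entries and zeros the rest.
    Assumes the spectral norm of A is at most 1 (step size 1)."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x + A.T @ (y - A @ x)          # gradient step on ||y - Ax||^2 / 2
        small = np.argsort(np.abs(x))[:-k] # indices of all but the k largest
        x[small] = 0.0                     # hard threshold: keep k entries
    return x

# Toy example with a normalized Gaussian measurement matrix (illustrative
# sizes only; recovery requires enough measurements for the sparsity level).
rng = np.random.default_rng(0)
m, n, k = 50, 100, 3
A = rng.standard_normal((m, n))
A /= np.linalg.norm(A, 2)                  # enforce spectral norm 1
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x0
xhat = iht(A, y, k)
```

The per-iteration cost is dominated by the two matrix-vector products with A and A^T plus a sort, which is why IHT is the cheapest per iteration of the three algorithms compared in the abstract.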
Recommendations
- Sparse approximation by greedy algorithms
- Critical behavior and universality classes for an algorithmic phase transition in sparse reconstruction
- Sparse Approximation and Recovery by Greedy Algorithms
- Greed is Good: Algorithmic Results for Sparse Approximation
- Approximation Bounds for Sparse Programs
- Algorithms for simultaneous sparse approximation. I: Greedy pursuit
- Phase transitions in semidefinite relaxations
- Optimization on sparse random hypergraphs and spin glasses
- Phase transitions in parameter rich optimization problems
- Large deviations of the greedy independent set algorithm on sparse random graphs
Cites work
- scientific article; zbMATH DE number 1818892 (no title available)
- scientific article; zbMATH DE number 5060482 (no title available)
- Analysis of Orthogonal Matching Pursuit Using the Restricted Isometry Property
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Compressed sensing
- Compressed sensing: how sharp is the restricted isometry property?
- Compressive sampling
- Counting faces of randomly projected polytopes when the projection radically lowers dimension
- Decoding by Linear Programming
- Fast Solution of $\ell _{1}$-Norm Minimization Problems When the Solution May Be Sparse
- Finding the stationary states of Markov chains by iterative methods
- From Sparse Solutions of Systems of Equations to Sparse Modeling of Signals and Images
- High-dimensional centrally symmetric polytopes with neighborliness proportional to dimension
- Improved bounds on restricted isometry constants for Gaussian matrices
- Instance optimal decoding by thresholding in compressed sensing
- Introductory lectures on convex optimization. A basic course.
- Iterative hard thresholding for compressed sensing
- On sparse reconstruction from Fourier and Gaussian measurements
- On support sizes of restricted isometry constants
- Phase transitions for greedy sparse approximation algorithms
- Probing the Pareto frontier for basis pursuit solutions
- Sparse Approximate Solutions to Linear Systems
- Sparse Solution of Underdetermined Systems of Linear Equations by Stagewise Orthogonal Matching Pursuit
- Sparse nonnegative solution of underdetermined linear equations by linear programming
- Sparsest solutions of underdetermined linear systems via \( \ell _q\)-minimization for \(0<q\leqslant 1\)
- Subspace Pursuit for Compressive Sensing Signal Reconstruction
- Uniform uncertainty principle and signal recovery via regularized orthogonal matching pursuit
Cited in (19)
- Compressive Sensing
- The restricted isometry property for random block diagonal matrices
- The gap between the null space property and the restricted isometry property
- On the number of iterations for convergence of CoSaMP and subspace pursuit algorithms
- Bounds of restricted isometry constants in extreme asymptotics: formulae for Gaussian matrices
- Performance comparisons of greedy algorithms in compressed sensing.
- Relationship between the optimal solutions of least squares regularized with \(\ell_{0}\)-norm and constrained by \(k\)-sparsity
- Greedy-like algorithms for the cosparse analysis model
- CGIHT: conjugate gradient iterative hard thresholding for compressed sensing and matrix completion
- Phase transitions in semidefinite relaxations
- Hard thresholding pursuit algorithms: number of iterations
- GPU accelerated greedy algorithms for compressed sensing
- Sparse recovery algorithms: sufficient conditions in terms of restricted isometry constants
- From theoretical guarantee to practical performance: selectable and optimal step-lengths for IHT and HTP algorithms in compressed sensing
- Phase transitions for greedy sparse approximation algorithms
- On support sizes of restricted isometry constants
- Iterative hard thresholding for compressed sensing
- Quasi-linear compressed sensing
- Critical behavior and universality classes for an algorithmic phase transition in sparse reconstruction
MaRDI item Q629259