Greed is Good: Algorithmic Results for Sparse Approximation
Publication: 3547716
DOI: 10.1109/TIT.2004.834793
zbMATH Open: 1288.94019
Wikidata: Q59750826
Scholia: Q59750826
MaRDI QID: Q3547716
FDO: Q3547716
Authors: Joel A. Tropp
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Recommendations
- Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
- Algorithms for simultaneous sparse approximation. I: Greedy pursuit
- Greedy approximation
- The exact recovery of sparse signals via orthogonal matching pursuit
- Optimally sparse representation in general (nonorthogonal) dictionaries via \(\ell^1\) minimization
MSC classification:
- Applications of mathematical programming (90C90)
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
- Approximation by other special function classes (41A30)
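The publication recorded here analyzes greedy pursuit methods, most prominently Orthogonal Matching Pursuit (OMP), for sparse approximation over a dictionary. As a rough illustration only (this sketch is not taken from the paper; the function and variable names are my own), a minimal NumPy implementation of OMP greedily selects the dictionary column most correlated with the current residual, then re-fits by least squares on the enlarged support:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit sketch: pick k columns of the
    dictionary A greedily and least-squares fit y on that support."""
    residual = y.astype(float).copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        # greedy step: column most correlated with the residual
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        # orthogonal projection step: refit on the current support
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x
```

For an orthonormal dictionary (e.g. the identity), k steps of this procedure recover a k-sparse signal exactly; the paper's results give coherence-based conditions under which the same holds for general dictionaries.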
Cited in (showing the first 100 items):
- Compressive Sensing
- Orthogonal one step greedy procedure for heteroscedastic linear models
- Nonmonotone spectral gradient method for sparse recovery
- Locally sparse reconstruction using the \(\ell^{1,\infty}\)-norm
- A tensor decomposition based multiway structured sparse SAR imaging algorithm with Kronecker constraint
- Variational texture synthesis with sparsity and spectrum constraints
- \(\mathrm{L_1RIP}\)-based robust compressed sensing
- On the uniqueness of overcomplete dictionaries, and a practical way to retrieve them
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Sparse representations and approximation theory
- Some greedy algorithms for sparse polynomial chaos expansions
- Iterative hard thresholding methods for \(l_0\) regularized convex cone programming
- Simultaneous image fusion and demosaicing via compressive sensing
- Nonnegative adaptive Lasso for ultra-high dimensional regression models and a two-stage method applied in financial modeling
- A note on the complexity of \(L_p\) minimization
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- On Lebesgue-type inequalities for greedy approximation
- Iterative thresholding for sparse approximations
- Ways to sparse representation: An overview
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Algorithms for simultaneous sparse approximation. II: Convex relaxation
- Convolutional neural networks analyzed via convolutional sparse coding
- Phase transitions for greedy sparse approximation algorithms
- Sparsest solutions of underdetermined linear systems via \(\ell_q\)-minimization for \(0<q\leqslant 1\)
- On the existence of equiangular tight frames
- Spectral compressive sensing
- Theory and applications of compressed sensing
- Analysis of orthogonal multi-matching pursuit under restricted isometry property
- Robustness of orthogonal matching pursuit under restricted isometry property
- Algorithms for simultaneous sparse approximation. I: Greedy pursuit
- Smooth sparse coding via marginal regression for learning sparse representations
- Regularization techniques and suboptimal solutions to optimization problems in learning from data
- Proximity algorithms for the L1/TV image denoising model
- Two are better than one: fundamental parameters of frame coherence
- Compressed sensing and dynamic mode decomposition
- A class of deterministic sensing matrices and their application in harmonic detection
- Compressed sensing with coherent and redundant dictionaries
- Deterministic convolutional compressed sensing matrices
- Restricted isometries for partial random circulant matrices
- Microlocal analysis of the geometric separation problem
- A continuous exact \(\ell_0\) penalty (CEL0) for least squares regularized problem
- Optimal non-linear models for sparsity and sampling
- Learning circulant sensing kernels
- Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee
- Point source super-resolution via non-convex \(L_1\) based methods
- Computing sparse representation in a highly coherent dictionary based on difference of \(L_1\) and \(L_2\)
- Revisiting compressed sensing: exploiting the efficiency of simplex and sparsification methods
- Sparse approximation is provably hard under coherent dictionaries
- Data-driven tight frame construction and image denoising
- Stability Selection
- Direct sparse deblurring
- 1-bit compressive sensing: reformulation and RRSP-based sign recovery theory
- Robust classifier using distance-based representation with square weights
- On the optimality of the orthogonal greedy algorithm for \(\mu\)-coherent dictionaries
- Random sampling of sparse trigonometric polynomials. II: Orthogonal matching pursuit versus basis pursuit
- Equivalence and strong equivalence between the sparsest and least \(\ell _1\)-norm nonnegative solutions of linear systems and their applications
- A note on sparse least-squares regression
- Quasi-linear compressed sensing
- A survey of compressed sensing
- K-hyperline clustering learning for sparse component analysis
- On performance of greedy algorithms
- KFCE: a dictionary generation algorithm for sparse representation
- The geometry of off-the-grid compressed sensing
- Accelerated iterative hard thresholding algorithm for \(l_0\) regularized regression problem
- Sparse microwave imaging: principles and applications
- Statistical significance in high-dimensional linear models
- Recovery of high-dimensional sparse signals via \(\ell_1\)-minimization
- Matrix recipes for hard thresholding methods
- Autoregressive process modeling via the Lasso procedure
- Exact optimization for the \(\ell^1\)-compressive sensing problem using a modified Dantzig-Wolfe method
- On greedy algorithms for dictionaries with bounded cumulative coherence
- Lasso-type recovery of sparse representations for high-dimensional data
- Structured sparsity through convex optimization
- High-dimensional variable selection
- Sparse approximation and recovery by greedy algorithms in Banach spaces
- Model decomposition and reduction tools for large-scale networks in systems biology
- Sparsity in time-frequency representations
- A new computational method for the sparsest solutions to systems of linear equations
- Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
- "Grouping strategies and thresholding for high dimensional linear models": discussion
- Beyond sparsity: recovering structured representations by \({\ell}^1\) minimization and greedy algorithms
- PROMP: a sparse recovery approach to lattice-valued signals
- Surveying and comparing simultaneous sparse approximation (or group-lasso) algorithms
- Matrix-wise \(\ell_0\)-constrained sparse nonnegative least squares
- Frame permutation quantization
- Error estimates for orthogonal matching pursuit and random dictionaries
- A simple test to check the optimality of a sparse signal approximation
- Optimization over finite frame varieties and structured dictionary design
- Sparse conjugate directions pursuit with application to fixed-size kernel models
- Geometric separation by single-pass alternating thresholding
- Stability and robustness of weak orthogonal matching pursuits
- Directional Haar wavelet frames on triangles
- Nonconvex sorted \(\ell_1\) minimization for sparse approximation
- One condition for solution uniqueness and robustness of both \(\ell_1\)-synthesis and \(\ell_1\)-analysis minimizations
- Independent Component Analysis and Blind Signal Separation
- Highly sparse representations from dictionaries are unique and independent of the sparseness measure
- Atoms of all channels, unite! Average case analysis of multi-channel sparse recovery using greedy algorithms
- Enhancing sparsity of Hermite polynomial expansions by iterative rotations
- Source localization using a sparse representation framework to achieve superresolution
This page was built for publication: Greed is Good: Algorithmic Results for Sparse Approximation