Greed is Good: Algorithmic Results for Sparse Approximation
From MaRDI portal
Publication:3547716
Recommendations
- Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
- Algorithms for simultaneous sparse approximation. I: Greedy pursuit
- Greedy approximation
- The exact recovery of sparse signals via orthogonal matching pursuit
- Optimally sparse representation in general (nonorthogonal) dictionaries via \(\ell_1\) minimization
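Several of the recommended and citing works analyze orthogonal matching pursuit (OMP), the greedy algorithm treated in this publication. As a rough illustration only (the function name and interface below are not from the paper), the greedy selection loop can be sketched in NumPy as: pick the dictionary atom most correlated with the residual, refit by least squares on the selected atoms, and repeat.

```python
import numpy as np

def omp(D, y, k):
    """Greedy orthogonal matching pursuit sketch: select k columns (atoms)
    of dictionary D to approximate the signal y."""
    residual = y.copy()
    support = []
    coef = np.zeros(0)
    for _ in range(k):
        # Greedy step: choose the atom most correlated with the residual.
        j = int(np.argmax(np.abs(D.T @ residual)))
        support.append(j)
        # Orthogonal step: refit coefficients on the chosen atoms by least squares.
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
        residual = y - D[:, support] @ coef
    x = np.zeros(D.shape[1])
    x[support] = coef
    return x
```

For a well-conditioned dictionary, the least-squares refit makes the residual orthogonal to every selected atom, so no atom is chosen twice.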
Cited in (only showing first 100 items)
- Theory and applications of compressed sensing
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- Sparse approximation and recovery by greedy algorithms in Banach spaces
- Compressive Sensing
- Variational texture synthesis with sparsity and spectrum constraints
- Computing sparse representation in a highly coherent dictionary based on difference of \(L_1\) and \(L_2\)
- Simultaneous image fusion and demosaicing via compressive sensing
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Nonnegative adaptive Lasso for ultra-high dimensional regression models and a two-stage method applied in financial modeling
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- Stability Selection
- On Lebesgue-type inequalities for greedy approximation
- 1-bit compressive sensing: reformulation and RRSP-based sign recovery theory
- Direct sparse deblurring
- Robust classifier using distance-based representation with square weights
- A new computational method for the sparsest solutions to systems of linear equations
- Equivalence and strong equivalence between the sparsest and least \(\ell_1\)-norm nonnegative solutions of linear systems and their applications
- Autoregressive process modeling via the Lasso procedure
- The geometry of off-the-grid compressed sensing
- Analysis of orthogonal multi-matching pursuit under restricted isometry property
- Robustness of orthogonal matching pursuit under restricted isometry property
- Sparse representations and approximation theory
- On the uniqueness of overcomplete dictionaries, and a practical way to retrieve them
- Algorithms for simultaneous sparse approximation. I: Greedy pursuit
- Accelerated iterative hard thresholding algorithm for \(l_0\) regularized regression problem
- Exact optimization for the \(\ell ^{1}\)-compressive sensing problem using a modified Dantzig-Wolfe method
- Some greedy algorithms for sparse polynomial chaos expansions
- On the existence of equiangular tight frames
- Iterative thresholding for sparse approximations
- Iterative hard thresholding methods for \(l_0\) regularized convex cone programming
- On the optimality of the orthogonal greedy algorithm for \(\mu\)-coherent dictionaries
- Lasso-type recovery of sparse representations for high-dimensional data
- Restricted isometries for partial random circulant matrices
- Convolutional neural networks analyzed via convolutional sparse coding
- On greedy algorithms for dictionaries with bounded cumulative coherence
- K-hyperline clustering learning for sparse component analysis
- Sparse microwave imaging: principles and applications
- Compressed sensing with coherent and redundant dictionaries
- Statistical significance in high-dimensional linear models
- Recovery of high-dimensional sparse signals via \(\ell_1\)-minimization
- On performance of greedy algorithms
- Spectral compressive sensing
- Microlocal analysis of the geometric separation problem
- Matrix recipes for hard thresholding methods
- Two are better than one: fundamental parameters of frame coherence
- Random sampling of sparse trigonometric polynomials. II: Orthogonal matching pursuit versus basis pursuit
- Model decomposition and reduction tools for large-scale networks in systems biology
- Structured sparsity through convex optimization
- Orthogonal one step greedy procedure for heteroscedastic linear models
- Nonmonotone spectral gradient method for sparse recovery
- Locally sparse reconstruction using the \(\ell^{1,\infty}\)-norm
- A note on sparse least-squares regression
- Smooth sparse coding via marginal regression for learning sparse representations
- High-dimensional variable selection
- KFCE: a dictionary generation algorithm for sparse representation
- A continuous exact \(\ell_0\) penalty (CEL0) for least squares regularized problem
- A note on the complexity of \(L_p\) minimization
- Ways to sparse representation: An overview
- Compressed sensing and dynamic mode decomposition
- Phase transitions for greedy sparse approximation algorithms
- A class of deterministic sensing matrices and their application in harmonic detection
- Sparsity in time-frequency representations
- Optimal non-linear models for sparsity and sampling
- Deterministic convolutional compressed sensing matrices
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee
- Point source super-resolution via non-convex \(L_1\) based methods
- Asymptotic properties of Lasso+mLS and Lasso+Ridge in sparse high-dimensional linear regression
- Revisiting compressed sensing: exploiting the efficiency of simplex and sparsification methods
- Sparse approximation is provably hard under coherent dictionaries
- A tensor decomposition based multiway structured sparse SAR imaging algorithm with Kronecker constraint
- Regularization techniques and suboptimal solutions to optimization problems in learning from data
- \(\mathrm{L_1RIP}\)-based robust compressed sensing
- Algorithms for simultaneous sparse approximation. II: Convex relaxation
- Proximity algorithms for the L1/TV image denoising model
- Sparsest solutions of underdetermined linear systems via \(\ell_q\)-minimization for \(0<q\leqslant 1\)
- Data-driven tight frame construction and image denoising
- Learning circulant sensing kernels
- A survey of compressed sensing
- Quasi-linear compressed sensing
- Image deblurring with coupled dictionary learning
- Sparse signal recovery via ECME thresholding pursuits
- Optimized projections for compressed sensing via rank-constrained nearest correlation matrix
- A modified orthogonal matching pursuit for construction of sparse probabilistic Boolean networks
- A generalized class of hard thresholding algorithms for sparse signal recovery
- A fast algorithm for learning overcomplete dictionary for sparse representation based on proximal operators
- A component Lasso
- A performance guarantee for orthogonal matching pursuit using mutual coherence
- Running time analysis of the (1+1)-EA for robust linear optimization
- Analysis of the self projected matching pursuit algorithm
- A look at robustness and stability of \(\ell_1\)- versus \(\ell_0\)-regularization: discussion of papers by Bertsimas et al. and Hastie et al.
- Multi-layer sparse coding: the holistic way
- A smoothing inertial neural network for sparse signal reconstruction with noise measurements via \(L_p-L_1\) minimization
- On collaborative compressive sensing systems: the framework, design, and algorithm
- Evaluating visual properties via robust HodgeRank
- Beyond coherence: Recovering structured time-frequency representations
- Subspace learning by \(\ell^0\)-induced sparsity
- A survey on compressive sensing: classical results and recent advancements
- Stochastic greedy algorithms for multiple measurement vectors
- Sparse set membership identification of nonlinear functions and application to fault detection
This page was built for publication: Greed is Good: Algorithmic Results for Sparse Approximation