Greed is Good: Algorithmic Results for Sparse Approximation
From MaRDI portal
Publication: 3547716
DOI: 10.1109/TIT.2004.834793 · zbMATH Open: 1288.94019 · Wikidata: Q59750826 · Scholia: Q59750826 · MaRDI QID: Q3547716 · FDO: Q3547716
Authors: Joel A. Tropp
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Recommendations
- Signal Recovery From Random Measurements Via Orthogonal Matching Pursuit
- Algorithms for simultaneous sparse approximation. I: Greedy pursuit
- Greedy approximation
- The exact recovery of sparse signals via orthogonal matching pursuit
- Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization
Classification: Applications of mathematical programming (90C90) · Signal theory (characterization, reconstruction, filtering, etc.) (94A12) · Approximation by other special function classes (41A30)
Cited in (first 100 items shown)
- Multi-layer sparse coding: the holistic way
- Correlations of random classifiers on large data sets
- Search for sparse solutions of super-large systems with a tensor structure
- Angular scattering function estimation using deep neural networks
- Hierarchical compressed sensing
- Model selection for high-dimensional linear regression with dependent observations
- Robust and resource-efficient identification of two hidden layer neural networks
- Limited-complexity controller tuning: a set membership data-driven approach
- An automatic and parameter-free information-based method for sparse representation in wavelet bases
- A performance guarantee for orthogonal matching pursuit using mutual coherence
- Running time analysis of the (1+1)-EA for robust linear optimization
- Analysis of the self projected matching pursuit algorithm
- On collaborative compressive sensing systems: the framework, design, and algorithm
- Greedy expansions in Hilbert spaces
- Boosting with structural sparsity: a differential inclusion approach
- Optimized projections for compressed sensing via rank-constrained nearest correlation matrix
- A generalized class of hard thresholding algorithms for sparse signal recovery
- A component Lasso
- Sparse set membership identification of nonlinear functions and application to fault detection
- A look at robustness and stability of \(\ell_1\)-versus \(\ell_0\)-regularization: discussion of papers by Bertsimas et al. and Hastie et al.
- Alternating direction method of multipliers for solving dictionary learning models
- Greedy orthogonal matching pursuit for subspace clustering to improve graph connectivity
- A modified orthogonal matching pursuit for construction of sparse probabilistic Boolean networks
- Stochastic greedy algorithms for multiple measurement vectors
- On the Probabilistic Cauchy Theory for Nonlinear Dispersive PDEs
- Stable restoration and separation of approximately sparse signals
- A smoothing inertial neural network for sparse signal reconstruction with noise measurements via \(L_p-L_1\) minimization
- Smoothing inertial neurodynamic approach for sparse signal reconstruction via \(L_p\)-norm minimization
- Beyond coherence: Recovering structured time-frequency representations
- Constructing New Weighted ℓ1-Algorithms for the Sparsest Points of Polyhedral Sets
- The finite steps of convergence of the fast thresholding algorithms with \(f\)-feedbacks in compressed sensing
- A multiple measurement vector approach to synthetic aperture radar imaging
- Generalized sparse recovery model and its neural dynamical optimization method for compressed sensing
- An efficient algorithm for learning dictionary under coherence constraint
- An efficient algorithm for overcomplete sparsifying transform learning with signal denoising
- Incoherent dictionary learning method based on unit norm tight frame and manifold optimization for sparse representation
- Improving the incoherence of a learned dictionary via rank shrinkage
- Evaluating visual properties via robust HodgeRank
- Approximate submodularity and its applications: subset selection, sparse approximation and dictionary selection
- A fast algorithm for learning overcomplete dictionary for sparse representation based on proximal operators
- Adaptive multi-penalty regularization based on a generalized Lasso path
- Image deblurring with coupled dictionary learning
- A survey on compressive sensing: classical results and recent advancements
- Efficiency of orthogonal super greedy algorithm under the restricted isometry property
- Sparse signal recovery via ECME thresholding pursuits
- Super-resolution for doubly-dispersive channel estimation
- Iterative positive thresholding algorithm for non-negative sparse optimization
- A review on restoration of seismic wavefields based on regularization and compressive sensing
- A smoothing method for sparse optimization over convex sets
- A forward-backward greedy approach for sparse multiscale learning
- Subspace learning by \(\ell^0\)-induced sparsity
- Feature selection by canonical correlation search in high-dimensional multiresponse models with complex group structures
- Weak stability of \(\ell_1\)-minimization methods in sparse data reconstruction
- Sparse Bayesian imaging of solar flares
- Compressive Sensing
- Orthogonal one step greedy procedure for heteroscedastic linear models
- Nonmonotone spectral gradient method for sparse recovery
- Locally sparse reconstruction using the \(\ell^{1,\infty}\)-norm
- A tensor decomposition based multiway structured sparse SAR imaging algorithm with Kronecker constraint
- Variational texture synthesis with sparsity and spectrum constraints
- \(\mathrm{L_1RIP}\)-based robust compressed sensing
- On the uniqueness of overcomplete dictionaries, and a practical way to retrieve them
- Approximation accuracy, gradient methods, and error bound for structured convex optimization
- Sparse representations and approximation theory
- Some greedy algorithms for sparse polynomial chaos expansions
- Iterative hard thresholding methods for \(l_0\) regularized convex cone programming
- Simultaneous image fusion and demosaicing via compressive sensing
- Nonnegative adaptive Lasso for ultra-high dimensional regression models and a two-stage method applied in financial modeling
- A note on the complexity of \(L _{p }\) minimization
- Non-negative least squares for high-dimensional linear models: consistency and sparse recovery without regularization
- A primal dual active set with continuation algorithm for the \(\ell^0\)-regularized optimization problem
- On Lebesgue-type inequalities for greedy approximation
- Iterative thresholding for sparse approximations
- Ways to sparse representation: An overview
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Algorithms for simultaneous sparse approximation. II: Convex relaxation
- Convolutional neural networks analyzed via convolutional sparse coding
- Phase transitions for greedy sparse approximation algorithms
- Sparsest solutions of underdetermined linear systems via \( \ell _q\)-minimization for \(0<q\leqslant 1\)
- On the existence of equiangular tight frames
- Spectral compressive sensing
- Theory and applications of compressed sensing
- Analysis of orthogonal multi-matching pursuit under restricted isometry property
- Robustness of orthogonal matching pursuit under restricted isometry property
- Algorithms for simultaneous sparse approximation. I: Greedy pursuit
- Smooth sparse coding via marginal regression for learning sparse representations
- Regularization techniques and suboptimal solutions to optimization problems in learning from data
- Proximity algorithms for the L1/TV image denoising model
- Two are better than one: fundamental parameters of frame coherence
- Compressed sensing and dynamic mode decomposition
- A class of deterministic sensing matrices and their application in harmonic detection
- Compressed sensing with coherent and redundant dictionaries
- Deterministic convolutional compressed sensing matrices
- Restricted isometries for partial random circulant matrices
- Microlocal analysis of the geometric separation problem
- A continuous exact \(\ell_0\) penalty (CEL0) for least squares regularized problem
- Optimal non-linear models for sparsity and sampling
- Learning circulant sensing kernels
- Fast and scalable Lasso via stochastic Frank-Wolfe methods with a convergence guarantee