Greedy-like algorithms for the cosparse analysis model
From MaRDI portal
Publication:2437331
Abstract: The cosparse analysis model has recently been introduced as an interesting alternative to the standard sparse synthesis approach. A prominent question brought up by this new construction is the analysis pursuit problem: the need to find a signal belonging to this model, given a set of corrupted measurements of it. Several pursuit methods have already been proposed, based on relaxation and on a greedy approach. In this work we pursue this question further and propose a new family of pursuit algorithms for the cosparse analysis model, mimicking the greedy-like methods: compressive sampling matching pursuit (CoSaMP), subspace pursuit (SP), iterative hard thresholding (IHT), and hard thresholding pursuit (HTP). Assuming the availability of a near-optimal projection scheme that finds the nearest cosparse subspace to any vector, we provide performance guarantees for these algorithms. Our theoretical study relies on a restricted isometry property adapted to the context of the cosparse analysis model. We explore the empirical performance of these algorithms by adopting a plain thresholding projection, demonstrating their good behavior.
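The abstract's key ingredients, a cosparse projection realized by plain thresholding and an IHT-style iteration built around it, can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the function names (`cosparse_project`, `analysis_iht`), the conservative step size, and the best-iterate bookkeeping are all choices made here for the sketch. The thresholding projection selects, as cosupport, the `l` rows of the analysis operator with the smallest correlation to the current estimate and orthogonally projects onto their null space.

```python
import numpy as np

def cosparse_project(x, Omega, l):
    """Plain thresholding projection onto l-cosparse vectors:
    take the l rows of Omega where |Omega x| is smallest as the
    cosupport, then orthogonally project x onto the null space
    of that row subset."""
    corr = np.abs(Omega @ x)
    cosupport = np.argsort(corr)[:l]        # rows forced to zero
    O_l = Omega[cosupport]
    # Orthogonal projector onto null(O_l) applied to x:
    # x - pinv(O_l) (O_l x)
    return x - np.linalg.pinv(O_l) @ (O_l @ x)

def analysis_iht(y, M, Omega, l, n_iter=200):
    """IHT-style iteration for the cosparse analysis model (sketch):
    a gradient step on ||y - M x||^2 followed by the thresholding
    projection. Returns the visited iterate (including the zero
    initialization) with the smallest measurement residual."""
    mu = 1.0 / np.linalg.norm(M, 2) ** 2    # conservative step size
    x = np.zeros(M.shape[1])
    best, best_res = x, np.linalg.norm(y)
    for _ in range(n_iter):
        x = cosparse_project(x + mu * (M.T @ (y - M @ x)), Omega, l)
        res = np.linalg.norm(y - M @ x)
        if res < best_res:
            best, best_res = x, res
    return best
```

Note that an exactly `l`-cosparse vector is a fixed point of `cosparse_project`: its `l` zero analysis coefficients are the smallest in magnitude, so the thresholding recovers the true cosupport and the projection leaves the vector unchanged.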
Recommendations
- A modified greedy analysis pursuit algorithm for the cosparse analysis model
- The cosparse analysis model and algorithms
- Phase transitions for greedy sparse approximation algorithms
- Near oracle performance and block analysis of signal space greedy methods
- \(\ell^1\)-analysis minimization and generalized (co-)sparsity: when does recovery succeed?
Cites work
- A Theory for Sampling Signals From a Union of Subspaces
- A simple proof of the restricted isometry property for random matrices
- Adaptive greedy approximations
- Analysis K-SVD: A Dictionary-Learning Algorithm for the Analysis Sparse Model
- Analysis versus synthesis in signal priors
- CoSaMP: Iterative signal recovery from incomplete and inaccurate samples
- Compressed Sensing and Redundant Dictionaries
- Compressed sensing with coherent and redundant dictionaries
- Counting faces of randomly projected polytopes when the projection radically lowers dimension
- Hard thresholding pursuit: an algorithm for compressive sensing
- Iterative hard thresholding for compressed sensing
- Iteratively reweighted least squares minimization for sparse recovery
- Matching pursuits with time-frequency dictionaries
- Matrix recipes for hard thresholding methods
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- New and Improved Johnson–Lindenstrauss Embeddings via the Restricted Isometry Property
- New bounds on the restricted isometry constant \(\delta _{2k}\)
- Non-asymptotic theory of random matrices: extreme singular values
- Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization
- Performance Guarantees of the Thresholding Algorithm for the Cosparse Analysis Model
- RIP-Based Near-Oracle Performance Guarantees for SP, CoSaMP, and IHT
- Robust Sparse Analysis Regularization
- Sampling Theorems for Signals From the Union of Finite-Dimensional Linear Subspaces
- Sparse Recovery With Orthogonal Matching Pursuit Under RIP
- Sparse recovery algorithms: sufficient conditions in terms of restricted isometry constants
- Sparse representations in unions of bases
- Sparsity and Smoothness Via the Fused Lasso
- Subspace Pursuit for Compressive Sensing Signal Reconstruction
- The cosparse analysis model and algorithms
- The restricted isometry property and its implications for compressed sensing
- Uniform uncertainty principle for Bernoulli and subgaussian ensembles
Cited in (25)
- \(\ell^1\)-analysis minimization and generalized (co-)sparsity: when does recovery succeed?
- Compressive sensing with redundant dictionaries and structured measurements
- Sampling rates for \(\ell^1\)-synthesis
- Multi-layer sparse coding: the holistic way
- Sparsity based methods for overparameterized variational problems
- Cosparsity in Compressed Sensing
- Sampling in the analysis transform domain
- Analysis sparse representation for nonnegative signals based on determinant measure by DC programming
- Stochastic greedy algorithms for multiple measurement vectors
- Compressive sensing in acoustic imaging
- Dictionary-sparse recovery via thresholding-based algorithms
- Convergence analysis on the alternating direction method of multipliers for the cosparse optimization problem
- Structure dependent sampling in compressed sensing: theoretical guarantees for tight frames
- Greedy signal space methods for incoherence and beyond
- Near oracle performance and block analysis of signal space greedy methods
- Two new lower bounds for the spark of a matrix
- A modified greedy analysis pursuit algorithm for the cosparse analysis model
- Image reconstruction using analysis model prior
- Structured overcomplete sparsifying transform learning with convergence guarantees and applications
- Generalizing CoSaMP to signals from a union of low dimensional linear subspaces
- Dimensionality reduction with subgaussian matrices: a unified theory
- Analysis \(\ell_1\)-recovery with frames and Gaussian measurements
- Robust analysis ℓ1-recovery from Gaussian measurements and total variation minimization
- The cosparse analysis model and algorithms
- Convergence on thresholding-based algorithms for dictionary-sparse recovery