One condition for solution uniqueness and robustness of both \(\ell_1\)-synthesis and \(\ell_1\)-analysis minimizations
From MaRDI portal
Publication:2374380
Abstract: The \(\ell_1\)-synthesis model and the \(\ell_1\)-analysis model recover structured signals from their undersampled measurements. The solution of the former is a sparse sum of dictionary atoms, and that of the latter makes sparse correlations with dictionary atoms. This paper addresses the question: when can we trust these models to recover specific signals? We answer the question with a condition that is both necessary and sufficient to guarantee that the recovery is unique and exact and, in the presence of measurement noise, robust. The condition is one-for-all in the sense that it applies to both the \(\ell_1\)-synthesis and \(\ell_1\)-analysis models, to both their constrained and unconstrained formulations, and to both the exact recovery and robust recovery cases. Furthermore, a convex infinity-norm program is introduced for numerically verifying the condition. A comprehensive comparison with related existing conditions is included.
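The two models from the abstract can be illustrated with a small numerical sketch. Both are linear programs: synthesis minimizes \(\|z\|_1\) subject to \(ADz = y\) (the signal is \(x = Dz\)), while analysis minimizes \(\|\Omega x\|_1\) subject to \(Ax = y\). The sketch below, a minimal illustration rather than the paper's verification program, uses SciPy's `linprog` on a random instance; the helper names `l1_synthesis` and `l1_analysis` and the choice \(\Omega = D^\top\) are assumptions for illustration only.

```python
import numpy as np
from scipy.optimize import linprog

def l1_synthesis(A, D, y):
    """min ||z||_1 s.t. A D z = y, via the split z = u - v with u, v >= 0."""
    M = A @ D
    _, p = M.shape
    c = np.ones(2 * p)                       # objective: sum(u) + sum(v) = ||z||_1
    A_eq = np.hstack([M, -M])                # M (u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * p))
    z = res.x[:p] - res.x[p:]
    return D @ z, z

def l1_analysis(A, Omega, y):
    """min ||Omega x||_1 s.t. A x = y, with slack t >= |Omega x|."""
    m, n = A.shape
    k = Omega.shape[0]
    c = np.concatenate([np.zeros(n), np.ones(k)])          # minimize sum(t)
    A_ub = np.block([[Omega, -np.eye(k)],                  #  Omega x - t <= 0
                     [-Omega, -np.eye(k)]])                # -Omega x - t <= 0
    b_ub = np.zeros(2 * k)
    A_eq = np.hstack([A, np.zeros((m, k))])                # A x = y
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
                  bounds=[(None, None)] * n + [(0, None)] * k)
    return res.x[:n]

rng = np.random.default_rng(0)
n, p, m = 8, 12, 6                            # signal dim, atoms, measurements
D = rng.standard_normal((n, p))               # redundant dictionary (n < p)
A = rng.standard_normal((m, n))               # measurement matrix

z_true = np.zeros(p)
z_true[[1, 5]] = [2.0, -1.5]                  # sparse synthesis coefficients
x_true = D @ z_true
y = A @ x_true                                # undersampled measurements

x_syn, z_hat = l1_synthesis(A, D, y)
x_ana = l1_analysis(A, D.T, y)                # analysis with Omega = D^T (assumed)
```

Whether either solution equals `x_true` is exactly what the paper's necessary-and-sufficient condition decides for a given instance; the LPs above only guarantee feasibility (`A @ x_syn ≈ y`) and ℓ1-optimality among feasible points.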
Recommendations
- Necessary and sufficient conditions of solution uniqueness in 1-norm minimization
- \(\ell^1\)-analysis minimization and generalized (co-)sparsity: when does recovery succeed?
- A necessary and sufficient condition for exact sparse recovery by \(\ell_1\) minimization
- Solution uniqueness of convex piecewise affine functions based optimization with applications to constrained \(\ell_1\) minimization
- Sparse phase retrieval via ℓp (0 < p ≤ 1) minimization
Cites work
- A Probabilistic and RIPless Theory of Compressed Sensing
- A mathematical introduction to compressive sensing
- Analysis \(\ell_1\)-recovery with frames and Gaussian measurements
- Analysis versus synthesis in signal priors
- Compressed Sensing With General Frames via Optimal-Dual-Based $\ell _{1}$-Analysis
- Compressed sensing and best \(k\)-term approximation
- Compressed sensing with coherent and redundant dictionaries
- Convergence rates of convex variational regularization
- Greed is Good: Algorithmic Results for Sparse Approximation
- Linear convergence rates for Tikhonov regularization with positively homogeneous functionals
- Local behavior of sparse analysis regularization: applications to risk estimation
- Model selection with low complexity priors
- Near-Optimal Compressed Sensing Guarantees for Total Variation Minimization
- Necessary and sufficient conditions for linear convergence of \(\ell^1\)-regularization
- Necessary and sufficient conditions of solution uniqueness in 1-norm minimization
- Nonlinear total variation based noise removal algorithms
- On Sparse Representations in Arbitrary Redundant Bases
- On uniqueness guarantees of solution in convex regularized linear inverse problems
- Recovery of Exact Sparse Representations in the Presence of Bounded Noise
- Remote sensing via \(\ell_1\)-minimization
- Robust analysis ℓ1-recovery from Gaussian measurements and total variation minimization
- Sparse nonnegative solution of underdetermined linear equations by linear programming
- Stability and robustness of \(\ell_1\)-minimizations with Weibull matrices and redundant dictionaries
- Stable Signal Reconstruction via $\ell^1$-Minimization in Redundant, Non-Tight Frames
- Stable and Robust Sampling Strategies for Compressive Imaging
- Stable image reconstruction using total variation minimization
- The Lasso problem and uniqueness
- The convex geometry of linear inverse problems
- The cosparse analysis model and algorithms
- The restricted isometry property and its implications for compressed sensing
- Theory of compressive sensing via \(\ell_1\)-minimization: a non-RIP analysis and extensions
Cited in (18)
- Weak stability of \(\ell_1\)-minimization methods in sparse data reconstruction
- Learning Regularization Parameter-Maps for Variational Image Reconstruction Using Deep Neural Networks and Algorithm Unrolling
- Necessary and sufficient conditions of solution uniqueness in 1-norm minimization
- Signal recovery under cumulative coherence
- Cardinality minimization, constraints, and regularization: a survey
- A null space analysis of the \(\ell_1\)-synthesis method in dictionary-based compressed sensing
- Robust recovery of signals with partially known support information using weighted BPDN
- Performance analysis for unconstrained analysis based approaches
- Analysis vs synthesis with structure -- an investigation of union of subspace models on graphs
- The homotopy method revisited: computing solution paths of \(\ell_1\)-regularized problems
- Solution uniqueness of convex piecewise affine functions based optimization with applications to constrained \(\ell_1\) minimization
- Quadratic growth conditions and uniqueness of optimal solution to Lasso
- Signal recovery under mutual incoherence property and oracle inequalities
- On uniqueness guarantees of solution in convex regularized linear inverse problems
- RIP-based performance guarantee for low-tubal-rank tensor recovery
- On the solution uniqueness characterization in the L1 norm and polyhedral gauge recovery
- The Geometry of Sparse Analysis Regularization
- Exact matrix completion based on low rank Hankel structure in the Fourier domain