Signal recovery under cumulative coherence
From MaRDI portal
Publication: 1624658
DOI: 10.1016/j.cam.2018.07.019 · zbMath: 1405.94025 · arXiv: 1806.10790 · OpenAlex: W2810513605 · MaRDI QID: Q1624658
Publication date: 16 November 2018
Published in: Journal of Computational and Applied Mathematics
Full work available at URL: https://arxiv.org/abs/1806.10790
Keywords: Dantzig selector; oracle inequality; Lasso; restricted eigenvalue condition; cumulative coherence; closeness of prediction loss
Nonparametric estimation (62G05) · Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
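As context for the record's central notion, the cumulative coherence (also called the Babel function) of a dictionary measures how strongly any single atom correlates with the best \(s\) other atoms. The sketch below, a minimal illustration assuming the standard definition \(\mu_1(s) = \max_i \max_{|S|=s,\, i \notin S} \sum_{j \in S} |\langle a_i, a_j \rangle|\) for unit-norm columns, is not taken from the paper itself:

```python
import numpy as np

def cumulative_coherence(A, s):
    """Babel function mu_1(s): for each column of A, the sum of the s
    largest absolute inner products with the other unit-norm columns;
    return the maximum over columns."""
    A = A / np.linalg.norm(A, axis=0)          # normalize columns
    G = np.abs(A.T @ A)                        # absolute Gram matrix
    np.fill_diagonal(G, 0.0)                   # exclude self inner products
    rows_desc = np.sort(G, axis=1)[:, ::-1]    # sort each row descending
    return rows_desc[:, :s].sum(axis=1).max()

# For an orthonormal basis, all off-diagonal inner products vanish,
# so mu_1(s) = 0 for every s.
print(cumulative_coherence(np.eye(4), 2))  # 0.0
```

Note that \(\mu_1(1)\) reduces to the ordinary mutual coherence, the largest absolute inner product between two distinct normalized atoms.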
Related Items
- Group sparse recovery via group square-root elastic net and the iterative multivariate thresholding-based algorithm
- Robust recovery of signals with partially known support information using weighted BPDN
- Signal recovery under mutual incoherence property and oracle inequalities
- Low-rank matrix recovery via regularized nuclear norm minimization
- An improved bound of cumulative coherence for signal recovery
- Robust sparse recovery via a novel convex model
Cites Work
- A mathematical introduction to compressive sensing
- On greedy algorithms for dictionaries with bounded cumulative coherence
- A remark on the Lasso and the Dantzig selector
- Sharp RIP bound for sparse signal and low-rank matrix recovery
- Sparse recovery with coherent tight frames via analysis Dantzig selector and analysis LASSO
- Stable recovery of analysis based approaches
- One condition for solution uniqueness and robustness of both \(\ell_1\)-synthesis and \(\ell_1\)-analysis minimizations
- Simultaneous analysis of Lasso and Dantzig selector
- Deterministic sampling of sparse trigonometric polynomials
- Stability and robustness of \(\ell_1\)-minimizations with Weibull matrices and redundant dictionaries
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Improved Iteratively Reweighted Least Squares for Unconstrained Smoothed \(\ell_q\) Minimization
- Analysis Recovery With Coherent Frames and Correlated Measurements
- Compressed Sensing Matrices From Fourier Matrices
- Compressed sensing and best \(k\)-term approximation
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Decoding by Linear Programming
- Stable recovery of sparse overcomplete representations in the presence of noise
- Greed is Good: Algorithmic Results for Sparse Approximation
- Just relax: convex programming methods for identifying sparse signals in noise
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Atomic Decomposition by Basis Pursuit
- Ideal spatial adaptation by wavelet shrinkage
- Uncertainty principles and ideal atomic decomposition
- A Proof of Conjecture on Restricted Isometry Property Constants \(\delta_{tk}\) \((0<t<\frac{4}{3})\)
- Dictionary Preconditioning for Greedy Algorithms
- Sparse Approximation Property and Stable Recovery of Sparse Signals From Noisy Measurements
- Fast and Efficient Compressive Sensing Using Structurally Random Matrices
- New Bounds for Restricted Isometry Constants With Coherent Tight Frames
- Compressed Sensing and Affine Rank Minimization Under Restricted Isometry
- Smoothing and Decomposition for Analysis Sparse Recovery
- Further Results on Stable Recovery of Sparse Overcomplete Representations in the Presence of Noise
- On Recovery of Sparse Signals via \(\ell_1\) Minimization
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- New Bounds for Restricted Isometry Constants
- Stable Recovery of Sparse Signals and an Oracle Inequality
- Analysis versus synthesis in signal priors
- Sparse Representation of a Polytope and Recovery of Sparse Signals and Low-Rank Matrices
- Optimally sparse representation in general (nonorthogonal) dictionaries via \(\ell_1\) minimization
- Stable signal recovery from incomplete and inaccurate measurements
- Compressed sensing