Accuracy guaranties for \(\ell_{1}\) recovery of block-sparse signals
From MaRDI portal
Publication: Q741817
Abstract: We introduce a general framework to handle structured models (sparse and block-sparse with possibly overlapping blocks). We discuss new methods for their recovery from incomplete observations, corrupted by deterministic and stochastic noise, using block-\(\ell_1\) regularization. While the current theory provides promising bounds for the recovery errors under a number of different, yet mostly hard to verify, conditions, our emphasis is on verifiable conditions on the problem parameters (sensing matrix and block structure) which guarantee accurate recovery. Verifiability of our conditions not only leads to efficiently computable bounds for the recovery error but also allows us to optimize these error bounds with respect to the method parameters, and therefore construct estimators with improved statistical properties. To justify our approach, we also provide an oracle inequality, which links the properties of the proposed recovery algorithms and the best estimation performance. Furthermore, utilizing these verifiable conditions, we develop a computationally cheap alternative to block-\(\ell_1\) minimization, the non-Euclidean Block Matching Pursuit algorithm. We close by presenting a numerical study to investigate the effect of different block regularizations and demonstrate the performance of the proposed recoveries.
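The block-\(\ell_1\) regularization discussed in the abstract penalizes the sum of Euclidean norms of coefficient blocks, which drives entire blocks to zero. As a minimal illustration (not the paper's own algorithm), the sketch below solves the group-lasso-style problem \(\min_x \tfrac12\|Ax-y\|_2^2 + \lambda \sum_b \|x_b\|_2\) by proximal gradient descent with block soft-thresholding; the problem sizes, step rule, and \(\lambda\) are illustrative assumptions.

```python
import numpy as np

def block_soft_threshold(x, blocks, t):
    """Proximal operator of t * sum_b ||x_b||_2: shrink each block's norm by t."""
    out = x.copy()
    for b in blocks:
        norm = np.linalg.norm(x[b])
        out[b] = 0.0 if norm <= t else (1.0 - t / norm) * x[b]
    return out

def block_l1_recover(A, y, blocks, lam, n_iter=500):
    """Proximal-gradient (ISTA) sketch for
       min_x 0.5*||Ax - y||^2 + lam * sum_b ||x_b||_2."""
    L = np.linalg.norm(A, 2) ** 2      # Lipschitz constant of the smooth part
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = block_soft_threshold(x - grad / L, blocks, lam / L)
    return x

# Demo: one active block out of ten, Gaussian sensing matrix (noiseless).
rng = np.random.default_rng(0)
n, k, bsize = 80, 10, 5                # 10 blocks of size 5, ambient dim 50
blocks = [range(i * bsize, (i + 1) * bsize) for i in range(k)]
x_true = np.zeros(k * bsize)
x_true[:bsize] = rng.normal(size=bsize)  # only the first block is active
A = rng.normal(size=(n, k * bsize)) / np.sqrt(n)
y = A @ x_true
x_hat = block_l1_recover(A, y, blocks, lam=0.01)
```

Because the penalty couples all coordinates within a block, the estimate either keeps a block (with its norm shrunk by \(\lambda/L\) per prox step) or zeroes it entirely, which is the structural behavior the abstract's error bounds address.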
Recommendations
- Accuracy Guarantees for \(\ell_1\)-Recovery
- A simple Gaussian measurement bound for exact recovery of block-sparse signals
- Block-Sparse Signals: Uncertainty Relations and Efficient Recovery
- On verifiable sufficient conditions for sparse signal recovery via \(\ell_{1}\) minimization
- A note on guaranteed sparse recovery via \(\ell_1\)-minimization
- On the Reconstruction of Block-Sparse Signals With an Optimal Number of Measurements
- Verifiable conditions of \(\ell_{1}\)-recovery for sparse signals with sign restrictions
- On Recovery of Sparse Signals Via $\ell _{1}$ Minimization
- Block-Sparse Recovery via Convex Optimization
- Estimation of block sparsity in compressive sensing
Cites work
- Accuracy Guarantees for \(\ell_1\)-Recovery
- Block-Sparse Signals: Uncertainty Relations and Efficient Recovery
- Consistency of the group Lasso and multiple kernel learning
- DASSO: Connections Between the Dantzig Selector and Lasso
- Decoding by Linear Programming
- Model Selection and Estimation in Regression with Grouped Variables
- Model-Based Compressive Sensing
- On low rank matrix approximations with applications to synthesis problem in compressed sensing
- On the Reconstruction of Block-Sparse Signals With an Optimal Number of Measurements
- On the asymptotic properties of the group lasso estimator for linear models
- On the conditions used to prove oracle results for the Lasso
- On verifiable sufficient conditions for sparse signal recovery via \(\ell_{1}\) minimization
- Oracle inequalities and optimal inference under group sparsity
- Robust Recovery of Signals From a Structured Union of Subspaces
- Simultaneous analysis of Lasso and Dantzig selector
- Some theoretical results on the grouped variables Lasso
- Sparse representations in unions of bases
- Stable recovery of sparse overcomplete representations in the presence of noise
- Support union recovery in high-dimensional multivariate regression
- The Computational Complexity of the Restricted Isometry Property, the Nullspace Property, and Related Concepts in Compressed Sensing
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- The Group Lasso for Logistic Regression
- The benefit of group sparsity
- The restricted isometry property and its implications for compressed sensing
- Verifiable conditions of \(\ell_{1}\)-recovery for sparse signals with sign restrictions
Cited in (6)
- On asymptotically optimal confidence regions and tests for high-dimensional models
- Verifiable conditions of \(\ell_{1}\)-recovery for sparse signals with sign restrictions
- High-order evaluation complexity for convexly-constrained optimization with non-Lipschitzian group sparsity terms
- Group Sparse Optimization for Images Recovery Using Capped Folded Concave Functions
- Recovering block-structured activations using compressive measurements
- Parameter choices for sparse regularization with the \(\ell_1\) norm
This page was built for publication: Accuracy guaranties for \(\ell_{1}\) recovery of block-sparse signals