Simple bounds for recovering low-complexity models
From MaRDI portal
Abstract: This note presents a unified analysis of the recovery of simple objects from random linear measurements. When the linear functionals are Gaussian, we show that an s-sparse vector in R^n can be efficiently recovered from 2s log n measurements with high probability, and that a rank-r, n × n matrix can be efficiently recovered from r(6n − 5r) measurements with high probability. For sparse vectors, this is within an additive factor of the best known nonasymptotic bounds; for low-rank matrices, it matches the best known bounds. We present a parallel analysis for block-sparse vectors, obtaining similarly tight bounds. In the case of sparse and block-sparse signals, we additionally demonstrate that our bounds are only slightly weakened when the measurement map is a random sign matrix. Our results are based on analyzing a particular dual point which certifies optimality conditions of the respective convex programming problem. Our calculations rely only on standard large-deviation inequalities, and our analysis is self-contained.
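As a rough numerical illustration (not part of the original record), the sparse-vector result can be checked by drawing a Gaussian measurement map with m ≈ 2s log n rows and recovering the signal via basis pursuit (minimize ||z||_1 subject to Az = y), here recast as a linear program and solved with SciPy's `linprog`. All sizes and the solver choice are illustrative assumptions, not from the paper.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, s = 100, 5                          # ambient dimension, sparsity (illustrative sizes)
m = int(np.ceil(2 * s * np.log(n)))    # ~2 s log n measurements, per the abstract's bound

# Build an s-sparse vector with a random support and Gaussian entries
x = np.zeros(n)
support = rng.choice(n, size=s, replace=False)
x[support] = rng.standard_normal(s)

A = rng.standard_normal((m, n))        # Gaussian measurement map
y = A @ x                              # noiseless measurements

# Basis pursuit as an LP: split z = z_pos - z_neg with z_pos, z_neg >= 0,
# then min 1^T (z_pos + z_neg) subject to [A, -A] (z_pos; z_neg) = y.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=y, bounds=[(0, None)] * (2 * n))
x_hat = res.x[:n] - res.x[n:]

print(np.max(np.abs(x_hat - x)))       # max entrywise recovery error
```

With these sizes (m = 47 for n = 100, s = 5), the ℓ1 minimizer typically coincides with the true sparse vector, consistent with the high-probability guarantee stated in the abstract.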
Cites work
- scientific article; zbMATH DE number 4061904
- scientific article; zbMATH DE number 3673370
- A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers
- Adaptive estimation of a quadratic functional by model selection
- Compressed sensing
- Counting faces of randomly projected polytopes when the projection radically lowers dimension
- Database-friendly random projections: Johnson-Lindenstrauss with binary coins
- Exact matrix completion via convex optimization
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- Invertibility of symmetric random matrices
- Local operator theory, random matrices and Banach spaces
- On Sparse Representations in Arbitrary Redundant Bases
- Probability Inequalities for Sums of Bounded Random Variables
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- The convex geometry of linear inverse problems
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
Cited in (36 documents)
- Sparse recovery under weak moment assumptions
- Typical \(l_1\)-recovery limit of sparse vectors represented by concatenations of random orthogonal matrices
- Sparse recovery with pre-Gaussian random matrices
- Lower bounds for adaptive sparse recovery
- Sparse signal recovery from quadratic measurements via convex programming
- Nonuniform sparse recovery with subgaussian matrices
- Super-resolution of positive sources on an arbitrarily fine grid
- A Unified Recovery of Structured Signals Using Atomic Norm
- Sparse power factorization: balancing peakiness and sample complexity
- Lower bounds for sparse recovery
- scientific article; zbMATH DE number 7306914
- Sparse disjointed recovery from noninflating measurements
- Error bounds for sparse and low-rank matrix approximation based on restricted isometric properties
- A perturbation inequality for concave functions of singular values and its applications in low-rank matrix recovery
- On two random models in data analysis
- Sparse recovery from extreme eigenvalues deviation inequalities
- Improved bounds for sparse recovery from subsampled random convolutions
- scientific article; zbMATH DE number 7306926
- Model selection with low complexity priors
- Non-convex matrix completion and related problems via strong duality
- Decomposable norm minimization with proximal-gradient homotopy algorithm
- scientific article; zbMATH DE number 7583424
- A theory of optimal convex regularization for low-dimensional recovery
- On sparse reconstruction from Fourier and Gaussian measurements
- On uniqueness guarantees of solution in convex regularized linear inverse problems
- Sharp MSE bounds for proximal denoising
- Low complexity regularization of linear inverse problems
- On the solution uniqueness characterization in the L1 norm and polyhedral gauge recovery
- Computational complexity versus statistical performance on sparse recovery problems
- Living on the edge: phase transitions in convex programs with random data
- On model selection consistency of regularized M-estimators
- New bounds for RIC in compressed sensing
- Necessary and sufficient conditions of solution uniqueness in 1-norm minimization
- High-dimensional dynamic systems identification with additional constraints
- Convex cardinal shape composition
- A note on the sample complexity of the Er-SpUD algorithm by Spielman, Wang and Wright for exact recovery of sparsely used dictionaries
This page was built for publication: Simple bounds for recovering low-complexity models
MaRDI item Q378116