Sparse recovery under weak moment assumptions
Publication: 520739
DOI: 10.4171/JEMS/682 · zbMATH Open: 1414.62135 · arXiv: 1401.2188 · Wikidata: Q105584474 · Scholia: Q105584474 · MaRDI QID: Q520739 · FDO: Q520739
Authors: Guillaume Lecué, Shahar Mendelson
Publication date: 5 April 2017
Published in: Journal of the European Mathematical Society (JEMS)
Abstract: We prove that i.i.d. random vectors that satisfy a rather weak moment assumption can be used as measurement vectors in Compressed Sensing, and the number of measurements required for exact reconstruction is the same as the best possible estimate -- exhibited by a random Gaussian matrix. We also prove that this moment condition is necessary, up to a factor. Applications to the Compatibility Condition and the Restricted Eigenvalue Condition in the noisy setup and to properties of neighbourly random polytopes are also discussed.
Full work available at URL: https://arxiv.org/abs/1401.2188
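The abstract's claim can be illustrated numerically: exact sparse recovery by ℓ1-minimization still succeeds when the measurement vectors are heavy-tailed rather than Gaussian. The sketch below is not from the paper; the Student-t measurement distribution, the problem sizes, and the basis-pursuit linear program solved with SciPy's linprog are illustrative choices made here, under the assumption of noiseless measurements.

```python
# Minimal sketch (illustrative, not the authors' construction): recover an
# s-sparse vector by basis pursuit (min ||x||_1 s.t. Ax = y) when the entries
# of A are i.i.d. heavy-tailed, e.g. Student-t with few finite moments.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

n, N, s = 80, 400, 5   # measurements, ambient dimension, sparsity (illustrative sizes)
df = 5                 # Student-t degrees of freedom: only moments of order < 5 exist

# Heavy-tailed measurement matrix, normalised to unit-variance entries.
A = rng.standard_t(df, size=(n, N)) / np.sqrt(df / (df - 2))

# s-sparse ground truth and exact (noiseless) measurements.
x_true = np.zeros(N)
support = rng.choice(N, size=s, replace=False)
x_true[support] = rng.standard_normal(s)
y = A @ x_true

# Basis pursuit as a linear program in variables (x, t):
# minimise sum(t) subject to -t <= x <= t and Ax = y.
c = np.concatenate([np.zeros(N), np.ones(N)])
I = np.eye(N)
A_ub = np.block([[I, -I], [-I, -I]])   # encodes x - t <= 0 and -x - t <= 0
b_ub = np.zeros(2 * N)
A_eq = np.hstack([A, np.zeros((n, N))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * N + [(0, None)] * N, method="highs")
x_hat = res.x[:N]

print("recovery error:", np.linalg.norm(x_hat - x_true))
```

With these sizes the reported error is near machine precision, mirroring the paper's message that Gaussian-type measurement counts suffice for exact reconstruction under much weaker moment assumptions.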
Recommendations
- Nonuniform sparse recovery with subgaussian matrices
- Improved bounds for sparse recovery from subsampled random convolutions
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Simple bounds for recovering low-complexity models
- The restricted isometry property and its implications for compressed sensing
Cites Work
- Weak convergence and empirical processes. With applications to statistics
- Title not available
- Statistics for high-dimensional data. Methods, theory and applications.
- Simultaneous analysis of Lasso and Dantzig selector
- The Dantzig selector: statistical estimation when \(p\) is much larger than \(n\). (With discussions and rejoinder).
- Restricted eigenvalue properties for correlated Gaussian designs
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Title not available
- Concentration inequalities. A nonasymptotic theory of independence
- Optimally sparse representation in general (nonorthogonal) dictionaries via ℓ1 minimization
- Gaussian model selection
- Stable signal recovery from incomplete and inaccurate measurements
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Reconstruction From Anisotropic Random Measurements
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Compressed sensing
- Concentration inequalities and model selection. École d'Été de Probabilités de Saint-Flour XXXIII -- 2003.
- A mathematical introduction to compressive sensing
- Uncertainty principles and ideal atomic decomposition
- Cube Slicing in \(\mathbb{R}^n\)
- Sparse Approximate Solutions to Linear Systems
- Stability and robustness of \(\ell_1\)-minimizations with Weibull matrices and redundant dictionaries
- On sparse reconstruction from Fourier and Gaussian measurements
- Weakly decomposable regularization penalties and structured sparsity
- The restricted isometry property and its implications for compressed sensing
- Uniform uncertainty principle for Bernoulli and subgaussian ensembles
- Central limit theorems for empirical measures
- Atomic decomposition by basis pursuit
- Learning without concentration
- Bounding the smallest singular value of a random matrix without concentration
- Covariance estimation for distributions with \({2+\varepsilon}\) moments
- On higher order isotropy conditions and lower bounds for sparse quadratic forms
- A remark on the diameter of random sections of convex bodies
- Small ball probabilities for linear images of high-dimensional distributions
- Estimation of moments of sums of independent real random variables
- The lower tail of random quadratic forms with applications to ordinary least squares
- Restricted isometry property of matrices with independent columns and neighborly polytopes by random sampling
- Linear Inversion of Band-Limited Reflection Seismograms
- Title not available
- Signal Recovery and the Large Sieve
- On tight bounds for the Lasso
Cited In (41)
- Noise covariance estimation in multi-task high-dimensional linear models
- Dimensionality reduction with subgaussian matrices: a unified theory
- The gap between the null space property and the restricted isometry property
- Reducing effects of bad data using variance based joint sparsity recovery
- Dictionary-sparse recovery from heavy-tailed measurements
- RIPless compressed sensing from anisotropic measurements
- On the interval of fluctuation of the singular values of random matrices
- Non-Gaussian hyperplane tessellations and robust one-bit compressed sensing
- Nonuniform sparse recovery with subgaussian matrices
- Performance of empirical risk minimization in linear aggregation
- Regularization and the small-ball method. I: Sparse recovery
- Controlling the least eigenvalue of a random Gram matrix
- Sample average approximation with heavier tails II: localization in stochastic convex optimization and persistence results for the Lasso
- Sparse recovery in probability via \(l_q\)-minimization with Weibull random matrices for \(0 < q\leq 1\)
- Slope meets Lasso: improved oracle bounds and optimality
- Convergence rates of least squares regression estimators with heavy-tailed errors
- Column normalization of a random measurement matrix
- Phase retrieval with PhaseLift algorithm
- Regularization and the small-ball method. II: Complexity dependent error rates
- Stable recovery and the coordinate small-ball behaviour of random vectors
- Sparse recovery from extreme eigenvalues deviation inequalities
- Conjugate gradient acceleration of iteratively re-weighted least squares methods
- Sparse Recovery With Unknown Variance: A LASSO-Type Approach
- On multiplier processes under weak moment assumptions
- Sparse disjointed recovery from noninflating measurements
- On the geometry of polytopes generated by heavy-tailed random vectors
- Regularization, sparse recovery, and median-of-means tournaments
- Generic error bounds for the generalized Lasso with sub-exponential data
- Estimation of the \(\ell_2\)-norm and testing in sparse linear regression with unknown variance
- Preserving injectivity under subgaussian mappings and its application to compressed sensing
- Low rank matrix recovery from rank one measurements
- Robust sparse recovery with sparse Bernoulli matrices via expanders
- Learning without concentration
- Improved bounds for sparse recovery from subsampled random convolutions
- A Rice method proof of the null-space property over the Grassmannian
- Maximin effects in inhomogeneous large-scale data
- Estimation in high dimensions: a geometric perspective
- Flavors of compressive sensing
- Covariate-adaptive randomization with variable selection in clinical trials
- Learning from MOM's principles: Le Cam's approach
- Sure independence screening and compressed random sensing