Sparse recovery from extreme eigenvalues deviation inequalities
From MaRDI portal
Publication:5228345
Keywords: restricted isometry property; sparse regression; Gaussian matrices; deviation inequalities; Rademacher matrices
MSC classes:
- Linear regression; mixed models (62J05)
- Large deviations (60F10)
- Ridge regression; shrinkage estimators (Lasso) (62J07)
- Eigenvalues, singular values, and eigenvectors (15A18)
- Inequalities involving eigenvalues and eigenvectors (15A42)
- Random matrices (probabilistic aspects) (60B20)
- Inequalities; stochastic orderings (60E15)
Abstract: This article provides a new toolbox for deriving sparse recovery guarantees from small deviations on extreme singular values or extreme eigenvalues obtained in Random Matrix Theory. The work is based on Restricted Isometry Constants (RICs), a pivotal notion in Compressed Sensing and High-Dimensional Statistics: these constants finely assess how well a linear operator is conditioned on the set of sparse vectors, and hence how it performs in stable and robust sparse recovery (SRSR). While constructing deterministic matrices with apposite RICs remains an open problem, one can prove that such matrices exist using random matrix models. In this paper, we show upper bounds on RICs for Gaussian and Rademacher matrices using state-of-the-art small deviation estimates on their extreme eigenvalues, which allows us to derive a lower bound on the probability of achieving SRSR. One benefit of this paper is a direct and explicit derivation of upper bounds on RICs and lower bounds on SRSR from small deviations on extreme eigenvalues given by Random Matrix Theory.
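The link between RICs and extreme singular values described in the abstract can be illustrated numerically. The sketch below (an illustration only, not the paper's method; the dimensions `n`, `p`, `s` and the number of sampled supports are arbitrary choices) probes the restricted isometry behaviour of a Gaussian sensing matrix by computing the extreme singular values of random column submatrices: for a vector supported on a set S of size s, the ratio \(\|A_S x\|^2/\|x\|^2\) lies between \(\sigma_{\min}(A_S)^2\) and \(\sigma_{\max}(A_S)^2\), so these extremes give an empirical lower bound on the RIC \(\delta_s\).

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, s, trials = 200, 1000, 10, 50  # illustrative dimensions (assumptions)

# Gaussian sensing matrix normalised so columns have unit norm in expectation.
A = rng.standard_normal((n, p)) / np.sqrt(n)

worst_low, worst_high = 1.0, 1.0
for _ in range(trials):
    # Random support S of size s; extreme singular values of the submatrix A_S
    # bound ||A_S x||^2 / ||x||^2 over all vectors supported on S.
    S = rng.choice(p, size=s, replace=False)
    sv = np.linalg.svd(A[:, S], compute_uv=False)
    worst_low = min(worst_low, sv[-1] ** 2)
    worst_high = max(worst_high, sv[0] ** 2)

# Empirical lower bound on the RIC delta_s over the sampled supports:
# delta_s >= max(1 - min sigma_min^2, max sigma_max^2 - 1).
delta_s_estimate = max(1.0 - worst_low, worst_high - 1.0)
print(f"empirical lower bound on delta_{s}: {delta_s_estimate:.3f}")
```

Note that sampling supports only yields a lower bound on \(\delta_s\); certifying an upper bound requires controlling all \(\binom{p}{s}\) supports at once, which is exactly where the small deviation inequalities on extreme eigenvalues come in.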
Recommendations
- Bounds of restricted isometry constants in extreme asymptotics: formulae for Gaussian matrices
- Simple bounds for recovering low-complexity models
- Sharp RIP bound for sparse signal and low-rank matrix recovery
- Nonuniform sparse recovery with subgaussian matrices
- Sparse recovery under weak moment assumptions
Cites work
- scientific article, zbMATH DE number 3273551 (no title available)
- A Rice method proof of the null-space property over the Grassmannian
- A mathematical introduction to compressive sensing
- A note on universality of the distribution of the largest eigenvalues in certain sample covariance matrices
- A remark on the Lasso and the Dantzig selector
- A simple proof of the restricted isometry property for random matrices
- A simple tool for bounding the deviation of random matrices on geometric sets
- A universality result for the smallest eigenvalues of certain sample covariance matrices
- Accuracy Guarantees for \(\ell_1\)-Recovery
- Adaptive Dantzig density estimation
- Bounds of restricted isometry constants in extreme asymptotics: formulae for Gaussian matrices
- Compressed sensing: how sharp is the restricted isometry property?
- Counting faces of randomly projected polytopes when the projection radically lowers dimension
- Decoding by Linear Programming
- Deviation Inequalities on Largest Eigenvalues
- Improved bounds on restricted isometry constants for Gaussian matrices
- Increasing subsequences and the hard-to-soft edge transition in matrix ensembles
- Interactions between compressed sensing random matrices and high dimensional geometry
- Living on the edge: phase transitions in convex programs with random data
- Local operator theory, random matrices and Banach spaces.
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Neighborliness of randomly projected simplices in high dimensions
- Non-asymptotic theory of random matrices: extreme singular values
- Observed universality of phase transitions in high-dimensional geometry, with implications for modern data analysis and signal processing
- On sparse reconstruction from Fourier and Gaussian measurements
- On the conditions used to prove oracle results for the Lasso
- Random covariance matrices: universality of local statistics of eigenvalues up to the edge
- Regularization and the small-ball method. I: Sparse recovery
- Restricted isometry property of matrices with independent columns and neighborly polytopes by random sampling
- Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information
- Shape fluctuations and random matrices
- Sharp recovery bounds for convex demixing, with applications
- Simultaneous analysis of Lasso and Dantzig selector
- Small deviations for beta ensembles
- Smallest singular value of random matrices and geometry of random polytopes
- Sparse Representation of a Polytope and Recovery of Sparse Signals and Low-Rank Matrices
- Sparse recovery under weak moment assumptions
- Sparsest solutions of underdetermined linear systems via \(\ell_q\)-minimization for \(0 < q \leqslant 1\)
- Stable recovery of low-dimensional cones in Hilbert spaces: one RIP to rule them all
- Tail bounds via generic chaining
- The restricted isometry property and its implications for compressed sensing
- Uniform recovery of fusion frame structured sparse signals
- Uniform uncertainty principle for Bernoulli and subgaussian ensembles
- Universality of covariance matrices
- Universality results for the largest eigenvalues of some sample covariance matrix ensembles
Cited in (2)