A Simple Tool for Bounding the Deviation of Random Matrices on Geometric Sets
From MaRDI portal
Publication: Q5278301
DOI: 10.1007/978-3-319-45282-1_18
zbMATH Open: 1366.60011
arXiv: 1603.00897
OpenAlex: W2288884963
MaRDI QID: Q5278301
Christopher Liaw, Abbas Mehrabian, Yaniv Plan, Roman Vershynin
Publication date: 13 July 2017
Published in: Lecture Notes in Mathematics
Abstract: Let \(A\) be an isotropic, sub-gaussian \(m \times n\) matrix. We prove that the process \(Z_x := \|Ax\|_2 - \sqrt{m}\,\|x\|_2\) has sub-gaussian increments. Using this, we show that for any bounded set \(T \subseteq \mathbb{R}^n\), the deviation of \(\|Ax\|_2\) around its mean is uniformly bounded by the Gaussian complexity of \(T\). We also prove a local version of this theorem, which allows for unbounded sets. These theorems have various applications, some of which are reviewed in this paper. In particular, we give a new result regarding model selection in the constrained linear model.
Full work available at URL: https://arxiv.org/abs/1603.00897
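The abstract's main inequality can be illustrated numerically. The sketch below (an illustration under assumed parameters, not code from the paper) draws a Gaussian matrix \(A\) (a standard example of an isotropic, sub-gaussian matrix), a finite set \(T\) of unit vectors, and compares the uniform deviation \(\sup_{x \in T} \big| \|Ax\|_2 - \sqrt{m}\,\|x\|_2 \big|\) against a Monte Carlo estimate of the Gaussian complexity \(\gamma(T) = \mathbb{E} \sup_{x \in T} |\langle g, x \rangle|\); the theorem bounds the expected deviation by \(C K^2 \gamma(T)\), where \(K\) is the sub-gaussian norm of the rows of \(A\).

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, num_points = 200, 50, 1000  # illustrative dimensions (assumptions)

# A hypothetical finite set T: random points on the unit sphere in R^n
T = rng.standard_normal((num_points, n))
T /= np.linalg.norm(T, axis=1, keepdims=True)

# Isotropic sub-gaussian matrix: i.i.d. standard Gaussian entries
A = rng.standard_normal((m, n))

# Uniform deviation of ||Ax||_2 around sqrt(m)||x||_2 over T
# (sqrt(m)||x||_2 = sqrt(m) here since every x in T is a unit vector)
deviation = np.max(np.abs(np.linalg.norm(T @ A.T, axis=1) - np.sqrt(m)))

# Monte Carlo estimate of the Gaussian complexity
# gamma(T) = E sup_{x in T} |<g, x>|, g standard Gaussian
g = rng.standard_normal((500, n))
gamma_T = np.mean(np.max(np.abs(T @ g.T), axis=0))

print(f"uniform deviation over T: {deviation:.3f}")
print(f"Gaussian complexity estimate gamma(T): {gamma_T:.3f}")
```

In this regime the observed deviation is of the same order as \(\gamma(T)\), consistent with the theorem; the constant \(C K^2\) is modest for Gaussian matrices.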
Cites Work
- Title not available
- Title not available
- Extensions of Lipschitz mappings into a Hilbert space
- Hanson-Wright inequality and sub-Gaussian concentration
- How close is the sample covariance matrix to the actual covariance matrix?
- The Generalized Lasso With Non-Linear Observations
- A mathematical introduction to compressive sensing
- The Generic Chaining
- Asymptotic Geometric Analysis, Part I
- The convex geometry of linear inverse problems
- Living on the edge: phase transitions in convex programs with random data
- Reconstruction and subgaussian operators in asymptotic geometric analysis
- Tail bounds via generic chaining
- Empirical processes and random projections
- Two observations regarding embedding subsets of Euclidean spaces in normed spaces
- Subspaces of Small Codimension of Finite-Dimensional Banach Spaces
- Robust 1-bit Compressed Sensing and Sparse Logistic Regression: A Convex Programming Approach
- Estimation in High Dimensions: A Geometric Perspective
- High-dimensional estimation with geometric constraints
Cited In (15)
- \( \varepsilon \)-isometric dimension reduction for incompressible subsets of \(\ell_p\)
- Smoothed analysis for the condition number of structured real polynomial systems
- Robust and tuning-free sparse linear regression via square-root slope
- Covariance estimation under missing observations and \(L_4 - L_2\) moment equivalence
- Structure from Randomness in Halfspace Learning with the Zero-One Loss
- Sparse recovery from extreme eigenvalues deviation inequalities
- Dimension-free bounds for sums of independent matrices and simple tensors via the variational principle
- Matrix deviation inequality for ℓp-norm
- Uniform recovery guarantees for quantized corrupted sensing using structured or generative priors
- Dimension-free bounds for sums of dependent matrices and operators with heavy-tailed distributions
- \(\ell^1\)-analysis minimization and generalized (co-)sparsity: when does recovery succeed?
- Sampling rates for \(\ell^1\)-synthesis
- Title not available
- An Introduction to Compressed Sensing
- Geometry of log-concave ensembles of random matrices and approximate reconstruction