Dimensionality reduction with subgaussian matrices: a unified theory
From MaRDI portal
Publication: 515989
DOI: 10.1007/s10208-015-9280-x
zbMath: 1360.60031
arXiv: 1402.3973
OpenAlex: W1885794093
MaRDI QID: Q515989
Publication date: 17 March 2017
Published in: Foundations of Computational Mathematics
Full work available at URL: https://arxiv.org/abs/1402.3973
Keywords: compressed sensing; Johnson-Lindenstrauss embeddings; random dimensionality reduction; restricted isometry properties; sub-Gaussian matrices
MSC classification: Geometric probability and stochastic geometry (60D05); Large deviations (60F10); Probability in computer science (algorithm analysis, random structures, phase transitions, etc.) (68Q87)
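The keywords above describe dimensionality reduction with subgaussian random matrices in the Johnson-Lindenstrauss sense: a random m × N matrix with properly scaled subgaussian entries approximately preserves Euclidean norms when m is large enough. A minimal illustrative sketch (using NumPy with Gaussian entries; the dimensions and tolerance are arbitrary choices, not from the record):

```python
import numpy as np

rng = np.random.default_rng(0)
N, m = 1000, 256  # ambient dimension and reduced dimension

# Subgaussian (here: Gaussian) random matrix, scaled so that
# E[ ||A x||^2 ] = ||x||^2 for every fixed vector x.
A = rng.standard_normal((m, N)) / np.sqrt(m)

# Norm distortion of a fixed test vector under the embedding;
# with high probability this is of order 1/sqrt(m).
x = rng.standard_normal(N)
distortion = abs(np.linalg.norm(A @ x) / np.linalg.norm(x) - 1.0)
```

For a whole set of points (or a low-complexity set such as sparse vectors, as in the restricted isometry property), the required m grows only with the set's complexity parameter, not with N; this uniform treatment is the "unified theory" of the title.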
Related Items (16)
- Low rank tensor recovery via iterative hard thresholding
- Compressive statistical learning with random feature moments
- Compressed sensing with local structure: uniform recovery guarantees for the sparsity in levels class
- $N$-Dimensional Tensor Completion for Nuclear Magnetic Resonance Relaxometry
- \( \varepsilon \)-isometric dimension reduction for incompressible subsets of \(\ell_p\)
- On fast Johnson-Lindenstrauss embeddings of compact submanifolds of \(\mathbb{R}^N\) with boundary
- Representation and coding of signal geometry
- Robustness properties of dimensionality reduction with Gaussian random matrices
- Quantized Compressed Sensing: A Survey
- Toward a unified theory of sparse dimensionality reduction in Euclidean space
- Persistent homology for low-complexity models
- Random projections for conic programs
- Uniform recovery of fusion frame structured sparse signals
- Dimensionality reduction for \(k\)-distance applied to persistent homology
- Rigorous restricted isometry property of low-dimensional subspaces
- Sparse control of alignment models in high dimension
Cites Work
- A mathematical introduction to compressive sensing
- Sparse recovery under weak moment assumptions
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Faster least squares approximation
- Geometry of log-concave ensembles of random matrices and approximate reconstruction
- Regularity of Gaussian processes
- Random projections of smooth manifolds
- A simple proof of the restricted isometry property for random matrices
- Uniform uncertainty principle for Bernoulli and subgaussian ensembles
- The Johnson-Lindenstrauss lemma and the sphericity of some graphs
- Database-friendly random projections: Johnson-Lindenstrauss with binary coins.
- Problems and results in extremal combinatorics. I.
- The cosparse analysis model and algorithms
- Majorizing measures without measures
- New analysis of manifold embeddings and signal recovery from compressive measurements
- Low rank tensor recovery via iterative hard thresholding
- Greedy-like algorithms for the cosparse analysis model
- Reconstruction and subgaussian operators in asymptotic geometric analysis
- Finding the homology of submanifolds with high confidence from random samples
- Tail bounds via generic chaining
- Empirical processes and random projections
- Embeddings of Surfaces, Curves, and Moving Points in Euclidean Space
- Optimal Bounds for Johnson-Lindenstrauss Transforms and Streaming Problems with Subconstant Error
- Clustering for edge-cost minimization (extended abstract)
- Extensions of Lipschitz mappings into a Hilbert space
- On variants of the Johnson–Lindenstrauss lemma
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Nearest-neighbor-preserving embeddings
- Guaranteed Minimum-Rank Solutions of Linear Matrix Equations via Nuclear Norm Minimization
- Tighter bounds for random projections of manifolds
- On the Performance of Clustering in Hilbert Spaces
- Error and Perturbation Bounds for Subspaces Associated with Certain Eigenvalue Problems
- A Theory for Sampling Signals From a Union of Subspaces
- Compressed Sensing Performance Bounds Under Poisson Noise
- The Johnson-Lindenstrauss lemma is optimal for linear dimensionality reduction
- Compressed subspace matching on the continuum
- The Generic Chaining
- An elementary proof of a theorem of Johnson and Lindenstrauss
- Robust Recovery of Signals From a Structured Union of Subspaces
- Sampling Theorems for Signals From the Union of Finite-Dimensional Linear Subspaces
- Sampling and Reconstructing Signals From a Union of Linear Subspaces
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Compressed sensing