Dimensionality reduction with subgaussian matrices: a unified theory
From MaRDI portal
Publication:515989
Abstract: We present a theory for Euclidean dimensionality reduction with subgaussian matrices which unifies several restricted isometry property and Johnson-Lindenstrauss type results obtained earlier for specific data sets. In particular, we recover and, in several cases, improve results for sets of sparse and structured sparse vectors, low-rank matrices and tensors, and smooth manifolds. In addition, we establish a new Johnson-Lindenstrauss embedding for data sets taking the form of an infinite union of subspaces of a Hilbert space.
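The Johnson-Lindenstrauss phenomenon described in the abstract can be illustrated numerically. The sketch below (not taken from the paper) uses a Gaussian matrix, the simplest subgaussian example, to project a point set from a high-dimensional space to a lower-dimensional one and then measures how much pairwise Euclidean distances are distorted; the dimensions, point count, and scaling convention are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n, N, m = 50, 1000, 200  # number of points, ambient dimension, target dimension
X = rng.standard_normal((n, N))  # a generic data set in R^N

# Gaussian (hence subgaussian) projection matrix, scaled so that
# E ||A x||^2 = ||x||^2 for every fixed x.
A = rng.standard_normal((m, N)) / np.sqrt(m)
Y = X @ A.T  # embedded points in R^m


def pairwise_dists(Z):
    """Matrix of Euclidean distances between the rows of Z."""
    sq = np.sum(Z ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (Z @ Z.T)
    return np.sqrt(np.maximum(d2, 0.0))  # clip tiny negatives from rounding


D_before = pairwise_dists(X)
D_after = pairwise_dists(Y)

# Distortion of each pairwise distance; a JL-type embedding keeps
# all ratios within [1 - eps, 1 + eps] with high probability.
mask = ~np.eye(n, dtype=bool)
ratios = D_after[mask] / D_before[mask]
print(f"distortion range: [{ratios.min():.3f}, {ratios.max():.3f}]")
```

For these parameters the ratios concentrate near 1, consistent with a distortion of order \(\sqrt{\log n / m}\); shrinking `m` widens the observed range.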
Cites work
- scientific article; zbMATH DE number 1077335 (title unavailable)
- scientific article; zbMATH DE number 1775450 (title unavailable)
- scientific article; zbMATH DE number 2109363 (title unavailable)
- scientific article; zbMATH DE number 6276198 (title unavailable)
- A Theory for Sampling Signals From a Union of Subspaces
- A mathematical introduction to compressive sensing
- A simple proof of the restricted isometry property for random matrices
- An elementary proof of a theorem of Johnson and Lindenstrauss
- Clustering for edge-cost minimization (extended abstract)
- Compressed Sensing Performance Bounds Under Poisson Noise
- Compressed sensing
- Compressed subspace matching on the continuum
- Database-friendly random projections: Johnson-Lindenstrauss with binary coins
- Distance preserving embeddings for general \(n\)-dimensional manifolds
- Embeddings of surfaces, curves, and moving points in Euclidean space
- Empirical processes and random projections
- Error and Perturbation Bounds for Subspaces Associated with Certain Eigenvalue Problems
- Extensions of Lipschitz mappings into a Hilbert space
- Faster least squares approximation
- Finding the homology of submanifolds with high confidence from random samples
- Geometry of log-concave ensembles of random matrices and approximate reconstruction
- Greedy-like algorithms for the cosparse analysis model
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- Low rank tensor recovery via iterative hard thresholding
- Majorizing measures without measures
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Nearest-neighbor-preserving embeddings
- New analysis of manifold embeddings and signal recovery from compressive measurements
- On the Performance of Clustering in Hilbert Spaces
- On variants of the Johnson–Lindenstrauss lemma
- Optimal bounds for Johnson-Lindenstrauss transforms and streaming problems with subconstant error
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- Problems and results in extremal combinatorics. I.
- Random projections of smooth manifolds
- Reconstruction and subgaussian operators in asymptotic geometric analysis
- Regularity of Gaussian processes
- Robust Recovery of Signals From a Structured Union of Subspaces
- Sampling Theorems for Signals From the Union of Finite-Dimensional Linear Subspaces
- Sampling and Reconstructing Signals From a Union of Linear Subspaces
- Sparse recovery under weak moment assumptions
- Tail bounds via generic chaining
- The Generic Chaining
- The Johnson-Lindenstrauss lemma and the sphericity of some graphs
- The Johnson-Lindenstrauss lemma is optimal for linear dimensionality reduction
- The cosparse analysis model and algorithms
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Tighter bounds for random projections of manifolds
- Uniform uncertainty principle for Bernoulli and subgaussian ensembles
Cited in (26)
- Sharp Estimates on Random Hyperplane Tessellations
- Subspace projection: A unified framework for a class of partition-based dimension reduction techniques
- Uniform recovery guarantees for quantized corrupted sensing using structured or generative priors
- On fast Johnson-Lindenstrauss embeddings of compact submanifolds of \(\mathbb{R}^N\) with boundary
- Dimensionality reduction for \(k\)-distance applied to persistent homology
- Sub-Gaussian matrices on sets: optimal tail dependence and applications
- Random projections for conic programs
- Universality laws for randomized dimension reduction, with applications
- Isometric sketching of any set via the restricted isometry property
- Representation and coding of signal geometry
- Improved analysis of the subsampled randomized Hadamard transform
- Compressed sensing with local structure: uniform recovery guarantees for the sparsity in levels class
- Random projections for linear programming: an improved retrieval phase
- Sample complexity bounds for the local convergence of least squares approximation
- Compressive statistical learning with random feature moments
- \( \varepsilon \)-isometric dimension reduction for incompressible subsets of \(\ell_p\)
- $N$-Dimensional Tensor Completion for Nuclear Magnetic Resonance Relaxometry
- A unified approach to sufficient dimension reduction
- Persistent homology for low-complexity models
- Low rank tensor recovery via iterative hard thresholding
- Robustness properties of dimensionality reduction with Gaussian random matrices
- Uniform recovery of fusion frame structured sparse signals
- Rigorous restricted isometry property of low-dimensional subspaces
- Sparse control of alignment models in high dimension
- Dimension reduction by random hyperplane tessellations
- Quantized compressed sensing: a survey