Dimensionality reduction with subgaussian matrices: a unified theory
DOI: 10.1007/s10208-015-9280-x · zbMATH Open: 1360.60031 · arXiv: 1402.3973 · OpenAlex: W1885794093 · MaRDI QID: Q515989
Authors: Sjoerd Dirksen
Publication date: 17 March 2017
Published in: Foundations of Computational Mathematics
Full work available at URL: https://arxiv.org/abs/1402.3973
Recommendations
- Toward a unified theory of sparse dimensionality reduction in Euclidean space
- Isometric sketching of any set via the restricted isometry property
- Universality laws for randomized dimension reduction, with applications
- Rigorous restricted isometry property of low-dimensional subspaces
Keywords: compressed sensing; Johnson-Lindenstrauss embeddings; random dimensionality reduction; restricted isometry properties; sub-Gaussian matrices
MSC classification: Large deviations (60F10); Geometric probability and stochastic geometry (60D05); Probability in computer science (algorithm analysis, random structures, phase transitions, etc.) (68Q87)
Cites Work
- Finding the homology of submanifolds with high confidence from random samples
- Near-Optimal Signal Recovery From Random Projections: Universal Encoding Strategies?
- Extensions of Lipschitz mappings into a Hilbert space
- An elementary proof of a theorem of Johnson and Lindenstrauss
- Guaranteed minimum-rank solutions of linear matrix equations via nuclear norm minimization
- Tight Oracle Inequalities for Low-Rank Matrix Recovery From a Minimal Number of Noisy Random Measurements
- Oracle inequalities in empirical risk minimization and sparse recovery problems. École d'Été de Probabilités de Saint-Flour XXXVIII-2008.
- A simple proof of the restricted isometry property for random matrices
- Compressed sensing
- Faster least squares approximation
- A mathematical introduction to compressive sensing
- The Generic Chaining
- Random projections of smooth manifolds
- Sparse recovery under weak moment assumptions
- Uniform uncertainty principle for Bernoulli and subgaussian ensembles
- Problems and results in extremal combinatorics. I.
- The cosparse analysis model and algorithms
- Database-friendly random projections: Johnson-Lindenstrauss with binary coins.
- On variants of the Johnson–Lindenstrauss lemma
- Compressed Sensing Performance Bounds Under Poisson Noise
- Regularity of Gaussian processes
- Error and Perturbation Bounds for Subspaces Associated with Certain Eigenvalue Problems
- Majorizing measures without measures
- Reconstruction and subgaussian operators in asymptotic geometric analysis
- Tail bounds via generic chaining
- On the Performance of Clustering in Hilbert Spaces
- Robust Recovery of Signals From a Structured Union of Subspaces
- Low rank tensor recovery via iterative hard thresholding
- Distance preserving embeddings for general \(n\)-dimensional manifolds
- Tighter bounds for random projections of manifolds
- Nearest-neighbor-preserving embeddings
- Empirical processes and random projections
- A Theory for Sampling Signals From a Union of Subspaces
- Sampling Theorems for Signals From the Union of Finite-Dimensional Linear Subspaces
- Sampling and Reconstructing Signals From a Union of Linear Subspaces
- The Johnson-Lindenstrauss lemma and the sphericity of some graphs
- New analysis of manifold embeddings and signal recovery from compressive measurements
- The Johnson-Lindenstrauss lemma is optimal for linear dimensionality reduction
- Optimal bounds for Johnson-Lindenstrauss transforms and streaming problems with subconstant error
- Greedy-like algorithms for the cosparse analysis model
- Embeddings of surfaces, curves, and moving points in Euclidean space
- Clustering for edge-cost minimization (extended abstract)
- Compressed subspace matching on the continuum
- Geometry of log-concave ensembles of random matrices and approximate reconstruction
Cited In (24)
- Sparse control of alignment models in high dimension
- Sharp Estimates on Random Hyperplane Tessellations
- Compressed sensing with local structure: uniform recovery guarantees for the sparsity in levels class
- \( \varepsilon \)-isometric dimension reduction for incompressible subsets of \(\ell_p\)
- On fast Johnson-Lindenstrauss embeddings of compact submanifolds of \(\mathbb{R}^N\) with boundary
- A unified approach to sufficient dimension reduction
- Random projections for conic programs
- Random projections for linear programming: an improved retrieval phase
- Dimension reduction by random hyperplane tessellations
- Persistent homology for low-complexity models
- Subspace projection: A unified framework for a class of partition-based dimension reduction techniques
- Compressive statistical learning with random feature moments
- Rigorous restricted isometry property of low-dimensional subspaces
- Low rank tensor recovery via iterative hard thresholding
- Uniform recovery guarantees for quantized corrupted sensing using structured or generative priors
- Improved analysis of the subsampled randomized Hadamard transform
- Robustness properties of dimensionality reduction with Gaussian random matrices
- Dimensionality reduction for \(k\)-distance applied to persistent homology
- Sample complexity bounds for the local convergence of least squares approximation
- Representation and coding of signal geometry
- Toward a unified theory of sparse dimensionality reduction in Euclidean space
- Uniform recovery of fusion frame structured sparse signals
- $N$-Dimensional Tensor Completion for Nuclear Magnetic Resonance Relaxometry
- Quantized Compressed Sensing: A Survey