New and Improved Johnson–Lindenstrauss Embeddings via the Restricted Isometry Property
From MaRDI portal
Publication:3097486
Abstract: Consider an \(m \times N\) matrix \(\Phi\) with the Restricted Isometry Property of order \(k\) and level \(\delta\); that is, the norm of any \(k\)-sparse vector in \(\mathbb{R}^N\) is preserved to within a multiplicative factor of \(1 \pm \delta\) under application of \(\Phi\). We show that by randomizing the column signs of such a matrix \(\Phi\), the resulting map with high probability embeds any fixed set of \(p = O(e^k)\) points in \(\mathbb{R}^N\) into \(\mathbb{R}^m\) without distorting the norm of any point in the set by more than a factor of \(1 \pm \delta\). Consequently, matrices with the Restricted Isometry Property and randomized column signs provide optimal Johnson-Lindenstrauss embeddings up to logarithmic factors in \(N\). In particular, our results improve the best known bounds on the necessary embedding dimension \(m\) for a wide class of structured random matrices; for partial Fourier and partial Hadamard matrices, we improve the recent bound \(m = O(\delta^{-4} \log(p) \log^4(N))\) appearing in Ailon and Liberty to \(m = O(\delta^{-2} \log(p) \log^4(N))\), which is optimal up to the logarithmic factors in \(N\). Our results also have a direct application in the area of compressed sensing for redundant dictionaries.
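The construction in the abstract can be sketched in a few lines: take a matrix expected to satisfy the RIP (here, an illustrative partial Hadamard matrix with randomly chosen rows — not the paper's own code), flip each column's sign independently, and apply the result to a fixed point set. The matrix sizes and the loose distortion check below are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not from the paper): ambient dimension N,
# embedding dimension m, number of points p.
N, m, p = 256, 64, 10

# Sylvester construction of the N x N Hadamard matrix (N a power of 2).
H = np.array([[1.0]])
while H.shape[0] < N:
    H = np.block([[H, H], [H, -H]])

# Partial Hadamard matrix: m rows chosen uniformly at random, scaled by
# 1/sqrt(m) -- a standard example of a structured matrix with the RIP
# (with high probability, for suitable m).
rows = rng.choice(N, size=m, replace=False)
Phi = H[rows] / np.sqrt(m)

# The key step from the abstract: randomize the column signs, i.e. replace
# Phi by Phi @ D_eps, where D_eps is diagonal with independent +-1 entries.
eps = rng.choice([-1.0, 1.0], size=N)
Phi_signed = Phi * eps

# Embed p fixed points from R^N into R^m and compare norms before/after.
X = rng.standard_normal((p, N))
Y = X @ Phi_signed.T
ratios = np.linalg.norm(Y, axis=1) / np.linalg.norm(X, axis=1)
# With high probability every ratio lies in [1 - delta, 1 + delta].
```

For generic random points the norm ratios concentrate near 1; the theorem's content is that this holds for *any* fixed point set of size up to \(O(e^k)\), not just generic ones.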
Recommendations
- New constructions of RIP matrices with fast multiplication and fewer rows
- Fast and RIP-optimal transforms
- Sparsity lower bounds for dimensionality reducing maps
- Sparser Johnson-Lindenstrauss transforms
- Rigorous restricted isometry property of low-dimensional subspaces
- Johnson–Lindenstrauss Embeddings with Kronecker Structure
- Isometric sketching of any set via the restricted isometry property
- Lower bounds on the low-distortion embedding dimension of submanifolds of \(\mathbb{R}^n\)
- Restricted isometry property for random matrices with heavy-tailed columns
- Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument
Cited in (91)
- Applied harmonic analysis and data science. Abstracts from the workshop held April 21--26, 2024
- On fast Johnson-Lindenstrauss embeddings of compact submanifolds of \(\mathbb{R}^N\) with boundary
- Classification scheme for binary data with extensions
- Performance of Johnson--Lindenstrauss Transform for $k$-Means and $k$-Medians Clustering
- Fast deflation sparse principal component analysis via subspace projections
- Side effects of learning from low-dimensional data embedded in a Euclidean space
- Johnson–Lindenstrauss Embeddings with Kronecker Structure
- \( \varepsilon \)-isometric dimension reduction for incompressible subsets of \(\ell_p\)
- Simple analyses of the sparse Johnson-Lindenstrauss transform
- Time for dithering: fast and quantized random embeddings via the restricted isometry property
- Compressed data separation via \(\ell_q\)-split analysis with \(\ell_\infty\)-constraint
- Compressed data separation via unconstrained \(\ell_1\)-split analysis
- Fast Metric Embedding into the Hamming Cube
- Rendition: reclaiming what a black box takes away
- Convergence on thresholding-based algorithms for dictionary-sparse recovery
- Simple classification using binary data
- Theory and applications of compressed sensing
- A novel compressed sensing scheme for photoacoustic tomography
- Sparser Johnson-Lindenstrauss transforms
- Compressive Sensing
- The restricted isometry property for random block diagonal matrices
- Optimal fast Johnson-Lindenstrauss embeddings for large data sets
- Dimensionality reduction for \(k\)-distance applied to persistent homology
- New bounds for circulant Johnson-Lindenstrauss embeddings
- On randomized trace estimates for indefinite matrices with an application to determinants
- Sub-Gaussian matrices on sets: optimal tail dependence and applications
- Compressed sensing of low-rank plus sparse matrices
- Enhanced total variation minimization for stable image reconstruction
- On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms
- Convergences of regularized algorithms and stochastic gradient methods with random projections
- On deterministic sketching and streaming for sparse recovery and norm estimation
- Fast and RIP-optimal transforms
- Compressive sensing with redundant dictionaries and structured measurements
- Sparsity lower bounds for dimensionality reducing maps
- Quantization and compressive sensing
- Isometric sketching of any set via the restricted isometry property
- On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms
- On the atomic decomposition of coorbit spaces with non-integrable kernel
- Sparse recovery in bounded Riesz systems with applications to numerical methods for PDEs
- Compressed dictionary learning
- Improved analysis of the subsampled randomized Hadamard transform
- An investigation of Newton-sketch and subsampled Newton methods
- Suprema of chaos processes and the restricted isometry property
- Fast binary embeddings with Gaussian circulant matrices: improved bounds
- Greedy-like algorithms for the cosparse analysis model
- Sparse model uncertainties in compressed sensing with application to convolutions and sporadic communication
- Spectral estimation from simulations via sketching
- On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems
- Persistent homology for low-complexity models
- Restricted isometries for partial random circulant matrices
- Convergence of projected Landweber iteration for matrix rank minimization
- The Hanson-Wright inequality for random tensors
- The quest for optimal sampling: computationally efficient, structure-exploiting measurements for compressed sensing
- Sparsity and non-Euclidean embeddings
- Kernel conjugate gradient methods with random projections
- Dictionary-sparse recovery via thresholding-based algorithms
- Compressed sensing with coherent and redundant dictionaries
- Minimization of the difference of nuclear and Frobenius norms for noisy low rank matrix recovery
- Tighter Fourier transform lower bounds
- Sparse recovery with coherent tight frames via analysis Dantzig selector and analysis LASSO
- On principal components regression, random projections, and column subsampling
- Improved bounds for sparse recovery from subsampled random convolutions
- Low rank tensor recovery via iterative hard thresholding
- Structure dependent sampling in compressed sensing: theoretical guarantees for tight frames
- Convergence and stability of iteratively reweighted least squares for low-rank matrix recovery
- Convergence analysis of projected gradient descent for Schatten-\(p\) nonconvex matrix recovery
- A unified framework for linear dimensionality reduction in L1
- Iterative hard thresholding for compressed data separation
- Randomized numerical linear algebra: Foundations and algorithms
- Dimensionality-reduced subspace clustering
- Robustness properties of dimensionality reduction with Gaussian random matrices
- Lower bounds on the low-distortion embedding dimension of submanifolds of \(\mathbb{R}^n\)
- New constructions of RIP matrices with fast multiplication and fewer rows
- Sparse reconstruction with multiple Walsh matrices
- Fast Phase Retrieval from Local Correlation Measurements
- Fast cross-polytope locality-sensitive hashing
- Rigorous restricted isometry property of low-dimensional subspaces
- Lower Memory Oblivious (Tensor) Subspace Embeddings with Fewer Random Bits: Modewise Methods for Least Squares
- Near-optimal encoding for sigma-delta quantization of finite frame expansions
- Sparser Johnson-Lindenstrauss transforms
- GenMod: a generative modeling approach for spectral representation of PDEs with random inputs
- Real-valued embeddings and sketches for fast distance and similarity estimation
- A simple proof of the restricted isometry property for random matrices
- Derandomizing restricted isometries via the Legendre symbol
- A variant of the Johnson-Lindenstrauss lemma for circulant matrices
- A strong restricted isometry property, with an application to phaseless compressed sensing
- A survey of compressed sensing
- Generalized notions of sparsity and restricted isometry property. II: Applications
- The restricted isometry property of subsampled Fourier matrices
- Fast and memory-optimal dimension reduction using Kac's walk
- Signal separation under coherent dictionaries and \(\ell_p\)-bounded noise