New and Improved Johnson–Lindenstrauss Embeddings via the Restricted Isometry Property
DOI: 10.1137/100810447
zbMATH Open: 1247.15019
arXiv: 1009.0744
OpenAlex: W2963262327
MaRDI QID: Q3097486 (FDO: Q3097486)
Authors: Felix Krahmer, Rachel Ward
Publication date: 10 November 2011
Published in: SIAM Journal on Mathematical Analysis
Full work available at URL: https://arxiv.org/abs/1009.0744
Recommendations
- New constructions of RIP matrices with fast multiplication and fewer rows
- Fast and RIP-optimal transforms
- Sparsity lower bounds for dimensionality reducing maps
- Sparser Johnson-Lindenstrauss transforms
- Rigorous restricted isometry property of low-dimensional subspaces
- Johnson–Lindenstrauss Embeddings with Kronecker Structure
- Isometric sketching of any set via the restricted isometry property
- Lower bounds on the low-distortion embedding dimension of submanifolds of \(\mathbb{R}^n\)
- Restricted isometry property for random matrices with heavy-tailed columns
- Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument
Keywords: Rademacher chaos; restricted isometry property; Johnson-Lindenstrauss lemma; partial Hadamard matrices; optimal asymptotics; partial Fourier matrices
MSC classification:
- Signal theory (characterization, reconstruction, filtering, etc.) (94A12)
- Random matrices (algebraic aspects) (15B52)
- Boolean and Hadamard matrices (15B34)
- Miscellaneous inequalities involving matrices (15A45)
Cited In (91)
- \( \varepsilon \)-isometric dimension reduction for incompressible subsets of \(\ell_p\)
- On fast Johnson-Lindenstrauss embeddings of compact submanifolds of \(\mathbb{R}^N\) with boundary
- Fast deflation sparse principal component analysis via subspace projections
- Fast Metric Embedding into the Hamming Cube
- Compressed data separation via \(\ell_q\)-split analysis with \(\ell_\infty\)-constraint
- Compressed data separation via unconstrained \(\ell_1\)-split analysis
- Rendition: reclaiming what a black box takes away
- Simple analyses of the sparse Johnson-Lindenstrauss transform
- Johnson–Lindenstrauss Embeddings with Kronecker Structure
- Side effects of learning from low-dimensional data embedded in a Euclidean space
- Time for dithering: fast and quantized random embeddings via the restricted isometry property
- Convergence on thresholding-based algorithms for dictionary-sparse recovery
- Applied harmonic analysis and data science. Abstracts from the workshop held April 21--26, 2024
- Performance of Johnson-Lindenstrauss Transform for \(k\)-Means and \(k\)-Medians Clustering
- Simple classification using binary data
- Classification scheme for binary data with extensions
- Compressive Sensing
- Sparse model uncertainties in compressed sensing with application to convolutions and sporadic communication
- On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems
- Tighter Fourier transform lower bounds
- Convergence and stability of iteratively reweighted least squares for low-rank matrix recovery
- Sparser Johnson-Lindenstrauss transforms
- GenMod: a generative modeling approach for spectral representation of PDEs with random inputs
- A simple proof of the restricted isometry property for random matrices
- Sub-Gaussian matrices on sets: optimal tail dependence and applications
- Fast binary embeddings with Gaussian circulant matrices: improved bounds
- Convergence analysis of projected gradient descent for Schatten-\(p\) nonconvex matrix recovery
- Signal separation under coherent dictionaries and \(\ell_p\)-bounded noise
- Compressed dictionary learning
- On principal components regression, random projections, and column subsampling
- A variant of the Johnson-Lindenstrauss lemma for circulant matrices
- Isometric sketching of any set via the restricted isometry property
- Randomized numerical linear algebra: Foundations and algorithms
- An investigation of Newton-sketch and subsampled Newton methods
- The quest for optimal sampling: computationally efficient, structure-exploiting measurements for compressed sensing
- The Hanson-Wright inequality for random tensors
- Fast cross-polytope locality-sensitive hashing
- Fast Phase Retrieval from Local Correlation Measurements
- Persistent homology for low-complexity models
- Generalized notions of sparsity and restricted isometry property. II: Applications
- On deterministic sketching and streaming for sparse recovery and norm estimation
- Quantization and compressive sensing
- Derandomizing restricted isometries via the Legendre symbol
- Theory and applications of compressed sensing
- Fast and RIP-optimal transforms
- Spectral estimation from simulations via sketching
- New constructions of RIP matrices with fast multiplication and fewer rows
- Greedy-like algorithms for the cosparse analysis model
- Sparse recovery with coherent tight frames via analysis Dantzig selector and analysis LASSO
- The restricted isometry property of subsampled Fourier matrices
- On randomized trace estimates for indefinite matrices with an application to determinants
- Convergence of projected Landweber iteration for matrix rank minimization
- Structure dependent sampling in compressed sensing: theoretical guarantees for tight frames
- A unified framework for linear dimensionality reduction in L1
- Dimensionality-reduced subspace clustering
- Lower Memory Oblivious (Tensor) Subspace Embeddings with Fewer Random Bits: Modewise Methods for Least Squares
- Sparse recovery in bounded Riesz systems with applications to numerical methods for PDEs
- Rigorous restricted isometry property of low-dimensional subspaces
- Compressed sensing with coherent and redundant dictionaries
- Low rank tensor recovery via iterative hard thresholding
- A strong restricted isometry property, with an application to phaseless compressed sensing
- Improved analysis of the subsampled randomized Hadamard transform
- Restricted isometries for partial random circulant matrices
- Sparse reconstruction with multiple Walsh matrices
- The restricted isometry property for random block diagonal matrices
- New bounds for circulant Johnson-Lindenstrauss embeddings
- On the atomic decomposition of coorbit spaces with non-integrable kernel
- Robustness properties of dimensionality reduction with Gaussian random matrices
- Near-optimal encoding for sigma-delta quantization of finite frame expansions
- Optimal fast Johnson-Lindenstrauss embeddings for large data sets
- Dimensionality reduction for \(k\)-distance applied to persistent homology
- Dictionary-sparse recovery via thresholding-based algorithms
- A survey of compressed sensing
- Enhanced total variation minimization for stable image reconstruction
- Minimization of the difference of nuclear and Frobenius norms for noisy low rank matrix recovery
- Compressed sensing of low-rank plus sparse matrices
- Iterative hard thresholding for compressed data separation
- Lower bounds on the low-distortion embedding dimension of submanifolds of \(\mathbb{R}^n\)
- Improved bounds for sparse recovery from subsampled random convolutions
- Real-valued embeddings and sketches for fast distance and similarity estimation
- Sparser Johnson-Lindenstrauss transforms
- Sparsity and non-Euclidean embeddings
- A novel compressed sensing scheme for photoacoustic tomography
- On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms
- Sparsity lower bounds for dimensionality reducing maps
- Compressive sensing with redundant dictionaries and structured measurements
- On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms
- Suprema of chaos processes and the restricted isometry property
- Convergences of regularized algorithms and stochastic gradient methods with random projections
- Kernel conjugate gradient methods with random projections
- Fast and memory-optimal dimension reduction using Kac's walk