New and Improved Johnson–Lindenstrauss Embeddings via the Restricted Isometry Property

From MaRDI portal
Publication: 3097486

DOI: 10.1137/100810447
zbMath: 1247.15019
arXiv: 1009.0744
MaRDI QID: Q3097486

Rachel Ward, Felix Krahmer

Publication date: 10 November 2011

Published in: SIAM Journal on Mathematical Analysis

Full work available at URL: https://arxiv.org/abs/1009.0744


Mathematics Subject Classification:

94A12: Signal theory (characterization, reconstruction, filtering, etc.)

15A45: Miscellaneous inequalities involving matrices

15B52: Random matrices (algebraic aspects)

15B34: Boolean and Hadamard matrices
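The paper's main result states that any matrix satisfying the Restricted Isometry Property, multiplied by a diagonal matrix of random signs, acts as a Johnson–Lindenstrauss embedding. The sketch below illustrates this with a subsampled Hadamard matrix as the RIP part; it is a minimal illustration, not the authors' code, and the dimensions, seed, and helper names are assumptions made for the example:

```python
import numpy as np

def fwht(x):
    """Unnormalized fast Walsh-Hadamard transform; len(x) must be a power of 2."""
    x = np.asarray(x, dtype=float).copy()
    n = len(x)
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = x[j], x[j + h]
                x[j], x[j + h] = a + b, a - b
        h *= 2
    return x

def srht_embed(v, rows, signs):
    """Embed v into len(rows) dimensions via subsampled Hadamard times random signs.

    Dividing the unnormalized transform by sqrt(m) equals scaling the
    normalized Hadamard matrix by sqrt(n/m), so squared norms are
    preserved in expectation.
    """
    m = len(rows)
    return fwht(signs * v)[rows] / np.sqrt(m)

rng = np.random.default_rng(0)
n, m = 256, 64                                # illustrative dimensions
signs = rng.choice([-1.0, 1.0], size=n)       # random sign flips (the diagonal D)
rows = rng.choice(n, size=m, replace=False)   # row subsampling (RIP matrix part)

v = rng.standard_normal(n)
w = srht_embed(v, rows, signs)
ratio = np.linalg.norm(w) ** 2 / np.linalg.norm(v) ** 2
# with high probability, ratio is close to 1
```

Randomizing the column signs is what makes the embedding oblivious to the input point: without the diagonal `D`, a fixed subsampled Hadamard matrix can badly distort sparse vectors aligned with its rows.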


Related Items

Dimensionality-reduced subspace clustering, Simple Classification using Binary Data, Performance of Johnson--Lindenstrauss Transform for $k$-Means and $k$-Medians Clustering, Fast deflation sparse principal component analysis via subspace projections, An investigation of Newton-Sketch and subsampled Newton methods, Persistent homology for low-complexity models, On Using Toeplitz and Circulant Matrices for Johnson-Lindenstrauss Transforms, Minimization of the difference of Nuclear and Frobenius norms for noisy low rank matrix recovery, On the Atomic Decomposition of Coorbit Spaces with Non-integrable Kernel, Rendition: Reclaiming What a Black Box Takes Away, The Restricted Isometry Property of Subsampled Fourier Matrices, Simple Analyses of the Sparse Johnson-Lindenstrauss Transform., Dictionary-sparse recovery via thresholding-based algorithms, Lower Memory Oblivious (Tensor) Subspace Embeddings with Fewer Random Bits: Modewise Methods for Least Squares, Johnson–Lindenstrauss Embeddings with Kronecker Structure, Randomized numerical linear algebra: Foundations and algorithms, Compressed data separation via unconstrained l1-split analysis, \( \varepsilon \)-isometric dimension reduction for incompressible subsets of \(\ell_p\), On fast Johnson-Lindenstrauss embeddings of compact submanifolds of \(\mathbb{R}^N\) with boundary, Fast Metric Embedding into the Hamming Cube, Enhanced total variation minimization for stable image reconstruction, Derandomizing restricted isometries via the Legendre symbol, A unified framework for linear dimensionality reduction in L1, Restricted isometries for partial random circulant matrices, The restricted isometry property for random block diagonal matrices, Near-optimal encoding for sigma-delta quantization of finite frame expansions, Toward a unified theory of sparse dimensionality reduction in Euclidean space, Real-valued embeddings and sketches for fast distance and similarity estimation, 
Compressed sensing with coherent and redundant dictionaries, A variant of the Johnson-Lindenstrauss lemma for circulant matrices, A strong restricted isometry property, with an application to phaseless compressed sensing, On principal components regression, random projections, and column subsampling, Improved bounds for sparse recovery from subsampled random convolutions, Robustness properties of dimensionality reduction with Gaussian random matrices, Kernel conjugate gradient methods with random projections, On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms, On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems, Fast binary embeddings with Gaussian circulant matrices: improved bounds, Sparse recovery in bounded Riesz systems with applications to numerical methods for PDEs, Optimal fast Johnson-Lindenstrauss embeddings for large data sets, Dimensionality reduction for \(k\)-distance applied to persistent homology, Fast and memory-optimal dimension reduction using Kac's walk, GenMod: a generative modeling approach for spectral representation of PDEs with random inputs, Spectral estimation from simulations via sketching, The Hanson-Wright inequality for random tensors, Iterative hard thresholding for compressed data separation, Signal separation under coherent dictionaries and \(\ell_p\)-bounded noise, Convergence of projected Landweber iteration for matrix rank minimization, Sparse recovery with coherent tight frames via analysis Dantzig selector and analysis LASSO, Compressed dictionary learning, Sparse reconstruction with multiple Walsh matrices, Convergence and stability of iteratively reweighted least squares for low-rank matrix recovery, Low rank tensor recovery via iterative hard thresholding, Greedy-like algorithms for the cosparse analysis model, On deterministic sketching and streaming for sparse recovery and norm estimation, Fast and RIP-optimal transforms, 
Convergence analysis of projected gradient descent for Schatten-\(p\) nonconvex matrix recovery, Structure dependent sampling in compressed sensing: theoretical guarantees for tight frames, Generalized notions of sparsity and restricted isometry property. II: Applications, On randomized trace estimates for indefinite matrices with an application to determinants, Side effects of learning from low-dimensional data embedded in a Euclidean space, Compressed sensing of low-rank plus sparse matrices, Compressive Sensing, Theory and applications of compressed sensing, Suprema of Chaos Processes and the Restricted Isometry Property, Fast Phase Retrieval from Local Correlation Measurements, Sparser Johnson-Lindenstrauss Transforms, Classification Scheme for Binary Data with Extensions, Tighter Fourier Transform Lower Bounds, A Novel Compressed Sensing Scheme for Photoacoustic Tomography, Compressive Sensing with Redundant Dictionaries and Structured Measurements, A Survey of Compressed Sensing, The Quest for Optimal Sampling: Computationally Efficient, Structure-Exploiting Measurements for Compressed Sensing, Quantization and Compressive Sensing, Sparse Model Uncertainties in Compressed Sensing with Application to Convolutions and Sporadic Communication