The Fast Johnson–Lindenstrauss Transform and Approximate Nearest Neighbors

From MaRDI portal
Publication:5189547


DOI: 10.1137/060673096
zbMath: 1185.68327
Wikidata: Q57258817
Scholia: Q57258817
MaRDI QID: Q5189547

Nir Ailon, Bernard Chazelle

Publication date: 17 March 2010

Published in: SIAM Journal on Computing

Full work available at URL: https://doi.org/10.1137/060673096


68Q01: General topics in the theory of computing

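For orientation, the fast Johnson-Lindenstrauss transform introduced in this paper applies a randomized, FFT-speed orthogonal mixing step before projecting: roughly Φ = P·H·D, where D is a random ±1 diagonal, H is the normalized Walsh-Hadamard matrix, and P is a sparse random projection. The sketch below illustrates the idea under a simplifying assumption: P is replaced by plain uniform coordinate subsampling (with rescaling), not the sparse Gaussian projection the paper analyzes, and H is formed as a dense matrix rather than via a fast transform.

```python
import numpy as np
from scipy.linalg import hadamard

def fjlt_sketch(x, k, rng):
    """Illustrative FJLT-style embedding of x (dim d, a power of two) into k dims.

    Steps: D (random sign flip) then H (orthonormal Walsh-Hadamard) spread the
    mass of x across all coordinates; subsampling k coordinates and rescaling
    by sqrt(d/k) then preserves the squared norm in expectation.
    NOTE: the paper's final projection P is sparse Gaussian; coordinate
    subsampling here is a simplification for illustration.
    """
    d = x.shape[0]
    signs = rng.choice([-1.0, 1.0], size=d)      # D: random +/-1 diagonal
    H = hadamard(d) / np.sqrt(d)                 # H: orthonormal, so norms are preserved
    y = H @ (signs * x)                          # mixed vector, ||y|| == ||x||
    idx = rng.choice(d, size=k, replace=False)   # simplified P: sample k coordinates
    return np.sqrt(d / k) * y[idx]               # rescale for an unbiased norm estimate

rng = np.random.default_rng(0)
x = np.ones(64)
y = fjlt_sketch(x, 16, rng)
print(y.shape)  # (16,)
```

Because H and D are orthogonal, the mixing step is norm-preserving, and the sign randomization ensures the mass of any fixed input is spread out with high probability, which is what makes naive coordinate subsampling work here.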

Related Items

Practical Sketching Algorithms for Low-Rank Matrix Approximation
Constructing a High-Dimensional kNN-Graph Using a Z-Order Curve
Optimal (Euclidean) Metric Compression
Sketched Newton-Raphson
RidgeSketch: A Fast Sketching Based Solver for Large Scale Ridge Regression
On Geometric Prototype and Applications
On Using Toeplitz and Circulant Matrices for Johnson-Lindenstrauss Transforms
On Closest Pair in Euclidean Metric: Monochromatic is as Hard as Bichromatic
Streaming Low-Rank Matrix Approximation with an Application to Scientific Simulation
The Restricted Isometry Property of Subsampled Fourier Matrices
Dominance Product and High-Dimensional Closest Pair under L_infty
Structural Convergence Results for Approximation of Dominant Subspaces from Block Krylov Spaces
Simple Analyses of the Sparse Johnson-Lindenstrauss Transform
Randomized numerical linear algebra: Foundations and algorithms
Near-Neighbor Preserving Dimension Reduction for Doubling Subsets of L1
PLSS: A Projected Linear Systems Solver
Towards Practical Large-Scale Randomized Iterative Least Squares Solvers through Uncertainty Quantification
Randomized algorithms for the computation of multilinear rank-\((\mu_1,\mu_2,\mu_3)\) approximations
Principled interpolation of Green's functions learned from data
Generalized linear models for massive data via doubly-sketching
\(\varepsilon\)-isometric dimension reduction for incompressible subsets of \(\ell_p\)
On randomized sketching algorithms and the Tracy-Widom law
Random Projection and Recovery for High Dimensional Optimization with Arbitrary Outliers
Random design analysis of ridge regression
Toward a unified theory of sparse dimensionality reduction in Euclidean space
Real-valued embeddings and sketches for fast distance and similarity estimation
Compressed sensing with coherent and redundant dictionaries
A variant of the Johnson-Lindenstrauss lemma for circulant matrices
Acceleration of randomized Kaczmarz method via the Johnson-Lindenstrauss lemma
User-friendly tail bounds for sums of random matrices
High-dimensional model recovery from random sketched data by exploring intrinsic sparsity
Matrix sketching for supervised classification with imbalanced classes
Randomized LU decomposition
Forecasting using random subspace methods
On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms
Fast binary embeddings with Gaussian circulant matrices: improved bounds
Randomized LU decomposition using sparse projections
Fast randomized matrix and tensor interpolative decomposition using countsketch
Multiplicative perturbation bounds for multivariate multiple linear regression in Schatten \(p\)-norms
Correlations between random projections and the bivariate normal
Stochastic quasi-gradient methods: variance reduction via Jacobian sketching
An efficient randomized algorithm for computing the approximate Tucker decomposition
Dimensionality reduction for \(k\)-distance applied to persistent homology
Fast and memory-optimal dimension reduction using Kac's walk
Near-neighbor preserving dimension reduction via coverings for doubling subsets of \(\ell_1\)
Spectral estimation from simulations via sketching
Filament plots for data visualization
An efficient algorithm for computing the approximate t-URV and its applications
Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument
Linear dimension reduction approximately preserving a function of the $1$-norm
On closest pair in Euclidean metric: monochromatic is as hard as bichromatic
Dimension reduction by random hyperplane tessellations
Paraunitary matrices, entropy, algebraic condition number and Fourier computation
Compressed dictionary learning
Randomized partition trees for nearest neighbor search
Binary vectors for fast distance and similarity estimation
On the convergence to equilibrium of Kac's random walk on matrices
On deterministic sketching and streaming for sparse recovery and norm estimation
Paved with good intentions: analysis of a randomized block Kaczmarz method
Binary random projections with controllable sparsity patterns
An efficient superpostional quantum Johnson-Lindenstrauss lemma via unitary \(t\)-designs
Compressed sensing of low-rank plus sparse matrices
The Fast Cauchy Transform and Faster Robust Linear Regression
Stochastic iterative projection methods for large linear systems
Stochastic Algorithms in Linear Algebra - beyond the Markov Chains and von Neumann - Ulam Scheme
Almost Optimal Explicit Johnson-Lindenstrauss Families
Johnson-Lindenstrauss lemma for circulant matrices
Sparser Johnson-Lindenstrauss Transforms
Tighter Fourier Transform Lower Bounds
Compressive Sensing with Redundant Dictionaries and Structured Measurements