Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform

From MaRDI portal
Publication:2931418

DOI: 10.1145/1132516.1132597
zbMath: 1301.68232
OpenAlex: W2152402969
Wikidata: Q57254826
Scholia: Q57254826
MaRDI QID: Q2931418

Nir Ailon, Bernard Chazelle

Publication date: 25 November 2014

Published in: Proceedings of the thirty-eighth annual ACM symposium on Theory of Computing

Full work available at URL: https://doi.org/10.1145/1132516.1132597
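The fast Johnson-Lindenstrauss transform named in the title can be sketched as follows. This is an illustrative SRHT-style variant (sign flips, a fast Walsh-Hadamard transform, then coordinate subsampling), not the paper's exact construction; Ailon and Chazelle use a sparse Gaussian projection matrix in place of plain subsampling, and all names and parameters below are assumptions for illustration.

```python
import numpy as np

def fjlt_sketch(x, k, rng=None):
    """Illustrative fast JL transform: embed x in R^d down to R^k.

    Pipeline: D (random +/-1 signs) -> H (Walsh-Hadamard, O(d log d))
    -> subsample k coordinates, rescaled to preserve norms in expectation.
    """
    d = len(x)
    assert d & (d - 1) == 0, "dimension must be a power of two"
    rng = np.random.default_rng() if rng is None else rng
    # D: random diagonal sign flips spread the mass of x
    y = x * rng.choice([-1.0, 1.0], size=d)
    # H: in-place iterative Walsh-Hadamard butterflies
    h = 1
    while h < d:
        for i in range(0, d, 2 * h):
            a = y[i:i + h].copy()
            b = y[i + h:i + 2 * h].copy()
            y[i:i + h] = a + b
            y[i + h:i + 2 * h] = a - b
        h *= 2
    y /= np.sqrt(d)  # normalize so H is orthonormal
    # Subsample k coordinates; sqrt(d/k) keeps E[||result||^2] = ||x||^2
    idx = rng.choice(d, size=k, replace=False)
    return np.sqrt(d / k) * y[idx]
```

The key point of the construction is speed: the Hadamard step costs O(d log d) rather than the O(dk) of a dense random projection, while the randomized preconditioning by D and H flattens the input so that the cheap final projection still preserves norms with high probability.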



Related Items

On principal components regression, random projections, and column subsampling
Johnson–Lindenstrauss Embeddings with Kronecker Structure
Randomized numerical linear algebra: Foundations and algorithms
Side-constrained minimum sum-of-squares clustering: mathematical programming and random projections
Randomized Local Model Order Reduction
The Johnson-Lindenstrauss lemma almost characterizes Hilbert space, but not quite
Performance of Johnson-Lindenstrauss Transform for $k$-Means and $k$-Medians Clustering
On the mixing time of Kac's walk and other high-dimensional Gibbs samplers with constraints
Sharper Bounds for Regularized Data Fitting
Distance geometry and data science
Principled interpolation of Green's functions learned from data
Faster least squares approximation
Sparsified randomization algorithms for low rank approximations and applications to integral equations and inhomogeneous random field simulation
Practical Sketching Algorithms for Low-Rank Matrix Approximation
Random projections for quadratic programs
On fast Johnson-Lindenstrauss embeddings of compact submanifolds of \(\mathbb{R}^N\) with boundary
Algorithmic paradigms for stability-based cluster validity and model selection statistical methods, with applications to microarray data analysis
A variant of the Johnson-Lindenstrauss lemma for circulant matrices
Dense fast random projections and Lean Walsh transforms
Detecting low-rank clusters via random sampling
Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
On variants of the Johnson–Lindenstrauss lemma
Approximate nearest neighbor search for \(\ell_{p}\)-spaces \((2 < p < \infty)\) via embeddings
Classification Scheme for Binary Data with Extensions
A Practical Randomized CP Tensor Decomposition
Solving LP using random projections
A fast randomized algorithm for the approximation of matrices
The Mailman algorithm: a note on matrix-vector multiplication
Random Projection RBF Nets for Multidimensional Density Estimation
Dimension reduction for hyperbolic space
Fast Private Norm Estimation and Heavy Hitters
Randomized interpolative decomposition of separated representations
Geometric component analysis and its applications to data analysis
Interpolation of Inverse Operators for Preconditioning Parameter-Dependent Equations
Simple Classification using Binary Data
Optimal Bounds for Johnson-Lindenstrauss Transformations
Compressed and Penalized Linear Regression
Optimal fast Johnson-Lindenstrauss embeddings for large data sets
Johnson-Lindenstrauss lemma for circulant matrices
Random projections for the nonnegative least-squares problem
Fast dimension reduction using Rademacher series on dual BCH codes
High-dimensional model recovery from random sketched data by exploring intrinsic sparsity
Estimating Leverage Scores via Rank Revealing Methods and Randomization
Tensor-Structured Sketching for Constrained Least Squares
Random projections and Hotelling's T2 statistics for change detection in high-dimensional data streams
Random projections as regularizers: learning a linear discriminant from fewer observations than dimensions