An elementary proof of a theorem of Johnson and Lindenstrauss
Publication: 4798181
DOI: 10.1002/RSA.10073 · zbMath: 1018.51010 · OpenAlex: W2088658556 · Wikidata: Q56115767 · Scholia: Q56115767 · MaRDI QID: Q4798181
Publication date: 19 March 2003
Published in: Random Structures and Algorithms
Full work available at URL: https://doi.org/10.1002/rsa.10073
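The theorem of the title is the Johnson-Lindenstrauss lemma: for any 0 < ε < 1, any n points in Euclidean space can be mapped into R^k with k ≥ 4 ln(n) / (ε²/2 − ε³/3) so that all pairwise squared distances are preserved up to a factor of 1 ± ε. The sketch below is not part of the indexed record; it illustrates a common Gaussian variant of the random projection analyzed in the elementary proof (the paper itself uses projection onto a random subspace), with point counts, dimensions, and the seed chosen purely for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: n points in R^d, target distortion eps.
n, d, eps = 100, 10_000, 0.5
# Target dimension from the paper's bound: k >= 4 ln(n) / (eps^2/2 - eps^3/3).
k = int(np.ceil(4 * np.log(n) / (eps**2 / 2 - eps**3 / 3)))

X = rng.normal(size=(n, d))                # an arbitrary point set in R^d
R = rng.normal(size=(d, k)) / np.sqrt(k)   # Gaussian map, so E||x R||^2 = ||x||^2
Y = X @ R                                  # embedded points in R^k

def sq_dists(A):
    """Pairwise squared Euclidean distances via the Gram matrix."""
    sq = (A * A).sum(axis=1)
    return sq[:, None] + sq[None, :] - 2.0 * (A @ A.T)

i, j = np.triu_indices(n, k=1)             # all unordered pairs of points
ratio = sq_dists(Y)[i, j] / sq_dists(X)[i, j]

# The lemma guarantees all ratios lie in [1 - eps, 1 + eps] with positive
# probability, so a draw that fails the check can simply be repeated.
print(f"k = {k}, distortion range: [{ratio.min():.3f}, {ratio.max():.3f}]")
```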
Related Items (only showing first 100 items)
Side-constrained minimum sum-of-squares clustering: mathematical programming and random projections
Sampling quantum nonlocal correlations with high probability
Deterministic Truncation of Linear Matroids
Exponential random graphs behave like mixtures of stochastic block models
Gaussian random projections for Euclidean membership problems
A unified framework for linear dimensionality reduction in L1
Dimension reduction in vertex-weighted exponential random graphs
Compressive Sensing with Redundant Dictionaries and Structured Measurements
Fast Phase Retrieval from Local Correlation Measurements
Random fusion frames are nearly equiangular and tight
The coefficient regularized regression with random projection
Numerical bifurcation analysis of PDEs from lattice Boltzmann model simulations: a parsimonious machine learning approach
A Survey of Compressed Sensing
Overcomplete Order-3 Tensor Decomposition, Blind Deconvolution, and Gaussian Mixture Models
Performance of Johnson-Lindenstrauss Transform for k-Means and k-Medians Clustering
Sparser Johnson-Lindenstrauss Transforms
Far-field compression for fast kernel summation methods in high dimensions
Rigorous RG algorithms and area laws for low energy eigenstates in 1D
Optimal stable nonlinear approximation
Equality, Revisited
On the distance concentration awareness of certain data reduction techniques
Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument
A simple test for zero multiple correlation coefficient in high-dimensional normal data using random projection
On the structured backward error of inexact Arnoldi methods for (skew)-Hermitian and (skew)-symmetric eigenvalue problems
Randomized algorithms in numerical linear algebra
Distance geometry and data science
Randomized anisotropic transform for nonlinear dimensionality reduction
The geometry of off-the-grid compressed sensing
Data-independent random projections from the feature-map of the homogeneous polynomial kernel of degree two
Algorithmic paradigms for stability-based cluster validity and model selection statistical methods, with applications to microarray data analysis
A variant of the Johnson-Lindenstrauss lemma for circulant matrices
Random projections of linear and semidefinite problems with linear inequalities
Representation and coding of signal geometry
Time for dithering: fast and quantized random embeddings via the restricted isometry property
Literature survey on low rank approximation of matrices
On variants of the Johnson–Lindenstrauss lemma
Acceleration of randomized Kaczmarz method via the Johnson-Lindenstrauss lemma
Linear regression with sparsely permuted data
A Uniform Lower Error Bound for Half-Space Learning
EVD dualdating based online subspace learning
Structure from Randomness in Halfspace Learning with the Zero-One Loss
An Introduction to Compressed Sensing
Classification Scheme for Binary Data with Extensions
Near-optimal encoding for sigma-delta quantization of finite frame expansions
A central limit theorem for convex sets
Targeted Random Projection for Prediction From High-Dimensional Features
Statistical mechanics of complex neural systems and high dimensional data
Compressive sensing using chaotic sequence based on Chebyshev map
Using projection-based clustering to find distance- and density-based clusters in high-dimensional data
On Geometric Prototype and Applications
Blessing of dimensionality: mathematical foundations of the statistical physics of data
Euclidean distance between Haar orthogonal and Gaussian matrices
On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms
High-dimensional approximate r-nets
On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems
Dimensionality reduction with subgaussian matrices: a unified theory
Frames as Codes
Convergence rates of learning algorithms by random projection
Randomized large distortion dimension reduction
Fast, linear time, m-adic hierarchical clustering for search and retrieval using the Baire metric, with linkages to generalized ultrametrics, hashing, formal concept analysis, and precision of data measurement
Two-dimensional random projection
Applications of dimensionality reduction and exponential sums to graph automorphism
Fast directional algorithms for the Helmholtz kernel
Swarm intelligence for self-organized clustering
Statistical methods for tissue array images - algorithmic scoring and co-training
Random Projection RBF Nets for Multidimensional Density Estimation
Dimension Reduction for Polynomials over Gaussian Space and Applications
The Bottleneck Degree of Algebraic Varieties
A note on linear function approximation using random projections
On Using Toeplitz and Circulant Matrices for Johnson-Lindenstrauss Transforms
Simple Classification using Binary Data
Optimal Bounds for Johnson-Lindenstrauss Transformations
De-noising by thresholding operator adapted wavelets
Almost Optimal Explicit Johnson-Lindenstrauss Families
Bayesian random projection-based signal detection for Gaussian scale space random fields
A Computationally Efficient Projection-Based Approach for Spatial Generalized Linear Mixed Models
Optimal fast Johnson-Lindenstrauss embeddings for large data sets
Johnson-Lindenstrauss lemma for circulant matrices
Dimensionality reduction for k-distance applied to persistent homology
Sparse Learning for Large-Scale and High-Dimensional Data: A Randomized Convex-Concave Optimization Approach
Random projections of smooth manifolds
How Accurately Should I Compute Implicit Matrix-Vector Products When Applying the Hutchinson Trace Estimator?
A Well-Tempered Landscape for Non-convex Robust Subspace Recovery
The complexity of quantum disjointness
High-dimensional model recovery from random sketched data by exploring intrinsic sparsity
A Powerful Bayesian Test for Equality of Means in High Dimensions
Fast and memory-optimal dimension reduction using Kac's walk
On the efficiency of Hamiltonian-based quantum computation for low-rank matrices
Variance reduction in feature hashing using MLE and control variate method
Robust Width: A Characterization of Uniformly Stable and Robust Compressed Sensing
Efficient algorithms for privately releasing marginals via convex relaxations
Random projections and Hotelling's T2 statistics for change detection in high-dimensional data streams
Random projections as regularizers: learning a linear discriminant from fewer observations than dimensions
Two Models of Double Descent for Weak Features
Sparse control of alignment models in high dimension