An elementary proof of a theorem of Johnson and Lindenstrauss
Publication: 4798181
DOI: 10.1002/rsa.10073 · zbMath: 1018.51010 · OpenAlex: W2088658556 · Wikidata: Q56115767 · Scholia: Q56115767 · MaRDI QID: Q4798181
Publication date: 19 March 2003
Published in: Random Structures & Algorithms
Full work available at URL: https://doi.org/10.1002/rsa.10073
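For context, this publication is Dasgupta and Gupta's short proof of the Johnson–Lindenstrauss lemma: for any \(0 < \varepsilon < 1\) and any \(n\) points in \(\mathbb{R}^d\), a random linear map into \(\mathbb{R}^k\) with \(k = O(\varepsilon^{-2} \log n)\) preserves all pairwise distances to within a factor of \(1 \pm \varepsilon\), with high probability. The sketch below is a minimal NumPy illustration of the Gaussian-projection construction such proofs analyze; the helper name jl_embed is illustrative rather than taken from the paper, and the target-dimension formula follows the commonly cited bound \(k \geq 4(\varepsilon^2/2 - \varepsilon^3/3)^{-1} \ln n\).

    # A minimal sketch of a Johnson-Lindenstrauss random projection
    # (Gaussian-matrix variant; jl_embed is an illustrative name,
    # not code from the paper).
    import numpy as np

    def jl_embed(X, eps=0.5, seed=None):
        """Map the n rows of X from R^d to R^k, k = O(log(n)/eps^2).

        With high probability, every pairwise distance is preserved
        up to a multiplicative factor of 1 +/- eps.
        """
        rng = np.random.default_rng(seed)
        n, d = X.shape
        # Target dimension from the bound k >= 4 ln(n) / (eps^2/2 - eps^3/3).
        k = int(np.ceil(4.0 * np.log(n) / (eps**2 / 2.0 - eps**3 / 3.0)))
        # Gaussian matrix scaled so squared norms are preserved in expectation.
        R = rng.normal(size=(d, k)) / np.sqrt(k)
        return X @ R

    # Usage: 1000 points in R^10000 drop to a few hundred dimensions.
    X = np.random.default_rng(0).normal(size=(1000, 10_000))
    Y = jl_embed(X, eps=0.5, seed=1)
    ratio = np.linalg.norm(Y[0] - Y[1]) / np.linalg.norm(X[0] - X[1])
    print(f"k = {Y.shape[1]}, sample distance ratio = {ratio:.3f}")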
Related Items (only showing first 100 items)
Johnson–Lindenstrauss Embeddings with Kronecker Structure ⋮ Greedy Algorithm Almost Dominates in Smoothed Contextual Bandits ⋮ Towards Practical Large-Scale Randomized Iterative Least Squares Solvers through Uncertainty Quantification ⋮ An Outer-Product-of-Gradient Approach to Dimension Reduction and its Application to Classification in High Dimensional Space ⋮ A preconditioned iterative interior point approach to the conic bundle subproblem ⋮ Stable probability of reduced matrix obtained by Gaussian random projection ⋮ \( \varepsilon \)-isometric dimension reduction for incompressible subsets of \(\ell_p\) ⋮ On fast Johnson-Lindenstrauss embeddings of compact submanifolds of \(\mathbb{R}^N\) with boundary ⋮ Random Projection and Recovery for High Dimensional Optimization with Arbitrary Outliers ⋮ The effect of intrinsic dimension on the Bayes-error of projected quadratic discriminant classification ⋮ Distributed estimation and inference for spatial autoregression model with large scale networks ⋮ On the distance to low-rank matrices in the maximum norm ⋮ Visual Categorization with Random Projection ⋮ Uniform Error Estimates for the Lanczos Method ⋮ Partitioning Well-Clustered Graphs: Spectral Clustering Works! ⋮ Random Projections for Linear Programming ⋮ Quasi-linear Compressed Sensing ⋮ Conservative confidence intervals on multiple correlation coefficient for high-dimensional elliptical data using random projection methodology ⋮ Side-constrained minimum sum-of-squares clustering: mathematical programming and random projections ⋮ Sampling quantum nonlocal correlations with high probability ⋮ Deterministic Truncation of Linear Matroids ⋮ Exponential random graphs behave like mixtures of stochastic block models ⋮ Gaussian random projections for Euclidean membership problems ⋮ A unified framework for linear dimensionality reduction in L1 ⋮ Dimension reduction in vertex-weighted exponential random graphs ⋮ Compressive Sensing with Redundant Dictionaries and Structured Measurements ⋮ Fast Phase Retrieval from Local Correlation Measurements ⋮ Random fusion frames are nearly equiangular and tight ⋮ THE COEFFICIENT REGULARIZED REGRESSION WITH RANDOM PROJECTION ⋮ Numerical bifurcation analysis of PDEs from lattice Boltzmann model simulations: a parsimonious machine learning approach ⋮ A Survey of Compressed Sensing ⋮ Overcomplete Order-3 Tensor Decomposition, Blind Deconvolution, and Gaussian Mixture Models ⋮ Performance of Johnson–Lindenstrauss Transform for \(k\)-Means and \(k\)-Medians Clustering ⋮ Sparser Johnson-Lindenstrauss Transforms ⋮ Far-field compression for fast kernel summation methods in high dimensions ⋮ Rigorous RG algorithms and area laws for low energy eigenstates in 1D ⋮ Optimal stable nonlinear approximation ⋮ Equality, Revisited ⋮ On the distance concentration awareness of certain data reduction techniques ⋮ Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument ⋮ A simple test for zero multiple correlation coefficient in high-dimensional normal data using random projection ⋮ On the structured backward error of inexact Arnoldi methods for (skew)-Hermitian and (skew)-symmetric eigenvalue problems ⋮ Randomized algorithms in numerical linear algebra ⋮ Distance geometry and data science ⋮ Randomized anisotropic transform for nonlinear dimensionality reduction ⋮ The geometry of off-the-grid compressed sensing ⋮ Data-independent random projections from the feature-map of the homogeneous polynomial kernel of degree two ⋮ 
Algorithmic paradigms for stability-based cluster validity and model selection statistical methods, with applications to microarray data analysis ⋮ A variant of the Johnson-Lindenstrauss lemma for circulant matrices ⋮ Random projections of linear and semidefinite problems with linear inequalities ⋮ Representation and coding of signal geometry ⋮ Time for dithering: fast and quantized random embeddings via the restricted isometry property ⋮ Literature survey on low rank approximation of matrices ⋮ On variants of the Johnson–Lindenstrauss lemma ⋮ Acceleration of randomized Kaczmarz method via the Johnson-Lindenstrauss lemma ⋮ Linear regression with sparsely permuted data ⋮ A Uniform Lower Error Bound for Half-Space Learning ⋮ EVD dualdating based online subspace learning ⋮ Structure from Randomness in Halfspace Learning with the Zero-One Loss ⋮ An Introduction to Compressed Sensing ⋮ Classification Scheme for Binary Data with Extensions ⋮ Near-optimal encoding for sigma-delta quantization of finite frame expansions ⋮ A central limit theorem for convex sets ⋮ Targeted Random Projection for Prediction From High-Dimensional Features ⋮ Statistical mechanics of complex neural systems and high dimensional data ⋮ Compressive sensing using chaotic sequence based on Chebyshev map ⋮ Using projection-based clustering to find distance- and density-based clusters in high-dimensional data ⋮ On Geometric Prototype and Applications ⋮ Blessing of dimensionality: mathematical foundations of the statistical physics of data ⋮ Euclidean distance between Haar orthogonal and Gaussian matrices ⋮ On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms ⋮ High-dimensional approximate \(r\)-nets ⋮ On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems ⋮ Dimensionality reduction with subgaussian matrices: a unified theory ⋮ Frames as Codes ⋮ Convergence rates of learning algorithms by random projection ⋮ Randomized large distortion dimension reduction ⋮ Fast, linear time, \(m\)-adic hierarchical clustering for search and retrieval using the Baire metric, with linkages to generalized ultrametrics, hashing, formal concept analysis, and precision of data measurement ⋮ Two-dimensional random projection ⋮ Applications of dimensionality reduction and exponential sums to graph automorphism ⋮ Fast directional algorithms for the Helmholtz kernel ⋮ Swarm intelligence for self-organized clustering ⋮ Statistical methods for tissue array images – algorithmic scoring and co-training ⋮ Random Projection RBF Nets for Multidimensional Density Estimation ⋮ Dimension Reduction for Polynomials over Gaussian Space and Applications ⋮ The Bottleneck Degree of Algebraic Varieties ⋮ A note on linear function approximation using random projections ⋮ On Using Toeplitz and Circulant Matrices for Johnson-Lindenstrauss Transforms ⋮ Simple Classification using Binary Data ⋮ Optimal Bounds for Johnson-Lindenstrauss Transformations ⋮ De-noising by thresholding operator adapted wavelets ⋮ Almost Optimal Explicit Johnson-Lindenstrauss Families ⋮ Bayesian random projection-based signal detection for Gaussian scale space random fields ⋮ A Computationally Efficient Projection-Based Approach for Spatial Generalized Linear Mixed Models ⋮ Optimal fast Johnson-Lindenstrauss embeddings for large data sets ⋮ Johnson-Lindenstrauss lemma for circulant matrices ⋮ Dimensionality reduction for \(k\)-distance applied to persistent homology ⋮ Sparse Learning for Large-Scale and High-Dimensional Data: A Randomized Convex-Concave Optimization Approach