An elementary proof of a theorem of Johnson and Lindenstrauss

From MaRDI portal

Publication:4798181

DOI: 10.1002/rsa.10073
zbMath: 1018.51010
OpenAlex: W2088658556
Wikidata: Q56115767
Scholia: Q56115767
MaRDI QID: Q4798181

Sanjoy Dasgupta, Anupam Gupta

Publication date: 19 March 2003

Published in: Random Structures and Algorithms

Full work available at URL: https://doi.org/10.1002/rsa.10073
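For reference, the result proved in this paper is the Johnson-Lindenstrauss lemma with the explicit bound on the target dimension obtained by Dasgupta and Gupta (statement paraphrased from the paper):

For any \(0 < \varepsilon < 1\) and any integer \(n\), let \(k\) be a positive integer such that
\[ k \ge 4 \left( \frac{\varepsilon^2}{2} - \frac{\varepsilon^3}{3} \right)^{-1} \ln n. \]
Then for any set \(V\) of \(n\) points in \(\mathbb{R}^d\), there is a map \(f : \mathbb{R}^d \to \mathbb{R}^k\) such that for all \(u, v \in V\),
\[ (1 - \varepsilon) \, \|u - v\|^2 \le \|f(u) - f(v)\|^2 \le (1 + \varepsilon) \, \|u - v\|^2, \]
and this map can be found in randomized polynomial time.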




Related Items (only showing first 100 items)

Johnson–Lindenstrauss Embeddings with Kronecker Structure
Greedy Algorithm Almost Dominates in Smoothed Contextual Bandits
Towards Practical Large-Scale Randomized Iterative Least Squares Solvers through Uncertainty Quantification
An Outer-Product-of-Gradient Approach to Dimension Reduction and its Application to Classification in High Dimensional Space
A preconditioned iterative interior point approach to the conic bundle subproblem
Stable probability of reduced matrix obtained by Gaussian random projection
\( \varepsilon \)-isometric dimension reduction for incompressible subsets of \(\ell_p\)
On fast Johnson-Lindenstrauss embeddings of compact submanifolds of \(\mathbb{R}^N\) with boundary
Random Projection and Recovery for High Dimensional Optimization with Arbitrary Outliers
The effect of intrinsic dimension on the Bayes-error of projected quadratic discriminant classification
Distributed estimation and inference for spatial autoregression model with large scale networks
On the distance to low-rank matrices in the maximum norm
Visual Categorization with Random Projection
Uniform Error Estimates for the Lanczos Method
Partitioning Well-Clustered Graphs: Spectral Clustering Works!
Random Projections for Linear Programming
Quasi-linear Compressed Sensing
Unnamed Item
Conservative confidence intervals on multiple correlation coefficient for high-dimensional elliptical data using random projection methodology
Side-constrained minimum sum-of-squares clustering: mathematical programming and random projections
Sampling quantum nonlocal correlations with high probability
Deterministic Truncation of Linear Matroids
Exponential random graphs behave like mixtures of stochastic block models
Gaussian random projections for Euclidean membership problems
A unified framework for linear dimensionality reduction in L1
Dimension reduction in vertex-weighted exponential random graphs
Compressive Sensing with Redundant Dictionaries and Structured Measurements
Fast Phase Retrieval from Local Correlation Measurements
Random fusion frames are nearly equiangular and tight
THE COEFFICIENT REGULARIZED REGRESSION WITH RANDOM PROJECTION
Numerical bifurcation analysis of PDEs from lattice Boltzmann model simulations: a parsimonious machine learning approach
A Survey of Compressed Sensing
Overcomplete Order-3 Tensor Decomposition, Blind Deconvolution, and Gaussian Mixture Models
Performance of Johnson--Lindenstrauss Transform for $k$-Means and $k$-Medians Clustering
Sparser Johnson-Lindenstrauss Transforms
Far-field compression for fast kernel summation methods in high dimensions
Rigorous RG algorithms and area laws for low energy eigenstates in 1D
Optimal stable nonlinear approximation
Equality, Revisited
On the distance concentration awareness of certain data reduction techniques
Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument
A simple test for zero multiple correlation coefficient in high-dimensional normal data using random projection
On the structured backward error of inexact Arnoldi methods for (skew)-Hermitian and (skew)-symmetric eigenvalue problems
Randomized algorithms in numerical linear algebra
Distance geometry and data science
Randomized anisotropic transform for nonlinear dimensionality reduction
The geometry of off-the-grid compressed sensing
Data-independent random projections from the feature-map of the homogeneous polynomial kernel of degree two
Algorithmic paradigms for stability-based cluster validity and model selection statistical methods, with applications to microarray data analysis
A variant of the Johnson-Lindenstrauss lemma for circulant matrices
Random projections of linear and semidefinite problems with linear inequalities
Representation and coding of signal geometry
Time for dithering: fast and quantized random embeddings via the restricted isometry property
Literature survey on low rank approximation of matrices
On variants of the Johnson–Lindenstrauss lemma
Acceleration of randomized Kaczmarz method via the Johnson-Lindenstrauss lemma
Linear regression with sparsely permuted data
A Uniform Lower Error Bound for Half-Space Learning
EVD dualdating based online subspace learning
Structure from Randomness in Halfspace Learning with the Zero-One Loss
An Introduction to Compressed Sensing
Classification Scheme for Binary Data with Extensions
Near-optimal encoding for sigma-delta quantization of finite frame expansions
A central limit theorem for convex sets
Targeted Random Projection for Prediction From High-Dimensional Features
Statistical mechanics of complex neural systems and high dimensional data
Compressive sensing using chaotic sequence based on Chebyshev map
Using projection-based clustering to find distance- and density-based clusters in high-dimensional data
On Geometric Prototype and Applications
Blessing of dimensionality: mathematical foundations of the statistical physics of data
Euclidean distance between Haar orthogonal and Gaussian matrices
On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms
High-dimensional approximate \(r\)-nets
On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems
Dimensionality reduction with subgaussian matrices: a unified theory
Frames as Codes
Convergence rates of learning algorithms by random projection
Randomized large distortion dimension reduction
Fast, linear time, \(m\)-adic hierarchical clustering for search and retrieval using the Baire metric, with linkages to generalized ultrametrics, hashing, formal concept analysis, and precision of data measurement
Two-dimensional random projection
Applications of dimensionality reduction and exponential sums to graph automorphism
Fast directional algorithms for the Helmholtz kernel
Swarm intelligence for self-organized clustering
Statistical methods for tissue array images -- algorithmic scoring and co-training
Unnamed Item
Random Projection RBF Nets for Multidimensional Density Estimation
Dimension Reduction for Polynomials over Gaussian Space and Applications
The Bottleneck Degree of Algebraic Varieties
A note on linear function approximation using random projections
On Using Toeplitz and Circulant Matrices for Johnson-Lindenstrauss Transforms
Simple Classification using Binary Data
Optimal Bounds for Johnson-Lindenstrauss Transformations
De-noising by thresholding operator adapted wavelets
Almost Optimal Explicit Johnson-Lindenstrauss Families
Bayesian random projection-based signal detection for Gaussian scale space random fields
A Computationally Efficient Projection-Based Approach for Spatial Generalized Linear Mixed Models
Optimal fast Johnson-Lindenstrauss embeddings for large data sets
Johnson-Lindenstrauss lemma for circulant matrices
Dimensionality reduction for \(k\)-distance applied to persistent homology
Sparse Learning for Large-Scale and High-Dimensional Data: A Randomized Convex-Concave Optimization Approach







