An elementary proof of a theorem of Johnson and Lindenstrauss

From MaRDI portal
Revision as of 00:37, 8 February 2024 by Import240129110113 (created automatically from import240129110113)

Publication: 4798181

DOI: 10.1002/RSA.10073
zbMath: 1018.51010
OpenAlex: W2088658556
Wikidata: Q56115767
Scholia: Q56115767
MaRDI QID: Q4798181

Sanjoy Dasgupta, Anupam Gupta

Publication date: 19 March 2003

Published in: Random Structures &amp; Algorithms

Full work available at URL: https://doi.org/10.1002/rsa.10073
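As an illustration of the result this record describes (a sketch, not part of the original record): the Johnson-Lindenstrauss lemma states that any n points in R^d can be mapped into k = O(log(n) / eps^2) dimensions so that all squared pairwise distances are preserved up to a factor of (1 ± eps) with high probability, and Dasgupta and Gupta show this for a suitably scaled random Gaussian projection. A minimal NumPy sketch, with the target dimension taken from the bound k ≥ 4 ln(n) / (eps²/2 − eps³/3) stated in the paper:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)
n, d, eps = 100, 1000, 0.5
# target dimension from the paper's bound: k >= 4 ln(n) / (eps^2/2 - eps^3/3)
k = int(np.ceil(4 * np.log(n) / (eps**2 / 2 - eps**3 / 3)))

X = rng.normal(size=(n, d))                 # n random points in R^d
P = rng.normal(size=(d, k)) / np.sqrt(k)    # scaled Gaussian projection matrix
Y = X @ P                                   # images of the points in R^k

# ratio of squared distances after vs. before projection, over all pairs
sq_ratios = [
    np.sum((Y[i] - Y[j]) ** 2) / np.sum((X[i] - X[j]) ** 2)
    for i, j in combinations(range(n), 2)
]
print(f"k = {k}, min ratio = {min(sq_ratios):.3f}, max ratio = {max(sq_ratios):.3f}")
```

With these parameters the extreme ratios concentrate around 1, as the lemma predicts; the data here are random points chosen purely for illustration.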




Related Items (showing the first 100 items)

Side-constrained minimum sum-of-squares clustering: mathematical programming and random projections
Sampling quantum nonlocal correlations with high probability
Deterministic Truncation of Linear Matroids
Exponential random graphs behave like mixtures of stochastic block models
Gaussian random projections for Euclidean membership problems
A unified framework for linear dimensionality reduction in L1
Dimension reduction in vertex-weighted exponential random graphs
Compressive Sensing with Redundant Dictionaries and Structured Measurements
Fast Phase Retrieval from Local Correlation Measurements
Random fusion frames are nearly equiangular and tight
THE COEFFICIENT REGULARIZED REGRESSION WITH RANDOM PROJECTION
Numerical bifurcation analysis of PDEs from lattice Boltzmann model simulations: a parsimonious machine learning approach
A Survey of Compressed Sensing
Overcomplete Order-3 Tensor Decomposition, Blind Deconvolution, and Gaussian Mixture Models
Performance of Johnson--Lindenstrauss Transform for $k$-Means and $k$-Medians Clustering
Sparser Johnson-Lindenstrauss Transforms
Far-field compression for fast kernel summation methods in high dimensions
Rigorous RG algorithms and area laws for low energy eigenstates in 1D
Optimal stable nonlinear approximation
Equality, Revisited
On the distance concentration awareness of certain data reduction techniques
Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument
A simple test for zero multiple correlation coefficient in high-dimensional normal data using random projection
On the structured backward error of inexact Arnoldi methods for (skew)-Hermitian and (skew)-symmetric eigenvalue problems
Randomized algorithms in numerical linear algebra
Distance geometry and data science
Randomized anisotropic transform for nonlinear dimensionality reduction
The geometry of off-the-grid compressed sensing
Data-independent random projections from the feature-map of the homogeneous polynomial kernel of degree two
Algorithmic paradigms for stability-based cluster validity and model selection statistical methods, with applications to microarray data analysis
A variant of the Johnson-Lindenstrauss lemma for circulant matrices
Random projections of linear and semidefinite problems with linear inequalities
Representation and coding of signal geometry
Time for dithering: fast and quantized random embeddings via the restricted isometry property
Literature survey on low rank approximation of matrices
On variants of the Johnson–Lindenstrauss lemma
Acceleration of randomized Kaczmarz method via the Johnson-Lindenstrauss lemma
Linear regression with sparsely permuted data
A Uniform Lower Error Bound for Half-Space Learning
EVD dualdating based online subspace learning
Structure from Randomness in Halfspace Learning with the Zero-One Loss
An Introduction to Compressed Sensing
Classification Scheme for Binary Data with Extensions
Near-optimal encoding for sigma-delta quantization of finite frame expansions
A central limit theorem for convex sets
Targeted Random Projection for Prediction From High-Dimensional Features
Statistical mechanics of complex neural systems and high dimensional data
Compressive sensing using chaotic sequence based on Chebyshev map
Using projection-based clustering to find distance- and density-based clusters in high-dimensional data
On Geometric Prototype and Applications
Blessing of dimensionality: mathematical foundations of the statistical physics of data
Euclidean distance between Haar orthogonal and Gaussian matrices
On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms
High-dimensional approximate \(r\)-nets
On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems
Dimensionality reduction with subgaussian matrices: a unified theory
Frames as Codes
Convergence rates of learning algorithms by random projection
Randomized large distortion dimension reduction
Fast, linear time, \(m\)-adic hierarchical clustering for search and retrieval using the Baire metric, with linkages to generalized ultrametrics, hashing, formal concept analysis, and precision of data measurement
Two-dimensional random projection
Applications of dimensionality reduction and exponential sums to graph automorphism
Fast directional algorithms for the Helmholtz kernel
Swarm intelligence for self-organized clustering
Statistical methods for tissue array images -- algorithmic scoring and co-training
Unnamed Item
Random Projection RBF Nets for Multidimensional Density Estimation
Dimension Reduction for Polynomials over Gaussian Space and Applications
The Bottleneck Degree of Algebraic Varieties
A note on linear function approximation using random projections
On Using Toeplitz and Circulant Matrices for Johnson-Lindenstrauss Transforms
Simple Classification using Binary Data
Optimal Bounds for Johnson-Lindenstrauss Transformations
De-noising by thresholding operator adapted wavelets
Almost Optimal Explicit Johnson-Lindenstrauss Families
Bayesian random projection-based signal detection for Gaussian scale space random fields
A Computationally Efficient Projection-Based Approach for Spatial Generalized Linear Mixed Models
Optimal fast Johnson-Lindenstrauss embeddings for large data sets
Johnson-Lindenstrauss lemma for circulant matrices
Dimensionality reduction for \(k\)-distance applied to persistent homology
Sparse Learning for Large-Scale and High-Dimensional Data: A Randomized Convex-Concave Optimization Approach
Random projections of smooth manifolds
How Accurately Should I Compute Implicit Matrix-Vector Products When Applying the Hutchinson Trace Estimator?
A Well-Tempered Landscape for Non-convex Robust Subspace Recovery
The complexity of quantum disjointness
High-dimensional model recovery from random sketched data by exploring intrinsic sparsity
Unnamed Item
A Powerful Bayesian Test for Equality of Means in High Dimensions
Fast and memory-optimal dimension reduction using Kac's walk
On the efficiency of Hamiltonian-based quantum computation for low-rank matrices
Unnamed Item
Unnamed Item
Unnamed Item
Variance reduction in feature hashing using MLE and control variate method
Robust Width: A Characterization of Uniformly Stable and Robust Compressed Sensing
Efficient algorithms for privately releasing marginals via convex relaxations
Random projections and Hotelling’s T2 statistics for change detection in high-dimensional data streams
Random projections as regularizers: learning a linear discriminant from fewer observations than dimensions
Two Models of Double Descent for Weak Features
Sparse control of alignment models in high dimension




Cites Work




This page was built for publication: An elementary proof of a theorem of Johnson and Lindenstrauss