On variants of the Johnson–Lindenstrauss lemma

From MaRDI portal
Publication:3522833

DOI: 10.1002/rsa.20218
zbMath: 1154.51002
OpenAlex: W3147249397
Wikidata: Q124888350
Scholia: Q124888350
MaRDI QID: Q3522833

Jiří Matoušek

Publication date: 4 September 2008

Published in: Random Structures and Algorithms

Full work available at URL: https://doi.org/10.1002/rsa.20218
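For readers coming to this page cold: the Johnson–Lindenstrauss lemma says that any n points in Euclidean space can be embedded into k = O(ε⁻² log n) dimensions while changing every pairwise distance by a factor of at most 1 ± ε; the paper studies variants of this statement (e.g., with sub-Gaussian or sparse random projection matrices). The minimal Python/NumPy sketch below is not code from the paper; it illustrates only the classical Gaussian construction, and the constant 8 and all problem sizes are arbitrary demonstration choices.

import numpy as np

# Illustrative sketch of the Johnson-Lindenstrauss lemma (not code from the
# paper): project n points from R^d down to k = O(eps^-2 log n) dimensions
# with a scaled Gaussian matrix and check the pairwise-distance distortion.
# The constant 8 and all problem sizes are arbitrary demonstration choices.

rng = np.random.default_rng(0)
n, d, eps = 200, 1000, 0.25
k = int(np.ceil(8 * np.log(n) / eps**2))  # here k = 679 < d = 1000

X = rng.normal(size=(n, d))               # n arbitrary points in R^d
P = rng.normal(size=(d, k)) / np.sqrt(k)  # Gaussian projection, scaled by 1/sqrt(k)
Y = X @ P                                 # embedded points in R^k

def pairwise_distances(Z):
    # Euclidean distances between all distinct pairs of rows of Z.
    sq = np.sum(Z**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (Z @ Z.T)
    i, j = np.triu_indices(len(Z), k=1)
    return np.sqrt(np.maximum(d2[i, j], 0.0))

ratio = pairwise_distances(Y) / pairwise_distances(X)
print(f"k = {k}; distance ratios in [{ratio.min():.3f}, {ratio.max():.3f}]")
# With high probability every ratio lies in [1 - eps, 1 + eps].

The scaling by 1/sqrt(k) makes the projection norm-preserving in expectation; variants of the lemma aim to keep the distortion guarantee while making the projection matrix cheaper to generate, store, or apply.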




Related Items

On principal components regression, random projections, and column subsampling
Binary vectors for fast distance and similarity estimation
Dimension reduction and construction of feature space for image pattern recognition
Gaussian random projections for Euclidean membership problems
A Survey of Compressed Sensing
Neural network approximation of continuous functions in high dimensions with applications to inverse problems
Towards Practical Large-Scale Randomized Iterative Least Squares Solvers through Uncertainty Quantification
Randomized algorithms for the computation of multilinear rank-\((\mu_1,\mu_2,\mu_3)\) approximations
Distance geometry and data science
Faster least squares approximation
\( \varepsilon \)-isometric dimension reduction for incompressible subsets of \(\ell_p\)
A variant of the Johnson-Lindenstrauss lemma for circulant matrices
Modewise operators, the tensor restricted isometry property, and low-rank tensor recovery
Dense fast random projections and Lean Walsh transforms
Random projections of linear and semidefinite problems with linear inequalities
Hypercontractivity via tensor calculus
Indexability, concentration, and VC theory
Linear dimension reduction approximately preserving a function of the $1$-norm
Convexity of the image of a quadratic map via the relative entropy distance
Structure from Randomness in Halfspace Learning with the Zero-One Loss
A partitioned quasi-likelihood for distributed statistical inference
Toward a unified theory of sparse dimensionality reduction in Euclidean space
Real-valued embeddings and sketches for fast distance and similarity estimation
On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems
Dimensionality reduction with subgaussian matrices: a unified theory
Fast Parallel Estimation of High Dimensional Information Theoretical Quantities with Low Dimensional Random Projection Ensembles
Optimal Bounds for Johnson-Lindenstrauss Transformations
A survey on unsupervised outlier detection in high-dimensional numerical data
Random Projections for Linear Programming
Almost Optimal Explicit Johnson-Lindenstrauss Families
Sampled Tikhonov regularization for large linear inverse problems
Johnson-Lindenstrauss lemma for circulant matrices
Dimensionality reduction for \(k\)-distance applied to persistent homology
Fast dimension reduction using Rademacher series on dual BCH codes
On the perceptron's compression
Random projections and Hotelling’s T2 statistics for change detection in high-dimensional data streams
Why Are Big Data Matrices Approximately Low Rank?
Random projections as regularizers: learning a linear discriminant from fewer observations than dimensions





This page was built for publication: On variants of the Johnson–Lindenstrauss lemma