On variants of the Johnson–Lindenstrauss lemma
From MaRDI portal
Publication: 3522833
DOI: 10.1002/rsa.20218
zbMath: 1154.51002
Wikidata: Q124888350 (Scholia: Q124888350)
MaRDI QID: Q3522833
Publication date: 4 September 2008
Published in: Random Structures and Algorithms
Full work available at URL: https://doi.org/10.1002/rsa.20218
Keywords: dimension reduction; moment generating function; low-distortion embeddings; Johnson-Lindenstrauss Lemma; subgaussian tail
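The keywords describe random-projection dimension reduction with subgaussian tail bounds. As an illustration of the Johnson-Lindenstrauss lemma this record refers to, here is a minimal sketch (assuming NumPy; the constant in the target dimension follows the common \(k \ge 4(\varepsilon^2/2 - \varepsilon^3/3)^{-1}\ln n\) bound from standard proofs, not necessarily this paper's constants):

```python
import numpy as np

rng = np.random.default_rng(0)

n, d, eps = 50, 1000, 0.5
# Target dimension k = O(eps^-2 log n); constant chosen for illustration.
k = int(np.ceil(4 * np.log(n) / (eps**2 / 2 - eps**3 / 3)))

X = rng.normal(size=(n, d))  # arbitrary point set in R^d

# Projection matrix with i.i.d. N(0, 1/k) entries; Rademacher (+-1/sqrt(k))
# entries also work, via the subgaussian-tail / moment-generating-function
# argument the keywords refer to.
P = rng.normal(size=(d, k)) / np.sqrt(k)
Y = X @ P

def pairwise_sq_dists(Z):
    """Squared Euclidean distances over all unordered pairs of rows."""
    G = Z @ Z.T
    sq = np.diag(G)
    D = sq[:, None] + sq[None, :] - 2.0 * G
    return D[np.triu_indices(len(Z), k=1)]

ratios = pairwise_sq_dists(Y) / pairwise_sq_dists(X)
print(ratios.min(), ratios.max())  # with high probability in [1-eps, 1+eps]
```

With high probability, every pairwise squared distance is preserved up to a factor of \(1 \pm \varepsilon\), even though the dimension drops from 1000 to roughly 188.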
Mathematics Subject Classification
- 68Q25: Analysis of algorithms and problem complexity
- 68W40: Analysis of algorithms
- 51K05: General theory of distance geometry
- 51M05: Euclidean geometries (general) and generalizations
Related Items
- Optimal Bounds for Johnson-Lindenstrauss Transformations
- A survey on unsupervised outlier detection in high-dimensional numerical data
- Sampled Tikhonov regularization for large linear inverse problems
- Why Are Big Data Matrices Approximately Low Rank?
- Structure from Randomness in Halfspace Learning with the Zero-One Loss
- Random Projections for Linear Programming
- Neural network approximation of continuous functions in high dimensions with applications to inverse problems
- Randomized algorithms for the computation of multilinear rank-\((\mu_1,\mu_2,\mu_3)\) approximations
- Modewise operators, the tensor restricted isometry property, and low-rank tensor recovery
- Dimension reduction and construction of feature space for image pattern recognition
- Indexability, concentration, and VC theory
- Convexity of the image of a quadratic map via the relative entropy distance
- Toward a unified theory of sparse dimensionality reduction in Euclidean space
- Real-valued embeddings and sketches for fast distance and similarity estimation
- Dimensionality reduction with subgaussian matrices: a unified theory
- Faster least squares approximation
- A variant of the Johnson-Lindenstrauss lemma for circulant matrices
- Dense fast random projections and Lean Walsh transforms
- Fast dimension reduction using Rademacher series on dual BCH codes
- On principal components regression, random projections, and column subsampling
- Gaussian random projections for Euclidean membership problems
- On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems
- Dimensionality reduction for \(k\)-distance applied to persistent homology
- On the perceptron's compression
- Distance geometry and data science
- Linear dimension reduction approximately preserving a function of the $1$-norm
- A partitioned quasi-likelihood for distributed statistical inference
- Random projections as regularizers: learning a linear discriminant from fewer observations than dimensions
- Binary vectors for fast distance and similarity estimation
- Random projections of linear and semidefinite problems with linear inequalities
- Hypercontractivity via tensor calculus
- Random projections and Hotelling's T² statistics for change detection in high-dimensional data streams
- Almost Optimal Explicit Johnson-Lindenstrauss Families
- Johnson-Lindenstrauss lemma for circulant matrices
- A Survey of Compressed Sensing
- Fast Parallel Estimation of High Dimensional Information Theoretical Quantities with Low Dimensional Random Projection Ensembles
Cites Work
- The Johnson-Lindenstrauss lemma and the sphericity of some graphs
- Database-friendly random projections: Johnson-Lindenstrauss with binary coins
- Problems and results in extremal combinatorics. I.
- Empirical processes and random projections
- Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform
- Extensions of Lipschitz mappings into a Hilbert space
- On the impossibility of dimension reduction in \(l_1\)
- Projection constants of symmetric spaces and variants of Khintchine's inequality
- An elementary proof of a theorem of Johnson and Lindenstrauss