Pages that link to "Item:Q3522833"
From MaRDI portal
The following pages link to On variants of the Johnson–Lindenstrauss lemma (Q3522833):
Displaying 41 items.
- Dimension reduction and construction of feature space for image pattern recognition (Q294416) (← links)
- Indexability, concentration, and VC theory (Q450514) (← links)
- Convexity of the image of a quadratic map via the relative entropy distance (Q464812) (← links)
- Toward a unified theory of sparse dimensionality reduction in Euclidean space (Q496171) (← links)
- Real-valued embeddings and sketches for fast distance and similarity estimation (Q508585) (← links)
- Dimensionality reduction with subgaussian matrices: a unified theory (Q515989) (← links)
- Faster least squares approximation (Q623334) (← links)
- A variant of the Johnson-Lindenstrauss lemma for circulant matrices (Q629700) (← links)
- Dense fast random projections and Lean Walsh transforms (Q629831) (← links)
- Fast dimension reduction using Rademacher series on dual BCH codes (Q1042451) (← links)
- On principal components regression, random projections, and column subsampling (Q1616329) (← links)
- Gaussian random projections for Euclidean membership problems (Q1634768) (← links)
- On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems (Q1988352) (← links)
- Dimensionality reduction for \(k\)-distance applied to persistent homology (Q2063202) (← links)
- On the perceptron's compression (Q2106618) (← links)
- Distance geometry and data science (Q2192022) (← links)
- Linear dimension reduction approximately preserving a function of the $1$-norm (Q2219215) (← links)
- A partitioned quasi-likelihood for distributed statistical inference (Q2228213) (← links)
- Random projections as regularizers: learning a linear discriminant from fewer observations than dimensions (Q2353006) (← links)
- Binary vectors for fast distance and similarity estimation (Q2362826) (← links)
- Random projections of linear and semidefinite problems with linear inequalities (Q2689146) (← links)
- Hypercontractivity via tensor calculus (Q2695312) (← links)
- Random projections and Hotelling’s T² statistics for change detection in high-dimensional data streams (Q2872879) (← links)
- Almost Optimal Explicit Johnson-Lindenstrauss Families (Q3088132) (← links)
- Johnson-Lindenstrauss lemma for circulant matrices (Q3094609) (← links)
- A Survey of Compressed Sensing (Q3460827) (← links)
- Fast Parallel Estimation of High Dimensional Information Theoretical Quantities with Low Dimensional Random Projection Ensembles (Q3614946) (← links)
- (Q4558470) (← links)
- Optimal Bounds for Johnson-Lindenstrauss Transformations (Q4614115) (← links)
- A survey on unsupervised outlier detection in high‐dimensional numerical data (Q4969851) (← links)
- Sampled Tikhonov regularization for large linear inverse problems (Q4973539) (← links)
- Why Are Big Data Matrices Approximately Low Rank? (Q5025778) (← links)
- (Q5115805) (← links)
- Structure from Randomness in Halfspace Learning with the Zero-One Loss (Q5139592) (← links)
- Random Projections for Linear Programming (Q5219688) (← links)
- (Q5743469) (← links)
- Neural network approximation of continuous functions in high dimensions with applications to inverse problems (Q6056231) (← links)
- Towards Practical Large-Scale Randomized Iterative Least Squares Solvers through Uncertainty Quantification (Q6062237) (← links)
- Randomized algorithms for the computation of multilinear rank-\((\mu_1,\mu_2,\mu_3)\) approximations (Q6064026) (← links)
- \( \varepsilon \)-isometric dimension reduction for incompressible subsets of \(\ell_p\) (Q6145674) (← links)
- Modewise operators, the tensor restricted isometry property, and low-rank tensor recovery (Q6172173) (← links)