Pages that link to "Item:Q4798181"
From MaRDI portal
The following pages link to An elementary proof of a theorem of Johnson and Lindenstrauss (Q4798181):
Displayed 50 items.
- Swarm intelligence for self-organized clustering (Q86925)
- Sparse control of alignment models in high dimension (Q258529)
- Sampling quantum nonlocal correlations with high probability (Q293045)
- A unified framework for linear dimensionality reduction in L1 (Q310869)
- Algorithmic paradigms for stability-based cluster validity and model selection statistical methods, with applications to microarray data analysis (Q418747)
- Near-optimal encoding for sigma-delta quantization of finite frame expansions (Q485203)
- Compressive sensing using chaotic sequence based on Chebyshev map (Q494825)
- Dimensionality reduction with subgaussian matrices: a unified theory (Q515989)
- Two-dimensional random projection (Q537260)
- Applications of dimensionality reduction and exponential sums to graph automorphism (Q551192)
- On the distance concentration awareness of certain data reduction techniques (Q614077)
- Randomized anisotropic transform for nonlinear dimensionality reduction (Q623745)
- A variant of the Johnson-Lindenstrauss lemma for circulant matrices (Q629700)
- Acceleration of randomized Kaczmarz method via the Johnson-Lindenstrauss lemma (Q639988)
- Statistical methods for tissue array images -- algorithmic scoring and co-training (Q714381)
- High-dimensional model recovery from random sketched data by exploring intrinsic sparsity (Q782446)
- Fast directional algorithms for the Helmholtz kernel (Q975663)
- Random projections of smooth manifolds (Q1029551)
- Exponential random graphs behave like mixtures of stochastic block models (Q1634185)
- Gaussian random projections for Euclidean membership problems (Q1634768)
- On the structured backward error of inexact Arnoldi methods for (skew)-Hermitian and (skew)-symmetric eigenvalue problems (Q1689317)
- Linear regression with sparsely permuted data (Q1711600)
- EVD dualdating based online subspace learning (Q1718370)
- Euclidean distance between Haar orthogonal and Gaussian matrices (Q1745260)
- Fast, linear time, \(m\)-adic hierarchical clustering for search and retrieval using the Baire metric, with linkages to generalized ultrametrics, hashing, formal concept analysis, and precision of data measurement (Q1760308)
- On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms (Q1986965)
- High-dimensional approximate \(r\)-nets (Q1987244)
- On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems (Q1988352)
- Bayesian random projection-based signal detection for Gaussian scale space random fields (Q2058557)
- Optimal fast Johnson-Lindenstrauss embeddings for large data sets (Q2059797)
- Dimensionality reduction for \(k\)-distance applied to persistent homology (Q2063202)
- Fast and memory-optimal dimension reduction using Kac's walk (Q2090615)
- Variance reduction in feature hashing using MLE and control variate method (Q2102328)
- Side-constrained minimum sum-of-squares clustering: mathematical programming and random projections (Q2131141)
- Dimension reduction in vertex-weighted exponential random graphs (Q2143327)
- Numerical bifurcation analysis of PDEs from lattice Boltzmann model simulations: a parsimonious machine learning approach (Q2149520)
- Guarantees for the Kronecker fast Johnson-Lindenstrauss transform using a coherence and sampling argument (Q2185843)
- A simple test for zero multiple correlation coefficient in high-dimensional normal data using random projection (Q2189584)
- Distance geometry and data science (Q2192022)
- Data-independent random projections from the feature-map of the homogeneous polynomial kernel of degree two (Q2195436)
- Using projection-based clustering to find distance- and density-based clusters in high-dimensional data (Q2236772)
- Convergence rates of learning algorithms by random projection (Q2252501)
- Randomized large distortion dimension reduction (Q2253922)
- De-noising by thresholding operator adapted wavelets (Q2302448)
- Efficient algorithms for privately releasing marginals via convex relaxations (Q2349860)
- Random projections as regularizers: learning a linear discriminant from fewer observations than dimensions (Q2353006)
- Far-field compression for fast kernel summation methods in high dimensions (Q2397164)
- Rigorous RG algorithms and area laws for low energy eigenstates in 1D (Q2412377)
- A central limit theorem for convex sets (Q2457766)
- A note on linear function approximation using random projections (Q2519761)