Pages that link to "Item:Q1401965"
From MaRDI portal
The following pages link to Database-friendly random projections: Johnson-Lindenstrauss with binary coins. (Q1401965):
Displayed 50 items.
- Dimension reduction and construction of feature space for image pattern recognition (Q294416)
- Derandomizing restricted isometries via the Legendre symbol (Q295815)
- Simple bounds for recovering low-complexity models (Q378116)
- Randomized projective methods for the construction of binary sparse vector representations (Q380676)
- Sparsified randomization algorithms for low rank approximations and applications to integral equations and inhomogeneous random field simulation (Q413911)
- Algorithmic paradigms for stability-based cluster validity and model selection statistical methods, with applications to microarray data analysis (Q418747)
- Streaming techniques and data aggregation in networks of tiny artefacts (Q465668)
- Toward a unified theory of sparse dimensionality reduction in Euclidean space (Q496171)
- Real-valued embeddings and sketches for fast distance and similarity estimation (Q508585)
- Dimensionality reduction with subgaussian matrices: a unified theory (Q515989)
- Two-dimensional random projection (Q537260)
- A variant of the Johnson-Lindenstrauss lemma for circulant matrices (Q629700)
- Dense fast random projections and Lean Walsh transforms (Q629831)
- Limiting laws of coherence of random matrices with applications to testing covariance structure and construction of compressed sensing matrices (Q638800)
- Acceleration of randomized Kaczmarz method via the Johnson-Lindenstrauss lemma (Q639988)
- Randomized interpolative decomposition of separated representations (Q728743)
- R3P-Loc: a compact multi-label predictor using ridge regression and random projection for protein subcellular localization (Q739656)
- High-dimensional model recovery from random sketched data by exploring intrinsic sparsity (Q782446)
- Kernels as features: on kernels, margins, and low-dimensional mappings (Q851869)
- Estimates on compressed neural networks regression (Q889370)
- Vector data transformation using random binary matrices (Q891113)
- A performance driven methodology for cancelable face templates generation (Q969104)
- The Mailman algorithm: a note on matrix-vector multiplication (Q976066)
- On principal components regression, random projections, and column subsampling (Q1616329)
- Gaussian random projections for Euclidean membership problems (Q1634768)
- Efficient clustering on Riemannian manifolds: a kernelised random projection approach (Q1669725)
- Robustness properties of dimensionality reduction with Gaussian random matrices (Q1703873)
- Forecasting using random subspace methods (Q1740303)
- Bayesian compressed vector autoregressions (Q1740345)
- Fuzzy \(c\)-means and cluster ensemble with random projection for big data clustering (Q1793461)
- Efficient extreme learning machine via very sparse random projection (Q1797950)
- On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms (Q1986965)
- On orthogonal projections for dimension reduction and applications in augmented target loss functions for learning problems (Q1988352)
- Efficient large scale global optimization through clustering-based population methods (Q2027033)
- Geometric component analysis and its applications to data analysis (Q2036489)
- Correlations between random projections and the bivariate normal (Q2036781)
- Stochastic quasi-gradient methods: variance reduction via Jacobian sketching (Q2039235)
- Random projections for conic programs (Q2040548)
- A stochastic subspace approach to gradient-free optimization in high dimensions (Q2044475)
- Bayesian random projection-based signal detection for Gaussian scale space random fields (Q2058557)
- Optimal fast Johnson-Lindenstrauss embeddings for large data sets (Q2059797)
- Dimensionality reduction for \(k\)-distance applied to persistent homology (Q2063202)
- Fast and memory-optimal dimension reduction using Kac's walk (Q2090615)
- Variance reduction in feature hashing using MLE and control variate method (Q2102328)
- Recent advances in text-to-pattern distance algorithms (Q2106622)
- Near-neighbor preserving dimension reduction via coverings for doubling subsets of \(\ell_1\) (Q2110372)
- Random-walk based approximate \(k\)-nearest neighbors algorithm for diffusion state distance (Q2128423)
- High-dimensional clustering via random projections (Q2129311)
- Randomized approaches to accelerate MCMC algorithms for Bayesian inverse problems (Q2129320)
- Side-constrained minimum sum-of-squares clustering: mathematical programming and random projections (Q2131141)