Fast dimension reduction using Rademacher series on dual BCH codes
From MaRDI portal
Publication: 1042451
DOI: 10.1007/s00454-008-9110-x
zbMath: 1180.94083
OpenAlex: W2124659530
MaRDI QID: Q1042451
Publication date: 14 December 2009
Published in: Discrete & Computational Geometry
Full work available at URL: https://doi.org/10.1007/s00454-008-9110-x
Mathematics Subject Classification:
- Local theory of Banach spaces (46B07)
- Application of orthogonal and other special functions (94A11)
- Burst-correcting codes (94B20)
Related Items
- Tighter Fourier Transform Lower Bounds
- Improved bounds for sparse recovery from subsampled random convolutions
- A stability result using the matrix norm to bound the permanent
- Performance of Johnson-Lindenstrauss Transform for $k$-Means and $k$-Medians Clustering
- Sparser Johnson-Lindenstrauss Transforms
- RidgeSketch: A Fast Sketching Based Solver for Large Scale Ridge Regression
- Faster Kernel Ridge Regression Using Sketching and Preconditioning
- Distributed learning for sketched kernel regression
- On fast Johnson-Lindenstrauss embeddings of compact submanifolds of \(\mathbb{R}^N\) with boundary
- Fast Metric Embedding into the Hamming Cube
- A variant of the Johnson-Lindenstrauss lemma for circulant matrices
- Dense fast random projections and Lean Walsh transforms
- Newton Sketch: A Near Linear-Time Optimization Algorithm with Linear-Quadratic Convergence
- On deterministic sketching and streaming for sparse recovery and norm estimation
- Simple Analyses of the Sparse Johnson-Lindenstrauss Transform
- Toward a unified theory of sparse dimensionality reduction in Euclidean space
- On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms
- Real-valued embeddings and sketches for fast distance and similarity estimation
- Fast and RIP-optimal transforms
- Random projections for Bayesian regression
- On Using Toeplitz and Circulant Matrices for Johnson-Lindenstrauss Transforms
- Randomized linear algebra for model reduction. I. Galerkin methods and error estimation
- Optimal fast Johnson-Lindenstrauss embeddings for large data sets
- Compressed dictionary learning
- Fast and memory-optimal dimension reduction using Kac's walk
- Estimating Leverage Scores via Rank Revealing Methods and Randomization
- The Restricted Isometry Property of Subsampled Fourier Matrices
- Randomized Gram-Schmidt Process with Application to GMRES
Cites Work
- Faster least squares approximation
- A fast randomized algorithm for the approximation of matrices
- The Johnson-Lindenstrauss lemma and the sphericity of some graphs
- Problems and results in extremal combinatorics. I.
- An algorithmic theory of learning: Robust concepts and random projection
- Approximate nearest neighbors and the fast Johnson-Lindenstrauss transform
- Extensions of Lipschitz mappings into a Hilbert space
- On variants of the Johnson-Lindenstrauss lemma
- Projection constants of symmetric spaces and variants of Khintchine's inequality
- Efficient Search for Approximate Nearest Neighbor in High Dimensional Spaces
- Fast Monte-Carlo algorithms for finding low-rank approximations
- Fast Monte Carlo Algorithms for Matrices II: Computing a Low-Rank Approximation to a Matrix
- Fast Monte Carlo Algorithms for Matrices III: Computing a Compressed Approximate Matrix Decomposition
- Lower bounds for linear degeneracy testing