Scalable kernel k-means clustering with Nyström approximation: relative-error bounds
From MaRDI portal
Publication:4633019
Recommendations
- Fast spectral clustering via the Nyström method
- Approximating spectral clustering via sampling: a review
- Fast kernel \(k\)-means clustering using incomplete Cholesky factorization
- On the Nyström method for approximating a Gram matrix for improved kernel-based learning
- Dimensionality reduction for \(k\)-means clustering and low rank approximation
Cites work
- scientific article; zbMATH DE number 6381735
- scientific article; zbMATH DE number 2062634
- scientific article; zbMATH DE number 910879
- scientific article; zbMATH DE number 6765491
- scientific article; zbMATH DE number 6781341
- scientific article; zbMATH DE number 6276143
- scientific article; zbMATH DE number 3337135
- scientific article; zbMATH DE number 3417498
- DOI: 10.1162/153244303321897735
- A local search approximation algorithm for \(k\)-means clustering
- Algorithms and Computation
- An improved approximation algorithm for the column subset selection problem
- Compression of motion capture databases
- Dimensionality reduction for \(k\)-means clustering and low rank approximation
- Extensions of Lipschitz mappings into a Hilbert space
- Fast Monte Carlo Algorithms for Matrices I: Approximating Matrix Multiplication
- Fast approximation of matrix coherence and statistical leverage
- Faster least squares approximation
- Finding structure with randomness: probabilistic algorithms for constructing approximate matrix decompositions
- Improved Bounds for the Nyström Method With Application to Kernel Classification
- Improved analysis of the subsampled randomized Hadamard transform
- Improving CUR matrix decomposition and the Nyström approximation via adaptive sampling
- Input sparsity time low-rank approximation via ridge leverage score sampling
- Least squares quantization in PCM
- Linear-time approximation schemes for clustering problems in any dimensions
- Low-distortion subspace embeddings in input-sparsity time and applications to robust linear regression
- Lower Bounds for the Partitioning of Graphs
- MLlib: machine learning in Apache Spark
- NP-hardness of Euclidean sum-of-squares clustering
- Near-optimal column-based matrix reconstruction
- On Coresets for k-Median and k-Means Clustering in Metric and Euclidean Spaces and Their Applications
- On approximate geometric \(k\)-clustering
- On coresets for k-means and k-median clustering
- On the Nyström method for approximating a Gram matrix for improved kernel-based learning
- Practical sketching algorithms for low-rank matrix approximation
- Random Projection Trees for Vector Quantization
- Random matrices: The distribution of the smallest singular values
- Randomized Algorithms for Matrices and Data
- Randomized Dimensionality Reduction for \(k\)-Means Clustering
- Relative-Error \(CUR\) Matrix Decompositions
- Revisiting the Nyström method for improved large-scale machine learning
- SPSD matrix approximation via column selection: theories, algorithms, and extensions
- Semi-supervised clustering
- Sketched ridge regression: optimization perspective, statistical perspective, and model averaging
- Sketching as a tool for numerical linear algebra
- Spectral partitioning works: planar graphs and finite element meshes
- Sublinear randomized algorithms for skeleton decompositions
- The Littlewood-Offord problem and invertibility of random matrices
- The Planar k-Means Problem is NP-Hard
- The complexity of the generalized Lloyd-Max problem (Corresp.)
- The hardness of approximation of Euclidean \(k\)-means
- Towards more efficient SPSD matrix approximation and CUR matrix decomposition
- Turning big data into tiny data: constant-size coresets for \(k\)-means, PCA and projective clustering
Cited in (18)
- Perturbations of CUR Decompositions
- Energy-based sequential sampling for low-rank PSD-matrix approximation
- scientific article; zbMATH DE number 7758314
- A new robust fuzzy clustering validity index for imbalanced data sets
- Breaking the curse of dimensionality: hierarchical Bayesian network model for multi-view clustering
- Diversity sampling is an implicit regularization for kernel methods
- Coresets for kernel clustering
- Constrained clustering and multiple kernel learning without pairwise constraint relaxation
- scientific article; zbMATH DE number 7625159
- A feasible \(k\)-means kernel trick under non-Euclidean feature space
- Large-scale non-negative subspace clustering based on Nyström approximation
- Approximating spectral clustering via sampling: a review
- Randomized numerical linear algebra: Foundations and algorithms
- Core-elements for large-scale least squares estimation
- Fast kernel \(k\)-means clustering using incomplete Cholesky factorization
- Randomized Low-Rank Approximation for Symmetric Indefinite Matrices
- Fast spectral clustering via the Nyström method
- Randomized Spectral Clustering in Large-Scale Stochastic Block Models