Dimensionality Reduction for k-Means Clustering and Low Rank Approximation

From MaRDI portal

Publication: 2941504

DOI: 10.1145/2746539.2746569
zbMath: 1321.68398
arXiv: 1410.6801
OpenAlex: W2133157266
MaRDI QID: Q2941504

Michael B. Cohen, Christopher Musco, Madalina Persu, Cameron Musco, Sam Elder

Publication date: 21 August 2015

Published in: Proceedings of the forty-seventh annual ACM symposium on Theory of Computing

Full work available at URL: https://arxiv.org/abs/1410.6801
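The paper's theme can be illustrated with a minimal sketch (not the paper's exact construction or constants): project high-dimensional points with a random Gaussian sketch, then run Lloyd's k-means on the projected data. The paper proves that such projections to a dimension depending on k (rather than the ambient dimension) preserve the k-means cost up to small multiplicative error; the target dimension and data below are purely illustrative assumptions.

```python
import numpy as np

def random_project(X, d_out, rng):
    """Johnson-Lindenstrauss-style Gaussian sketch: (n x d) -> (n x d_out)."""
    d = X.shape[1]
    G = rng.standard_normal((d, d_out)) / np.sqrt(d_out)
    return X @ G

def kmeans(X, k, iters=50):
    """Plain Lloyd's algorithm with deterministic spread-out initialization."""
    centers = X[np.linspace(0, len(X) - 1, k).astype(int)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        # Assign each point to its nearest center, then recompute means.
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = dists.argmin(1)
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(0)
    cost = ((X - centers[labels]) ** 2).sum()
    return labels, cost

rng = np.random.default_rng(0)
# Two well-separated Gaussian blobs in 100 dimensions (illustrative data).
n, d, k = 200, 100, 2
X = np.concatenate([rng.normal(0.0, 1.0, (n // 2, d)),
                    rng.normal(8.0, 1.0, (n // 2, d))])
labels_full, _ = kmeans(X, k)
labels_proj, _ = kmeans(random_project(X, 10, rng), k)
# Agreement between the full-dimensional and projected clusterings
# (taking the better of the two label matchings for k = 2).
agree = (labels_full == labels_proj).mean()
print(max(agree, 1 - agree))
```

Because the two blobs are far apart, the 10-dimensional sketch preserves the cluster structure and both runs recover essentially the same partition.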




Related Items (31)

Side-constrained minimum sum-of-squares clustering: mathematical programming and random projections
Unnamed Item
Infinite lattice learner: an ensemble for incremental learning
Performance of Johnson--Lindenstrauss Transform for $k$-Means and $k$-Medians Clustering
Unnamed Item
Structural conditions for projection-cost preservation via randomized matrix multiplication
An Improved Analysis and Unified Perspective on Deterministic and Randomized Low-Rank Matrix Approximation
A novel method for optimizing spectral rotation embedding \(K\)-means with coordinate descent
Practical Sketching Algorithms for Low-Rank Matrix Approximation
On coresets for fair clustering in metric and Euclidean spaces and their applications
Random Projection and Recovery for High Dimensional Optimization with Arbitrary Outliers
Unnamed Item
Unnamed Item
A Doubly Enhanced EM Algorithm for Model-Based Tensor Clustering
Sketching for Principal Component Regression
Parameterized \(k\)-clustering: tractability island
Core-Sets: Updated Survey
Approximating Spectral Clustering via Sampling: A Review
Turning Big Data Into Tiny Data: Constant-Size Coresets for $k$-Means, PCA, and Projective Clustering
Toward a unified theory of sparse dimensionality reduction in Euclidean space
On using Toeplitz and circulant matrices for Johnson-Lindenstrauss transforms
An efficient \(K\)-means clustering algorithm for tall data
Unnamed Item
Pass-Efficient Randomized Algorithms for Low-Rank Matrix Approximation Using Any Number of Views
Random projections for Bayesian regression
On coresets for support vector machines
On Using Toeplitz and Circulant Matrices for Johnson-Lindenstrauss Transforms
Frequent Directions: Simple and Deterministic Matrix Sketching
Streaming Low-Rank Matrix Approximation with an Application to Scientific Simulation
Unnamed Item
Unnamed Item





