Certifying global optimality of graph cuts via semidefinite relaxation: a performance guarantee for spectral clustering
DOI: 10.1007/s10208-019-09421-3 · zbMATH Open: 1445.90109 · arXiv: 1806.11429 · OpenAlex: W2963057284 · MaRDI QID: Q2189394
Authors: Shuyang Ling, Thomas Strohmer
Publication date: 15 June 2020
Published in: Foundations of Computational Mathematics
Full work available at URL: https://arxiv.org/abs/1806.11429
Keywords: graph partition; community detection; unsupervised learning; semidefinite programming; spectral clustering; graph Laplacian
MSC: Optimality conditions and duality in mathematical programming (90C46); Random matrices (probabilistic aspects) (60B20); Combinatorial optimization (90C27); Semi-infinite programming (90C34)
Cites Work
- Title not available
- Title not available
- Spectral clustering and the high-dimensional stochastic blockmodel
- Diffusion maps
- Least squares quantization in PCM
- The Rotation of Eigenvectors by a Perturbation. III
- Title not available
- Title not available
- NP-hardness of Euclidean sum-of-squares clustering
- Consistency of spectral clustering in stochastic block models
- Community detection and stochastic block models: recent developments
- Consistency of spectral clustering
- Lectures on modern convex optimization. Analysis, algorithms, and engineering applications
- Title not available
- On semidefinite relaxations for the block model
- Exact Recovery in the Stochastic Block Model
- User-friendly tail bounds for sums of random matrices
- Spectra of graphs
- Title not available
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- From graph to manifold Laplacian: the convergence rate
- Graph theory with applications
- Geometric diffusions as a tool for harmonic analysis and structure definition of data: diffusion maps
- Expander flows, geometric embeddings and graph partitioning
- Improved spectral-norm bounds for clustering
- The Planar k-Means Problem is NP-Hard
- Learning Theory
- Title not available
- Approximating K‐means‐type Clustering via Semidefinite Programming
- Random Laplacian matrices and convex relaxations
- Markov chains and mixing times. With a chapter on "Coupling from the past" by James G. Propp and David B. Wilson
- Spectral convergence of the connection Laplacian from random samples
- A variational approach to the consistency of spectral clustering
- Probably certifiably correct \(k\)-means clustering
- Relax, no need to round: integrality of clustering formulations
- When do birds of a feather flock together? \(k\)-means, proximity, and conic programming
- Clustering subgaussian mixtures by semidefinite programming
- Clustering is semidefinitely not that hard: nonnegative SDP for manifold disentangling
Cited In (8)
- The ratio-cut polytope and K-means clustering
- When do birds of a feather flock together? \(k\)-means, proximity, and conic programming
- A performance guarantee for spectral clustering
- Title not available
- Optimality of spectral clustering in the Gaussian mixture model
- Title not available
- Learning by unsupervised nonlinear diffusion
- \(k\)-median: exact recovery in the extended stochastic ball model