How to find a good explanation for clustering?
From MaRDI portal
Publication:6136087
DOI: 10.1016/J.ARTINT.2023.103948
arXiv: 2112.06580
OpenAlex: W4379055191
MaRDI QID: Q6136087
Nidhi Purohit, William Lochet, Kirill Simonov, Fedor V. Fomin, Sayan Bandyapadhyay, Petr A. Golovach
Publication date: 28 August 2023
Published in: Artificial Intelligence
Abstract: \(k\)-means and \(k\)-median clustering are powerful unsupervised machine learning techniques. However, due to complicated dependencies on all the features, it is challenging to interpret the resulting cluster assignments. Moshkovitz, Dasgupta, Rashtchian, and Frost [ICML 2020] proposed an elegant model of explainable \(k\)-means and \(k\)-median clustering. In this model, a decision tree with \(k\) leaves provides a straightforward characterization of the data set into \(k\) clusters. We study two natural algorithmic questions about explainable clustering. (1) For a given clustering, how to find the "best explanation" by using a decision tree with \(k\) leaves? (2) For a given set of points, how to find a decision tree with \(k\) leaves minimizing the \(k\)-means/median objective of the resulting explainable clustering? To address the first question, we introduce a new model of explainable clustering. Our model, inspired by the notion of outliers in robust statistics, is the following. We are seeking a small number of points (outliers) whose removal makes the existing clustering well-explainable. To address the second question, we initiate the study of the model of Moshkovitz et al. from the perspective of multivariate complexity. Our rigorous algorithmic analysis sheds some light on the influence of parameters like the input size, dimension of the data, the number of outliers, the number of clusters, and the approximation ratio, on the computational complexity of explainable clustering.
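To make the explainable-clustering model concrete, here is a minimal sketch (not the authors' algorithm) for \(k = 2\): the decision tree degenerates to a single axis-aligned threshold cut, each of the two leaves is a cluster, and the cut is scored by the \(k\)-means objective. The data set and helper names below are illustrative assumptions.

```python
# A toy sketch of explainable k-means for k = 2 in the model of
# Moshkovitz et al.: a decision tree with k leaves (here, one
# axis-aligned cut producing two leaves) partitions the points, and the
# partition is scored by the k-means objective. Data are made up.

def kmeans_cost(clusters):
    """Sum, over clusters, of squared distances to the cluster mean."""
    total = 0.0
    for pts in clusters:
        if not pts:
            continue
        d = len(pts[0])
        mean = [sum(p[i] for p in pts) / len(pts) for i in range(d)]
        total += sum(sum((p[i] - mean[i]) ** 2 for i in range(d)) for p in pts)
    return total

def best_single_cut(points):
    """Try every axis-aligned threshold (feature, value) and return the
    two-leaf split that minimizes the k-means objective."""
    best = None
    d = len(points[0])
    for feat in range(d):
        for thr in sorted({p[feat] for p in points}):
            left = [p for p in points if p[feat] <= thr]
            right = [p for p in points if p[feat] > thr]
            if not left or not right:
                continue
            cost = kmeans_cost([left, right])
            if best is None or cost < best[0]:
                best = (cost, feat, thr)
    return best  # (cost, feature index, threshold)

points = [(0.0, 0.1), (0.2, 0.0), (5.0, 5.1), (5.2, 4.9)]
cost, feat, thr = best_single_cut(points)
# Cluster membership is now explained by the single test
# "is feature `feat` <= thr?", one comparison per point.
```

For general \(k\) the tree has \(k\) leaves and cuts are chosen recursively; the paper's multivariate-complexity analysis concerns exactly how hard that optimization is in terms of dimension, \(k\), and related parameters.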
Full work available at URL: https://arxiv.org/abs/2112.06580
Cites Work
- Random forests
- NP-hardness of Euclidean sum-of-squares clustering
- Generalized additive models
- Fundamentals of parameterized complexity
- Supersparse linear integer models for optimized medical scoring systems
- Which problems have strongly exponential complexity?
- Algorithms for facility location problems with outliers. (Extended abstract)
- Parameterized Algorithms
- On the complexity of \(k\)-SAT
- Clustering large graphs via the singular value decomposition
- The planar \(k\)-means problem is NP-hard
- Linear-time approximation schemes for clustering problems in any dimensions
- Interpretable clustering using unsupervised binary trees
- On the Parameterized Complexity of Approximating Dominating Set
- On Tackling Explanation Redundancy in Decision Trees
- Kernelization
- Approximation Schemes for Clustering with Outliers
- Constant approximation for k-median and k-means with outliers via iterative rounding
- Interpretable clustering: an optimization approach
- Definitions, methods, and applications in interpretable machine learning
- A Lottery Model for Center-Type Problems With Outliers
Cited In (1)