Recovering Trees with Convex Clustering
DOI: 10.1137/18M121099X
zbMath: 1482.68194
arXiv: 1806.11096
OpenAlex: W2963291751
Wikidata: Q127541041 (Scholia: Q127541041)
MaRDI QID: Q5025783
Stefan Steinerberger, Eric C. Chi
Publication date: 3 February 2022
Published in: SIAM Journal on Mathematics of Data Science
Full work available at URL: https://arxiv.org/abs/1806.11096
Classification (MSC):
- 62J07 Ridge regression; shrinkage estimators (Lasso)
- 62H30 Classification and discrimination; cluster analysis (statistical aspects)
- 90C25 Convex programming
- 68T05 Learning and adaptive systems in artificial intelligence
Related Items
- Biconvex Clustering
- Clustering multivariate count data via Dirichlet-multinomial network fusion
- A novel convex clustering method for high-dimensional data using semiproximal ADMM
- A Dimension Reduction Technique for Large-Scale Structured Sparse Optimization Problems with Application to Convex Clustering
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Solution path clustering with adaptive concave penalty
- Statistical properties of convex clustering
- One-step sparse estimates in nonconcave penalized likelihood models
- Sparse regression with exact clustering
- Majorization-minimization algorithms for nonsmoothly penalized objective functions
- Hierarchical clustering schemes
- Diffusion maps
- Sparse Convex Clustering
- A Survey of Recent Advances in Hierarchical Clustering Algorithms
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Convex Clustering via ℓ1 Fusion Penalization
- Sparsity and Smoothness Via the Fused Lasso
- Laplacian Eigenmaps for Dimensionality Reduction and Data Representation
- Convex biclustering
- Homogeneity Pursuit
- Model Selection and Estimation in Regression with Grouped Variables
- Simultaneous Regression Shrinkage, Variable Selection, and Supervised Clustering of Predictors with OSCAR
- Hessian eigenmaps: Locally linear embedding techniques for high-dimensional data