Initializing the EM algorithm in Gaussian mixture models with an unknown number of components
DOI: 10.1016/j.csda.2011.11.002 · zbMath: 1246.65025 · OpenAlex: W2078124810 · MaRDI QID: Q434890
Igor Melnykov, Volodymyr Melnykov
Publication date: 16 July 2012
Published in: Computational Statistics and Data Analysis
Full work available at URL: https://doi.org/10.1016/j.csda.2011.11.002
Keywords: convergence; numerical examples; cluster analysis; expectation-maximization algorithm; truncated normal distribution; initialization; eigenvalue decomposition; multivariate Gaussian mixture models
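The record's topic, initializing the EM algorithm for Gaussian mixture models, can be illustrated with a minimal sketch (this is not the authors' proposed method): a common baseline strategy, discussed in the cited works on choosing starting values, is to run EM from several random initializations and keep the solution with the highest log-likelihood.

```python
import numpy as np

def em_gmm(X, k, n_iter=100, seed=0):
    """One EM run for a univariate Gaussian mixture with k components,
    initialized from k randomly chosen data points (random-start strategy)."""
    rng = np.random.default_rng(seed)
    n = len(X)
    mu = rng.choice(X, size=k, replace=False)   # random-start means
    sigma = np.full(k, X.std())                 # common initial spread
    pi = np.full(k, 1.0 / k)                    # equal initial weights
    ll = -np.inf
    for _ in range(n_iter):
        # E-step: component densities and responsibilities.
        dens = (pi / (sigma * np.sqrt(2 * np.pi))
                * np.exp(-0.5 * ((X[:, None] - mu) / sigma) ** 2))
        total = dens.sum(axis=1)
        ll = np.log(total).sum()
        r = dens / total[:, None]
        # M-step: update mixing weights, means, and standard deviations.
        nk = r.sum(axis=0)
        pi = nk / n
        mu = (r * X[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((r * (X[:, None] - mu) ** 2).sum(axis=0) / nk)
    return ll, mu, sigma, pi

# Synthetic two-component data; well separated for illustration.
rng = np.random.default_rng(1)
X = np.concatenate([rng.normal(-3, 1, 200), rng.normal(3, 1, 200)])
# Multiple random starts; keep the highest-likelihood solution.
best = max((em_gmm(X, 2, seed=s) for s in range(5)), key=lambda t: t[0])
print(np.round(np.sort(best[1]), 1))
```

Because the mixture likelihood is multimodal, a single EM run can stall at a poor local maximum; comparing several starts by final log-likelihood is the simple baseline against which more refined initialization schemes, such as the one in this publication, are typically measured.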
Related Items (23)
Uses Software
Cites Work
- Model-based classification via mixtures of multivariate \(t\)-distributions
- Choosing starting values for the EM algorithm for getting the highest likelihood in multivariate Gaussian mixture models
- Choosing initial values for the EM algorithm for finite mixtures
- Two-way Poisson mixture models for simultaneous document classification and word clustering
- Finite mixture models and model-based clustering
- Variational approximations in Bayesian model selection for finite mixture distributions
- High-dimensional data clustering
- Estimating the dimension of a model
- Maximum likelihood estimation for multivariate skew normal mixture models
- Finding Groups in Data
- Discrete Parameter Variation: Efficient Estimation of a Switching Regression Model
- Algorithms for Model-Based Gaussian Hierarchical Clustering
- A Statistical Model for Positron Emission Tomography
- Convergence properties of a general algorithm for calculating variational Bayesian estimates for a normal mixture model