Singularity, misspecification and the convergence rate of EM
DOI: 10.1214/19-AOS1924 · zbMATH Open: 1462.62382 · arXiv: 1810.00828 · OpenAlex: W3113313164 · MaRDI QID: Q1996764
Authors: Raaz Dwivedi, Nhat Ho, Koulik Khamaru, Martin J. Wainwright, Michael Jordan, Bin Yu
Publication date: 26 February 2021
Published in: The Annals of Statistics
Full work available at URL: https://arxiv.org/abs/1810.00828
Recommendations
- Statistical convergence of the EM algorithm on Gaussian mixture models
- Randomly initialized EM algorithm for two-component Gaussian mixture achieves near optimality in \(O(\sqrt{n})\) iterations
- A probabilistic analysis of EM for mixtures of separated, spherical Gaussians
- Improved convergence guarantees for learning Gaussian mixture models by EM and gradient EM
- Statistical guarantees for the EM algorithm: from population to sample-based analysis
Keywords: mixture models · empirical process · Fisher information matrix · localization argument · expectation-maximization (EM) · nonasymptotic convergence guarantees
MSC classifications: Statistical aspects of information-theoretic topics (62B10) · Nonparametric estimation (62G05) · Asymptotic properties of nonparametric inference (62G20) · Classification and discrimination; cluster analysis (statistical aspects) (62H30)
Cites Work
- CHIME: clustering of high-dimensional Gaussian mixtures with EM algorithm and its optimality
- Asymptotic behaviour of the posterior distribution in overfitted mixture models
- Title not available
- On the convergence properties of the EM algorithm
- Title not available
- Bayesian Model Selection in Finite Mixtures by Marginal Density Decompositions
- Dealing With Label Switching in Mixture Models
- Entropies and rates of convergence for maximum likelihood and Bayes estimation for mixtures of normal densities.
- Mixture Densities, Maximum Likelihood and the EM Algorithm
- High-dimensional statistics. A non-asymptotic viewpoint
- Title not available
- Local Rademacher complexities and oracle inequalities in risk minimization. (2004 IMS Medallion Lecture). (With discussions and rejoinder)
- Local Rademacher complexities
- Statistical guarantees for the EM algorithm: from population to sample-based analysis
- Hypothesis test for normal mixture models: the EM approach
- Convergence of latent mixing measures in finite and infinite mixture models
- Optimal rate of convergence for finite mixture models
- Non-finite Fisher information and homogeneity: an EM approach
- Strong identifiability and optimal minimax rates for finite mixture estimation
- Estimating the Coefficients of a Mixture of Two Linear Regressions by Expectation Maximization
- Simultaneous Clustering and Estimation of Heterogeneous Graphical Models
- Asymptotic Convergence Properties of the EM Algorithm for Mixture of Experts
Cited In (10)
- Supermix: sparse regularization for mixtures
- Sequential estimation for mixture of regression models for heterogeneous population
- A diffusion process perspective on posterior contraction rates for parameters
- Improved convergence guarantees for learning Gaussian mixture models by EM and gradient EM
- A Doubly Enhanced EM Algorithm for Model-Based Tensor Clustering
- Sharp global convergence guarantees for iterative nonconvex optimization with random data
- Statistical convergence of the EM algorithm on Gaussian mixture models
- Network Gradient Descent Algorithm for Decentralized Federated Learning
- Randomly initialized EM algorithm for two-component Gaussian mixture achieves near optimality in \(O(\sqrt{n})\) iterations
- Iterative algorithm for discrete structure recovery