Statistical inference for normal mixtures with unknown number of components
Publication: 2084470
DOI: 10.1214/22-EJS2061
OpenAlex: W4312843820
MaRDI QID: Q2084470
Mian Huang, Weixin Yao, Shiyi Tang
Publication date: 18 October 2022
Published in: Electronic Journal of Statistics
Full work available at URL: https://projecteuclid.org/journals/electronic-journal-of-statistics/volume-16/issue-2/Statistical-inference-for-normal-mixtures-with-unknown-number-of-components/10.1214/22-EJS2061.full
Cites Work
- Nearly unbiased variable selection under minimax concave penalty
- Hypothesis testing for finite mixture models
- Sparse estimators and the oracle property, or the return of Hodges' estimator
- Convergence rates of parameter estimation for some weakly identifiable finite mixtures
- Hypothesis test for normal mixture models: the EM approach
- Note on the consistency of the maximum likelihood estimate for nonidentifiable distributions
- Asymptotics for likelihood ratio tests under loss of identifiability
- Testing the order of a model using locally conic parametrization: Population mixtures and stationary ARMA processes
- Likelihood ratio inequalities with applications to various mixtures
- Nonconcave penalized likelihood with a diverging number of parameters
- Asymptotics for the likelihood ratio test in a two-component normal mixture model
- Optimal rate of convergence for finite mixture models
- Strong identifiability and optimal minimax rates for finite mixture estimation
- Estimating the number of components in finite mixture models via the group-sort-fuse procedure
- Likelihood inference in some finite mixture models
- Finite mixture and Markov switching models
- Strong oracle optimality of folded concave penalized estimation
- Model Selection for Gaussian Mixture Models
- Variable Selection in Finite Mixture of Regression Models
- Penalized Maximum Likelihood Estimator for Normal Mixtures
- Variable Selection via Nonconcave Penalized Likelihood and its Oracle Properties
- Dealing With Label Switching in Mixture Models
- Testing for a Finite Mixture Model with Two Components
- Detecting a Major Gene in an F2 Population
- Likelihood-Based Selection and Sharp Parameter Estimation
- Bayesian Mixture Labeling by Highest Posterior Density
- Order Selection in Finite Mixture Models With a Nonsmooth Penalty