Robust estimation in the normal mixture model
From MaRDI portal
Publication:2500650
DOI: 10.1016/j.jspi.2005.03.008 — zbMath: 1104.62017 — OpenAlex: W2046399133 — MaRDI QID: Q2500650
Shinto Eguchi, Hironori Fujisawa
Publication date: 17 August 2006
Published in: Journal of Statistical Planning and Inference
Full work available at URL: https://doi.org/10.1016/j.jspi.2005.03.008
Keywords: robustness; model selection; efficiency; cross-validation; modified likelihood; unbounded likelihood; gross-error sensitivity; invader assay data; red blood cell SLC data
Related Items (32)
Robust estimation for the covariance matrix of multivariate time series based on normal mixtures ⋮ Minimum density power divergence estimator for Poisson autoregressive models ⋮ A robust approach for testing parameter change in Poisson autoregressive models ⋮ Recent progress in parameter change test for integer-valued time series models ⋮ Duality of maximum entropy and minimum divergence ⋮ A moment-distance hybrid method for estimating a mixture of two symmetric densities ⋮ Iterative Bias Correction of the Cross-Validation Criterion ⋮ Test for parameter change in the presence of outliers: the density power divergence-based approach ⋮ Robust fitting of mixture models using weighted complete estimating equations ⋮ Robust estimation in stochastic frontier models ⋮ On the consistency and the robustness in model selection criteria ⋮ A robust EM clustering approach: ROBEM ⋮ A class of cross-validatory model selection criteria ⋮ A modified EM algorithm for mixture models based on Bregman divergence ⋮ Asymptotic comparison of semi-supervised and supervised linear discriminant functions for heteroscedastic normal populations ⋮ Robust estimation for zero-inflated Poisson autoregressive models based on density power divergence ⋮ Robust estimation for bivariate integer-valued autoregressive models based on minimum density power divergence ⋮ Two-stage hierarchical modeling for analysis of subpopulations in conditional distributions ⋮ Robust estimation for the order of finite mixture models ⋮ Generalized weighted likelihood density estimators with application to finite mixture of exponential family distributions ⋮ Entropy and divergence associated with power function and the statistical application ⋮ Maximum Regularized Likelihood Estimator of Finite Mixtures with a Structural Model ⋮ Robust estimation for order of hidden Markov models based on density power divergences ⋮ Robust estimation for general integer-valued time series models ⋮ Robust test for structural instability in dynamic factor models ⋮ Sequential change point test in the presence of outliers: the density power divergence based approach ⋮ Consistency of minimizing a penalized density power divergence estimator for mixing distribution ⋮ Minimum density power divergence estimator for covariance matrix based on skew \(t\) distribution ⋮ Robust estimation for the covariance matrix of multi-variate time series ⋮ Robust estimation for copula parameter in SCOMDY models ⋮ Robust tests for the equality of two normal means based on the density power divergence ⋮ Robust estimation of location and concentration parameters for the von Mises-Fisher distribution
Cites Work
- A constrained formulation of maximum-likelihood estimation for normal mixture distributions
- Minimum Hellinger distance estimates for parametric models
- Robust estimation of \(k\)-component univariate normal mixtures
- Efficiency versus robustness: The case for minimum Hellinger distance and related methods
- Differential-geometrical methods in statistics
- A Modified Likelihood Ratio Test for Homogeneity in Finite Mixture Models
- A comparison of related density-based minimum divergence estimators
- A Comparison of Minimum Distance and Maximum Likelihood Estimation of a Mixture Proportion
- On the Strong Law of Large Numbers and Related Results for Quasi-Stationary Sequences
- Robust and efficient estimation by minimising a density power divergence
- A Graphical Technique for Determining the Number of Components in a Mixture of Normals
- Minimum Hellinger Distance Estimation for Finite Mixture Models
- Penalized Maximum Likelihood Estimator for Normal Mixtures
- Penalized minimum‐distance estimates in finite mixture models
- Bayesian information criteria and smoothing parameter selection in radial basis function networks
- Robust Statistics
This page was built for publication: Robust estimation in the normal mixture model