On robustness of model selection criteria based on divergence measures: Generalizations of BHHJ divergence-based method and comparison
From MaRDI portal
Publication: 6549205
DOI: 10.1080/03610926.2022.2155788
MaRDI QID: Q6549205
Author: Sumito Kurata
Publication date: 3 June 2024
Published in: Communications in Statistics. Theory and Methods
Cites Work
- Estimating the dimension of a model
- Negative binomial and mixed Poisson regression
- Title not available
- A new look at the statistical model identification
- On Information and Sufficiency
- Title not available
- Title not available
- Title not available
- Title not available
- Bayesian information criteria and smoothing parameter selection in radial basis function networks
- A Bayesian analysis of the minimum AIC procedure
- Robust estimation in the normal mixture model
- A comparison of related density-based minimum divergence estimators
- Robust and efficient estimation by minimising a density power divergence
- Generalised information criteria in model selection
- Title not available
- Robust parameter estimation with a small bias against heavy contamination
- Testing statistical hypotheses based on the density power divergence
- Robust estimation in generalized linear models: the density power divergence approach
- Decomposable pseudodistances and applications in statistical estimation
- Robust estimation for non-homogeneous data and the selection of the optimal tuning parameter: the density power divergence approach
- Choosing a robustness tuning parameter
- $\alpha$-Divergence Is Unique, Belonging to Both $f$-Divergence and Bregman Divergence Classes
- Markov Processes and the H-Theorem
- AIC for the Lasso in generalized linear models
- Minimax Aspects of Bounded-Influence Regression
- Several applications of divergence criteria in continuous families
- Robust tests based on minimum density power divergence estimators and saddlepoint approximations
- A generalized divergence for statistical inference
- A model selection criterion based on the BHHJ measure of divergence
- On the consistency and the robustness in model selection criteria
- On properties of the \((\Phi , a)\)-power divergence family with applications in goodness of fit tests
- AIC for the non-concave penalized likelihood method
- A robust generalization and asymptotic properties of the model selection criterion family
- On the `optimal' density power divergence tuning parameter
- Robust statistical inference based on the \(C\)-divergence family
- An improved divergence information criterion for the determination of the order of an AR process
- A discrete probabilistic model for analyzing pairwise comparison matrices