On robustness of model selection criteria based on divergence measures: Generalizations of BHHJ divergence-based method and comparison
Publication:6549205
Cites work
- scientific article; zbMATH DE number 4215168 (no title available)
- scientific article; zbMATH DE number 3911472 (no title available)
- scientific article; zbMATH DE number 3954047 (no title available)
- scientific article; zbMATH DE number 1560711 (no title available)
- scientific article; zbMATH DE number 3444596 (no title available)
- scientific article; zbMATH DE number 3252891 (no title available)
- $\alpha$-Divergence Is Unique, Belonging to Both $f$-Divergence and Bregman Divergence Classes
- A Bayesian analysis of the minimum AIC procedure
- A comparison of related density-based minimum divergence estimators
- A discrete probabilistic model for analyzing pairwise comparison matrices
- A generalized divergence for statistical inference
- A model selection criterion based on the BHHJ measure of divergence
- A new look at the statistical model identification
- A robust generalization and asymptotic properties of the model selection criterion family
- AIC for the Lasso in generalized linear models
- AIC for the non-concave penalized likelihood method
- An improved divergence information criterion for the determination of the order of an AR process
- Bayesian information criteria and smoothing parameter selection in radial basis function networks
- Choosing a robustness tuning parameter
- Decomposable pseudodistances and applications in statistical estimation
- Estimating the dimension of a model
- Generalised information criteria in model selection
- Markov Processes and the H-Theorem
- Minimax Aspects of Bounded-Influence Regression
- Negative binomial and mixed Poisson regression
- On Information and Sufficiency
- On properties of the \((\Phi, a)\)-power divergence family with applications in goodness of fit tests
- On the `optimal' density power divergence tuning parameter
- On the consistency and the robustness in model selection criteria
- Robust and efficient estimation by minimising a density power divergence
- Robust estimation for non-homogeneous data and the selection of the optimal tuning parameter: the density power divergence approach
- Robust estimation in generalized linear models: the density power divergence approach
- Robust estimation in the normal mixture model
- Robust parameter estimation with a small bias against heavy contamination
- Robust statistical inference based on the \(C\)-divergence family
- Robust tests based on minimum density power divergence estimators and saddlepoint approximations
- Several applications of divergence criteria in continuous families
- Testing statistical hypotheses based on the density power divergence