Decomposable pseudodistances and applications in statistical estimation
Abstract: The aim of this paper is to introduce new statistical criteria for estimation, suitable for inference in models with a common continuous support. The proposal is in the direct line of a renewed interest in divergence-based inference tools embedding the most classical ones, such as maximum likelihood, chi-square or Kullback-Leibler. General pseudodistances with a decomposable structure are considered, allowing one to define minimum pseudodistance estimators without using nonparametric density estimators. A special class of pseudodistances indexed by \(\alpha>0\), leading for \(\alpha\downarrow 0\) to the Kullback-Leibler divergence, is presented in detail. The corresponding estimation criteria are developed and their asymptotic properties are studied. The estimation method is then extended to regression models. Finally, some examples based on Monte Carlo simulations are discussed.
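As an informal illustration of the minimum pseudodistance idea described in the abstract, the sketch below fits a normal location parameter by minimizing an empirical criterion of the Rényi-pseudodistance type for a fixed \(\alpha>0\). The specific form of the criterion, the choice \(\alpha=0.5\), and all function names are our assumptions for illustration, not the paper's exact definitions; the closed-form first term uses the known integral \(\int \varphi_{\mu,\sigma}^{1+\alpha} = (1+\alpha)^{-1/2}(2\pi\sigma^2)^{-\alpha/2}\) for a normal density.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2)."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def criterion(mu, data, sigma, alpha):
    """Hypothetical empirical pseudodistance criterion for a fixed alpha > 0.

    First term: (1/(1+alpha)) * log of the integral of p^(1+alpha),
    which has a closed form for a normal density (constant in mu).
    Second term: (1/alpha) * log of the empirical mean of p^alpha,
    which replaces the integral under the unknown distribution --
    no nonparametric density estimator is needed.
    """
    int_term = np.log((1 + alpha) ** -0.5 * (2 * np.pi * sigma ** 2) ** (-alpha / 2)) / (1 + alpha)
    emp_term = np.log(np.mean(normal_pdf(data, mu, sigma) ** alpha)) / alpha
    return int_term - emp_term

def min_pseudodistance_mu(data, sigma=1.0, alpha=0.5):
    """Minimize the criterion over mu by a simple grid search (for clarity)."""
    grid = np.linspace(data.min(), data.max(), 2001)
    vals = [criterion(m, data, sigma, alpha) for m in grid]
    return grid[int(np.argmin(vals))]

rng = np.random.default_rng(0)
clean = rng.normal(5.0, 1.0, 500)
contaminated = np.concatenate([clean, np.full(25, 30.0)])  # 5% gross outliers

print(min_pseudodistance_mu(contaminated))  # stays near the true mean 5
print(contaminated.mean())                  # pulled toward the outliers
```

For μ near the bulk of the data, the outliers at 30 contribute essentially nothing to the empirical mean of \(p^\alpha\), which is why the estimator resists the contamination that distorts the sample mean; as \(\alpha\downarrow 0\) the criterion approaches the (negative) log-likelihood, recovering maximum likelihood.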
Cites work
- scientific article; zbMATH DE number 2221907
- A New Test Procedure of Independence in Copula Models via χ2-Divergence
- Asymptotic Statistics
- Dual divergence estimators and tests: robustness results
- New estimates and tests of independence in some copula models
- On Divergences and Informations in Statistics and Information Theory
- On distributional properties and goodness-of-fit tests for generalized measures of divergence
- On empirical likelihood for semiparametric two-sample density ratio models
- Parametric estimation and tests through divergences and the duality technique
- Robust Statistics
- Robust and efficient estimation by minimising a density power divergence
- Robust tests based on dual divergence estimators and saddlepoint approximations
- Several applications of divergence criteria in continuous families
Cited in (21)
- Model selection for independent not identically distributed observations based on Rényi's pseudodistances
- Pseudo-\(R^2\) statistics under complex sampling
- Influence analysis of robust Wald-type tests
- Some universal insights on divergences for statistics, machine learning and artificial intelligence
- Testing composite hypothesis based on the density power divergence
- Statistical inference based on bridge divergences
- Robust Wald-type tests based on minimum Rényi pseudodistance estimators for the multiple linear regression model
- The logarithmic super divergence and asymptotic inference properties
- Estimation of entropy-type integral functionals
- Robust variable selection for finite mixture regression models
- Existence, consistency and computer simulation for selected variants of minimum distance estimators
- Robust statistical inference based on the \(C\)-divergence family
- Projection theorems and estimating equations for power-law models
- A unified approach to the Pythagorean identity and projection theorem for a class of divergences based on M-estimations
- Several applications of divergence criteria in continuous families
- Robustness of dual divergence estimators for models satisfying linear constraints
- Selecting an adaptive sequence for computing recursive M-estimators in multivariate linear regression models
- Minimum Rényi pseudodistance estimators for logistic regression models
- Robust mean variance optimization problem under Rényi divergence information
- Robust approach for comparing two dependent normal populations through Wald-type tests based on Rényi's pseudodistance estimators
- On robustness of model selection criteria based on divergence measures: Generalizations of BHHJ divergence-based method and comparison