Statistical inference based on bridge divergences
From MaRDI portal
Publication:2000745
Abstract: M-estimators offer simple robust alternatives to the maximum likelihood estimator. Much of the robustness literature, however, has focused on the problems of location, location-scale and regression estimation rather than on estimation of general parameters. The density power divergence (DPD) and the logarithmic density power divergence (LDPD) measures provide two classes of competitive M-estimators (obtained from divergences) in general parametric models, each containing the MLE as a special case. In each of these families, the robustness of the estimator is achieved through a density power down-weighting of outlying observations. Both families have proved to be very useful tools in the area of robust inference. However, the relation and hierarchy between the minimum distance estimators of the two families are yet to be comprehensively studied or fully established. Given a particular set of real data, how does one choose an optimal member from the union of these two classes of divergences? In this paper, we present a generalized family of divergences incorporating the above two classes; this family provides a smooth bridge between the DPD and the LDPD measures. This family helps to clarify and settle several longstanding issues in the relation between the important families of DPD and LDPD, apart from being an important tool in different areas of statistical inference in its own right.
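The minimum DPD estimator discussed in the abstract can be sketched concretely. For a model density \(f_\theta\) and tuning parameter \(\alpha > 0\), the DPD objective is \(\int f_\theta^{1+\alpha}\,dx - (1 + 1/\alpha)\,n^{-1}\sum_i f_\theta(X_i)^\alpha\) (up to a constant), with the MLE recovered as \(\alpha \to 0\). The sketch below, a minimal illustration and not the authors' implementation, fits a normal location-scale model under 5% contamination; the function name `dpd_objective` and the choice \(\alpha = 0.5\) are ours, and the closed-form integral term uses the normal-model identity \(\int f_\theta^{1+\alpha}dx = (2\pi\sigma^2)^{-\alpha/2}(1+\alpha)^{-1/2}\).

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, x, alpha):
    """Empirical DPD objective for a normal location-scale model."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # parametrize on log-scale to keep sigma > 0
    # Closed-form integral of f_theta^{1+alpha} for the normal density
    int_term = (2 * np.pi * sigma**2) ** (-alpha / 2) / np.sqrt(1 + alpha)
    f = norm.pdf(x, loc=mu, scale=sigma)
    # Outliers have small f(x)^alpha, so they are smoothly down-weighted
    return int_term - (1 + 1 / alpha) * np.mean(f**alpha)

rng = np.random.default_rng(0)
# 95 observations from N(0,1) plus 5 gross outliers near 10
x = np.concatenate([rng.normal(0.0, 1.0, 95), rng.normal(10.0, 1.0, 5)])

res = minimize(dpd_objective, x0=[np.median(x), 0.0], args=(x, 0.5))
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
```

With a moderate \(\alpha\), the fitted location and scale stay close to those of the clean component, whereas the MLE (sample mean and standard deviation) would be pulled noticeably toward the outliers; the trade-off between robustness (large \(\alpha\)) and efficiency (small \(\alpha\)) is exactly the hierarchy question the bridge family addresses.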
Recommendations
- A generalized divergence for statistical inference
- Robust statistical inference based on the \(C\)-divergence family
- Robust and efficient estimation by minimising a density power divergence
- Robust inference using the exponential-polynomial divergence
- Testing statistical hypotheses based on the density power divergence
Cites work
- scientific article; zbMATH DE number 1220667 (title unavailable)
- scientific article; zbMATH DE number 795297 (title unavailable)
- scientific article; zbMATH DE number 847272 (title unavailable)
- A comparison of related density-based minimum divergence estimators
- A universally consistent modification of maximum likelihood
- Affine invariant divergences associated with proper composite scoring rules and their applications
- Asymptotic Statistics
- Choosing a robustness tuning parameter
- Decomposable pseudodistances and applications in statistical estimation
- Efficiency versus robustness: The case for minimum Hellinger distance and related methods
- Inference based on adaptive grid selection of probability transforms
- Inference for multivariate normal mixtures
- Mathematical statistics. Basic ideas and selected topics. Volume I
- Minimum disparity estimation for continuous models: Efficiency, distributions and robustness
- Normalized estimating equation for robust parameter estimation
- Robust and efficient estimation by minimising a density power divergence
- Robust parameter estimation with a small bias against heavy contamination
Cited in (4)
- A unified approach to the Pythagorean identity and projection theorem for a class of divergences based on M-estimations
- A generalized divergence for statistical inference
- Robust density power divergence estimates for panel data models
- scientific article; zbMATH DE number 2221907 (title unavailable)
This page was built for publication: Statistical inference based on bridge divergences