Statistical inference based on bridge divergences
Publication:2000745
DOI: 10.1007/S10463-018-0665-X · zbMATH Open: 1421.62031 · arXiv: 1706.05745 · OpenAlex: W2964336325 · Wikidata: Q129794659 · Scholia: Q129794659 · MaRDI QID: Q2000745 · FDO: Q2000745
Arun Kumar Kuchibhotla, Somabha Mukherjee, Ayanendranath Basu
Publication date: 28 June 2019
Published in: Annals of the Institute of Statistical Mathematics
Abstract: M-estimators offer simple robust alternatives to the maximum likelihood estimator. Much of the robustness literature, however, has focused on the problems of location, location-scale and regression estimation rather than on estimation of general parameters. The density power divergence (DPD) and the logarithmic density power divergence (LDPD) measures provide two classes of competitive M-estimators (obtained from divergences) in general parametric models which contain the MLE as a special case. In each of these families, the robustness of the estimator is achieved through a density power down-weighting of outlying observations. Both the families have proved to be very useful tools in the area of robust inference. However, the relation and hierarchy between the minimum distance estimators of the two families are yet to be comprehensively studied or fully established. Given a particular set of real data, how does one choose an optimal member from the union of these two classes of divergences? In this paper, we present a generalized family of divergences incorporating the above two classes; this family provides a smooth bridge between the DPD and the LDPD measures. This family helps to clarify and settle several longstanding issues in the relation between the important families of DPD and LDPD, apart from being an important tool in different areas of statistical inference in its own right.
Full work available at URL: https://arxiv.org/abs/1706.05745
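To make the density power down-weighting mentioned in the abstract concrete, the sketch below illustrates minimum DPD estimation for a univariate normal model on contaminated data. This is an illustrative Python sketch, not code from the paper; the bridge family itself, which interpolates between the DPD and LDPD objectives, is defined in the full text and not reproduced here, and the names `dpd_objective` and `fit_mdpde` are placeholders.

```python
# Minimal illustrative sketch (not the authors' code): minimum density power
# divergence (DPD) estimation for a univariate normal model, contrasted with
# the MLE on contaminated data.
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def dpd_objective(params, x, alpha):
    """Empirical DPD objective for N(mu, sigma^2) with tuning parameter alpha > 0:
    integral of f^(1+alpha) dx  -  (1 + 1/alpha) * mean(f(X_i)^alpha)."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)  # optimize log(sigma) to keep sigma positive
    f = norm.pdf(x, loc=mu, scale=sigma)
    # Closed form of the integral term for the normal density.
    integral = (1.0 + alpha) ** -0.5 * (2.0 * np.pi * sigma ** 2) ** (-alpha / 2.0)
    return integral - (1.0 + 1.0 / alpha) * np.mean(f ** alpha)

def fit_mdpde(x, alpha=0.5):
    """Minimum DPD estimate of (mu, sigma); larger alpha gives more down-weighting."""
    start = np.array([np.median(x), np.log(x.std())])
    res = minimize(dpd_objective, start, args=(x, alpha), method="Nelder-Mead")
    return res.x[0], float(np.exp(res.x[1]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = np.concatenate([rng.normal(0.0, 1.0, 190),   # bulk of the data
                        rng.normal(10.0, 1.0, 10)])  # 5% outliers
    print("MLE (mean, sd):        ", x.mean(), x.std())
    print("Minimum DPD, alpha=0.5:", fit_mdpde(x, alpha=0.5))
```

Letting alpha tend to 0 recovers the maximum likelihood objective (up to constants), consistent with the abstract's remark that these divergence families contain the MLE as a special case.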
Cites Work
- Asymptotic Statistics
- Efficiency versus robustness: The case for minimum Hellinger distance and related methods
- Inference for multivariate normal mixtures
- Minimum disparity estimation for continuous models: Efficiency, distributions and robustness
- A comparison of related density-based minimum divergence estimators
- Robust and efficient estimation by minimising a density power divergence
- Robust parameter estimation with a small bias against heavy contamination
- Decomposable pseudodistances and applications in statistical estimation
- Choosing a robustness tuning parameter
- A universally consistent modification of maximum likelihood
- Affine invariant divergences associated with proper composite scoring rules and their applications
- Normalized estimating equation for robust parameter estimation
- Inference based on adaptive grid selection of probability transforms
Cited In (3)