The minimum \(S\)-divergence estimator under continuous models: the Basu-Lindsay approach
DOI: 10.1007/s00362-015-0701-3
zbMath: 1371.62026
arXiv: 1408.1239
OpenAlex: W1786325621
MaRDI QID: Q2359161
Ayanendranath Basu, Abhik Ghosh
Publication date: 27 June 2017
Published in: Statistical Papers
Full work available at URL: https://arxiv.org/abs/1408.1239
MSC classification:
- Asymptotic properties of parametric estimators (62F12)
- Robustness and adaptive procedures (parametric inference) (62F35)
Related Items (5)
- Test for parameter change in the presence of outliers: the density power divergence-based approach
- On the consistency and the robustness in model selection criteria
- Robust test for structural instability in dynamic factor models
- Improvements in the small sample efficiency of the minimum S-divergence estimators under discrete models
- Influence function analysis of the restricted minimum divergence estimators: a general form
Cites Work
- Robust Bayes estimation using the density power divergence
- The power divergence and the density power divergence families: the mathematical connection
- Robust estimation for independent non-homogeneous observations using density power divergence with applications to linear regression
- Consistency of the kernel density estimator: a survey
- Selection of tuning parameters in bridge regression models via Bayesian information criterion
- Entropy differential metric, distance and divergence measures in probability spaces: A unified approach
- Dual divergence estimators and tests: robustness results
- Divergence-based estimation and testing with misclassified data
- Minimum \(\phi\)-divergence estimator in logistic regression models
- Preliminary phi-divergence test estimators for linear restrictions in a logistic regression model
- Minimum phi-divergence estimators for loglinear models with linear constraints and multinomial sampling
- Consistency of minimizing a penalized density power divergence estimator for mixing distribution
- Do robust estimators work with real data?
- Minimum Hellinger distance estimates for parametric models
- Robust asymptotic statistics
- Efficiency versus robustness: The case for minimum Hellinger distance and related methods
- Asymptotic behaviour and statistical applications of divergence measures in multinomial populations: A unified study
- The ``automatic'' robustness of minimum distance functionals
- Minimum disparity estimation for continuous models: Efficiency, distributions and robustness
- On the robustness of a divergence based test of simple statistical hypotheses
- A generalized divergence for statistical inference
- A comparison of related density-based minimum divergence estimators
- Robust Blind Source Separation by Beta Divergence
- Minimum Hellinger Distance Estimation for Multivariate Location and Covariance
- Minimum Hellinger Distance Estimation for the Analysis of Count Data
- Robust and efficient estimation by minimising a density power divergence
- Two approaches to grouping of data and related disparity statistics
- Robust estimation for non-homogeneous data and the selection of the optimal tuning parameter: the density power divergence approach
- Robust sparse regression and tuning parameter selection via the efficient bootstrap information criteria
- Choosing a robustness tuning parameter
- On Information and Sufficiency
- Statistical Inference