The extended Bregman divergence and parametric estimation
From MaRDI portal
Publication: 5089933
DOI: 10.1080/02331888.2022.2070622
OpenAlex: W3123595777
MaRDI QID: Q5089933
Sancharee Basak, Ayanendranath Basu
Publication date: 15 July 2022
Published in: Statistics
Full work available at URL: https://arxiv.org/abs/2101.09183
Cites Work
- Asymptotic properties of minimum \(S\)-divergence estimator for discrete models
- Minimum Hellinger distance estimates for parametric models
- Efficiency versus robustness: The case for minimum Hellinger distance and related methods
- The B-exponential divergence and its generalizations with applications to parametric estimation
- A generalized divergence for statistical inference
- Robust and efficient estimation by minimising a density power divergence
- \(\alpha\)-Divergence Is Unique, Belonging to Both \(f\)-Divergence and Bregman Divergence Classes
- A Characterization of All Single-Integral, Non-Kernel Divergence Estimators
- Choosing a robustness tuning parameter
- Robust Statistics
- On the ‘optimal’ density power divergence tuning parameter
- Statistical Inference
- Robust Statistics
This page was built for publication: The extended Bregman divergence and parametric estimation