Hellinger distance and Kullback-Leibler loss for the kernel density estimator
DOI: 10.1016/0167-7152(93)90022-B · zbMath: 0789.62029 · MaRDI QID: Q1314730
Publication date: 19 June 1994
Published in: Statistics & Probability Letters
Keywords: histogram; kernel density estimator; Akaike's information criterion; compactly supported densities; likelihood cross-validation; optimal window width; expected Kullback-Leibler loss; mean Hellinger distance
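Two of the keywords above can be made concrete with a short numerical sketch: choosing the window width by likelihood (leave-one-out) cross-validation, and measuring the resulting fit by the Hellinger distance. This is an illustrative example assuming a Gaussian kernel and a standard normal target density; the function names are ours, not the paper's.

```python
import numpy as np

def kde(x, data, h):
    """Gaussian kernel density estimate at the points x with bandwidth h."""
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u**2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

def lcv_score(h, data):
    """Leave-one-out log-likelihood (likelihood cross-validation) for bandwidth h."""
    n = len(data)
    u = (data[:, None] - data[None, :]) / h
    K = np.exp(-0.5 * u**2) / np.sqrt(2 * np.pi)
    np.fill_diagonal(K, 0.0)               # leave each point out of its own estimate
    loo = K.sum(axis=1) / ((n - 1) * h)
    return np.mean(np.log(loo))

rng = np.random.default_rng(0)
data = rng.normal(size=200)                # sample from the (assumed) true density N(0,1)

# Pick the window width maximizing the cross-validated log-likelihood.
hs = np.linspace(0.05, 1.5, 60)
h_star = hs[np.argmax([lcv_score(h, data) for h in hs])]

# Squared Hellinger distance between the KDE and the true N(0,1) density,
# approximated on a grid: H^2 = 0.5 * integral of (sqrt(f) - sqrt(fhat))^2 dx.
grid = np.linspace(-5.0, 5.0, 1001)
f_true = np.exp(-0.5 * grid**2) / np.sqrt(2 * np.pi)
f_hat = kde(grid, data, h_star)
h2 = 0.5 * np.trapz((np.sqrt(f_true) - np.sqrt(f_hat))**2, grid)
```

The paper itself studies the expected Kullback-Leibler loss and mean Hellinger distance analytically; the sketch above only mimics that setup by Monte Carlo on one sample.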
Related Items
A general and fast convergent bandwidth selection method of kernel estimator, Weighted Hellinger distance as an error criterion for bandwidth selection in kernel estimation, On two recent papers of Y. Kanazawa
Cites Work
- Hellinger distance and Akaike's information criterion for the histogram
- On the relationship between stability of extreme order statistics and convergence of the maximum likelihood kernel density estimate
- Akaike's information criterion and Kullback-Leibler loss for histogram density estimation
- On Kullback-Leibler loss and density estimation
- An optimal variable cell histogram based on the sample spacings
- An optimal variable cell histogram
- On the histogram as a density estimator: L₂ theory
- Monte Carlo Study of Three Data-Based Nonparametric Probability Density Estimators
- On the Choice of Smoothing Parameters for Parzen Estimators of Probability Density Functions
- On Estimation of a Probability Density Function and Mode