Hellinger distance and Kullback-Leibler loss for the kernel density estimator
Publication: 1314730
DOI: 10.1016/0167-7152(93)90022-B
zbMath: 0789.62029
MaRDI QID: Q1314730
Publication date: 19 June 1994
Published in: Statistics & Probability Letters
Keywords: histogram; kernel density estimator; Akaike's information criterion; compactly supported densities; likelihood cross-validation; optimal window width; expected Kullback-Leibler loss; mean Hellinger distance
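The keywords point to likelihood cross-validation as the selector of the window width (bandwidth) studied against the expected Kullback-Leibler loss of the kernel density estimator. Purely as an illustrative sketch, not code from the paper, the following assumes a Gaussian kernel and picks the window width that maximizes the leave-one-out log-likelihood, which is the likelihood cross-validation criterion:

```python
import numpy as np

def loo_log_likelihood(x, h):
    """Leave-one-out log-likelihood of a Gaussian kernel density estimate
    with window width h (generic illustration, not the paper's method)."""
    n = len(x)
    diffs = (x[:, None] - x[None, :]) / h               # pairwise scaled differences
    k = np.exp(-0.5 * diffs**2) / np.sqrt(2 * np.pi)    # Gaussian kernel values
    np.fill_diagonal(k, 0.0)                            # drop each point's own contribution
    f_loo = k.sum(axis=1) / ((n - 1) * h)               # leave-one-out density at each x_i
    return np.log(f_loo).sum()

rng = np.random.default_rng(0)
x = rng.normal(size=200)                                # toy sample
hs = np.linspace(0.05, 1.0, 40)                         # candidate window widths
h_star = max(hs, key=lambda h: loo_log_likelihood(x, h))
print("likelihood cross-validated window width:", round(h_star, 3))
```

Maximizing the leave-one-out log-likelihood amounts, up to an additive constant, to minimizing an empirical estimate of the Kullback-Leibler loss of the kernel density estimate, which is why likelihood cross-validation appears alongside the expected Kullback-Leibler loss in the keyword list.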
Related Items (6)
- β-divergence loss for the kernel density estimation with bias reduced
- On Variance-Stabilizing Multivariate Non Parametric Regression Estimation
- On two recent papers of Y. Kanazawa
- A general and fast convergent bandwidth selection method of kernel estimator
- Weighted Hellinger distance as an error criterion for bandwidth selection in kernel estimation
- \textsc{CaDET}: interpretable parametric conditional density estimation with decision trees and forests
Cites Work
- Hellinger distance and Akaike's information criterion for the histogram
- On the relationship between stability of extreme order statistics and convergence of the maximum likelihood kernel density estimate
- Akaike's information criterion and Kullback-Leibler loss for histogram density estimation
- On Kullback-Leibler loss and density estimation
- An optimal variable cell histogram based on the sample spacings
- An optimal variable cell histogram
- On the histogram as a density estimator: \(L_2\) theory
- Monte Carlo Study of Three Data-Based Nonparametric Probability Density Estimators
- On the Choice of Smoothing Parameters for Parzen Estimators of Probability Density Functions
- On Estimation of a Probability Density Function and Mode