Hellinger distance and Akaike's information criterion for the histogram
Publication: 689552
DOI: 10.1016/0167-7152(93)90205-W
zbMath: 0779.62041
MaRDI QID: Q689552
Publication date: 2 January 1994
Published in: Statistics & Probability Letters
Keywords: Akaike's information criterion; Kullback-Leibler loss; compactly supported densities; mean Hellinger distance; optimal histogram cell width
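The keywords describe selecting an optimal histogram cell width via an AIC-type criterion. As a minimal illustrative sketch (not the paper's exact criterion), one common AIC-style rule maximizes the histogram log-likelihood penalized by the number of free cells; the function name and penalty form below are assumptions for illustration:

```python
import numpy as np

def aic_bin_count(data, max_bins=50):
    """Pick the bin count maximizing an AIC-type criterion.

    Sketch only: uses log-likelihood of the histogram density
    minus (m - 1), one free parameter per extra cell. This is an
    assumed generic form, not the criterion from the paper.
    """
    data = np.asarray(data, dtype=float)
    n = data.size
    lo, hi = data.min(), data.max()
    best_m, best_aic = 1, -np.inf
    for m in range(1, max_bins + 1):
        counts, _ = np.histogram(data, bins=m, range=(lo, hi))
        width = (hi - lo) / m
        nonzero = counts[counts > 0]
        # log-likelihood of the piecewise-constant density estimate
        loglik = np.sum(nonzero * np.log(nonzero / (n * width)))
        aic = loglik - (m - 1)
        if aic > best_aic:
            best_m, best_aic = m, aic
    return best_m
```

For unimodal data the selected bin count typically grows slowly with the sample size, consistent with the optimal-cell-width asymptotics studied in this literature.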
Related Items
- Optimizing Time Histograms for Non-Poissonian Spike Trains
- On Variance-Stabilizing Multivariate Non Parametric Regression Estimation
- On two recent papers of Y. Kanazawa
- Bin width selection in multivariate histograms by the combinatorial method
- Automatic data-based bin width selection for rose diagram
- A Method for Selecting the Bin Size of a Time Histogram
- How many bins should be put in a regular histogram
- A comparison of automatic histogram constructions
- Hellinger distance and Kullback-Leibler loss for the kernel density estimator
Cites Work
- Akaike's information criterion and Kullback-Leibler loss for histogram density estimation
- An optimal variable cell histogram based on the sample spacings
- On equivalence of infinite product measures
- On optimal and data-based histograms
- Akaike's information criterion and the histogram
- An optimal variable cell histogram
- On the histogram as a density estimator: L2 theory
- Decision Rules, Based on the Distance, for Problems of Fit, Two Samples, and Estimation