Akaike's information criterion and Kullback-Leibler loss for histogram density estimation (Q1122259)
From MaRDI portal
Language | Label | Description | Also known as
---|---|---|---
English | Akaike's information criterion and Kullback-Leibler loss for histogram density estimation | scientific article |
Statements
Akaike's information criterion and Kullback-Leibler loss for histogram density estimation (English)
1990
We take a detailed look at Akaike's information criterion (AIC) and Kullback-Leibler cross-validation (KLCV) in the problem of histogram density estimation. Two different definitions of "number of unknown parameters" in AIC are considered. A careful description is given of the influence of density tail properties on the performance of both types of AIC and of KLCV. A number of practical conclusions emerge. In particular, we find that AIC often runs into problems when used with heavy-tailed, unbounded densities, but can perform quite well with compactly supported densities. In the latter case, both types of AIC produce similar results, and those results will sometimes be asymptotically equivalent to the ones obtained from KLCV. However, depending on the shape of the true density, the KLCV method can fail to balance the "bias" and "variance" components of loss, in which case KLCV and AIC produce very different results.
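The abstract contrasts the criteria without stating formulas, so the following Python sketch is an illustration rather than the paper's method: it scores equal-width histograms by AIC, under both definitions of the parameter count (m bins versus m - 1 free cell probabilities), and by leave-one-out Kullback-Leibler cross-validation, then picks a bin count. The function names, the standard histogram log-likelihood and leave-one-out formulas, and the beta-distributed test sample are assumptions made for the example.

```python
import numpy as np

def histogram_aic(x, m, n_params="bins"):
    """AIC for an equal-width histogram with m bins over the data range.

    n_params selects between the two definitions of 'number of unknown
    parameters' discussed in the paper: "bins" counts m, "bins-1"
    counts m - 1 (the cell probabilities sum to one).
    """
    n = len(x)
    counts, edges = np.histogram(x, bins=m)
    h = edges[1] - edges[0]                      # common bin width
    nz = counts[counts > 0]
    loglik = np.sum(nz * np.log(nz / (n * h)))   # log-likelihood of the histogram estimate
    k = m if n_params == "bins" else m - 1
    return -2.0 * loglik + 2.0 * k

def histogram_klcv(x, m):
    """Leave-one-out KL cross-validation score (larger is better):
    sum_i log f_{-i}(x_i), with f_{-i} the histogram built without x_i."""
    n = len(x)
    counts, edges = np.histogram(x, bins=m)
    h = edges[1] - edges[0]
    if np.any(counts == 1):
        return -np.inf  # a singleton bin makes the leave-one-out density zero there
    nz = counts[counts > 0]
    return float(np.sum(nz * np.log((nz - 1) / ((n - 1) * h))))

# Illustrative data: a compactly supported density, the favourable case
# identified in the abstract.
rng = np.random.default_rng(0)
x = rng.beta(2.0, 5.0, size=500)
grid = range(2, 51)
m_aic = min(grid, key=lambda m: histogram_aic(x, m))      # minimize AIC
m_klcv = max(grid, key=lambda m: histogram_klcv(x, m))    # maximize KLCV score
print(m_aic, m_klcv)
```

For a compactly supported sample like this one the two criteria typically select similar bin counts, in line with the asymptotic equivalence noted above; with heavy-tailed data the singleton-bin guard in `histogram_klcv` fires for large m, which is one concrete way the criteria can diverge.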
Keywords: Akaike's information criterion; AIC; Kullback-Leibler cross-validation; histogram density estimation; number of unknown parameters; density tail properties; heavy-tailed unbounded densities; compactly supported densities