Akaike's information criterion and Kullback-Leibler loss for histogram density estimation (Q1122259)

From MaRDI portal

Language: English
Label: Akaike's information criterion and Kullback-Leibler loss for histogram density estimation
Description: scientific article

    Statements

    Akaike's information criterion and Kullback-Leibler loss for histogram density estimation (English)
    1990
    We take a detailed look at Akaike's information criterion (AIC) and Kullback-Leibler cross-validation (KLCV) in the problem of histogram density estimation. Two different definitions of the "number of unknown parameters" in AIC are considered. A careful description is given of the influence of density tail properties on the performance of both types of AIC and of KLCV. A number of practical conclusions emerge. In particular, we find that AIC often runs into problems when used with heavy-tailed unbounded densities, but can perform quite well with compactly supported densities. In the latter case, both types of AIC produce similar results, and those results are sometimes asymptotically equivalent to the ones obtained from KLCV. However, depending on the shape of the true density, the KLCV method can fail to balance the "bias" and "variance" components of loss, so that KLCV and AIC can behave very differently.
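
    A minimal numerical sketch of the two criteria discussed above (an illustration under stated assumptions, not the paper's exact formulation): it assumes equal-width bins on a fixed interval, scores an m-bin histogram by its maximised log-likelihood penalised by either m - 1 or m "unknown parameters" (two natural conventions for the parameter count), and computes a leave-one-out Kullback-Leibler cross-validation score. The KLCV score degenerates to minus infinity whenever an occupied bin holds a single observation, which is one simple way heavy tails can break it. The function name histogram_scores and the beta-distributed test data are hypothetical choices for this sketch; only numpy is required.

        import numpy as np

        def histogram_scores(x, m, a=None, b=None):
            """AIC and leave-one-out KL cross-validation scores for an
            m-bin equal-width histogram on [a, b]; larger is better."""
            x = np.asarray(x, dtype=float)
            n = x.size
            a = x.min() if a is None else a
            b = x.max() if b is None else b
            h = (b - a) / m                        # common bin width
            counts, _ = np.histogram(x, bins=m, range=(a, b))
            nz = counts[counts > 0]                # only occupied bins contribute

            # Maximised log-likelihood of the estimate f(x) = N_j / (n h) on bin j.
            loglik = np.sum(nz * np.log(nz / (n * h)))

            # Two conventions for the "number of unknown parameters":
            # m - 1 free cell probabilities, or all m cells.
            aic1 = loglik - (m - 1)
            aic2 = loglik - m

            # Leave-one-out KLCV: deleting x_i lowers its bin count by one, so
            # the score is -inf whenever some occupied bin holds a single point.
            if np.any(nz == 1):
                klcv = -np.inf
            else:
                klcv = np.sum(nz * np.log((nz - 1) / ((n - 1) * h)))
            return {"AIC (m-1)": aic1, "AIC (m)": aic2, "KLCV": klcv}

        rng = np.random.default_rng(0)
        x = rng.beta(2, 5, size=500)               # compactly supported density
        for crit in ("AIC (m-1)", "AIC (m)", "KLCV"):
            best_m = max(range(2, 40), key=lambda m: histogram_scores(x, m)[crit])
            print(crit, "-> best bin count:", best_m)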
    Akaike's information criterion
    AIC
    Kullback-Leibler cross-validation
    histogram density estimation
    number of unknown parameters
    density tail properties
    heavy-tailed unbounded densities
    compactly supported densities
