Cross-validated density estimates based on Kullback–Leibler information
From MaRDI portal
Publication:4831088
DOI: 10.1080/10485250310001644583
zbMath: 1076.62036
OpenAlex: W2061559341
MaRDI QID: Q4831088
Alain F. Berlinet, Élodie Brunel
Publication date: 20 December 2004
Published in: Journal of Nonparametric Statistics
Full work available at URL: https://doi.org/10.1080/10485250310001644583
Keywords: Automatic Smoothing Parameter; Barron Density Estimates; Binned Data; Cross-validation Techniques; Functional Estimation; Histograms; Kullback–Leibler Divergence
Density estimation (62G07) Asymptotic properties of nonparametric inference (62G20) Strong limit theorems (60F15) Statistical aspects of information-theoretic topics (62B10)
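The keywords above refer to selecting a smoothing parameter by cross-validation under Kullback–Leibler loss. As a minimal illustrative sketch of that general idea (likelihood cross-validation with a Gaussian kernel, not the authors' Barron-type estimator for binned data; the function name, grid, and simulated sample are assumptions):

```python
import numpy as np

def loo_log_likelihood(x, h):
    """Leave-one-out log-likelihood of a Gaussian kernel density
    estimate with bandwidth h (illustrative helper, not from the paper)."""
    n = len(x)
    d = x[:, None] - x[None, :]                      # pairwise differences
    k = np.exp(-0.5 * (d / h) ** 2) / (h * np.sqrt(2 * np.pi))
    np.fill_diagonal(k, 0.0)                         # drop each point's own contribution
    f_loo = k.sum(axis=1) / (n - 1)                  # leave-one-out density at each x_i
    return np.log(f_loo).sum()

# Pick the bandwidth maximizing the cross-validated likelihood,
# i.e. minimizing an estimate of the Kullback-Leibler loss.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
grid = np.linspace(0.1, 1.0, 37)
scores = [loo_log_likelihood(x, h) for h in grid]
h_cv = grid[int(np.argmax(scores))]
print(h_cv)
```

Maximizing the leave-one-out log-likelihood is equivalent, up to a constant, to minimizing an empirical Kullback–Leibler divergence between the true density and the estimate, which is the loss function studied in this line of work.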
Cites Work
- An asymptotically efficient solution to the bandwidth problem of kernel density estimation
- A comparison of cross-validation techniques in density estimation
- On the use of compactly supported density estimates in problems of discrimination
- Akaike's information criterion and Kullback-Leibler loss for histogram density estimation
- On Kullback-Leibler loss and density estimation
- Consistent cross-validated density estimation
- A comparative study of some kernel-based nonparametric density estimators
- An optimal selection of regression variables
- Distribution estimation consistent in total variation and in two types of information divergence
- On the Choice of Smoothing Parameters for Parzen Estimators of Probability Density Functions
- Distribution Estimates Consistent in χ²-Divergence
- On the asymptotic normality of the L1‐ and L2‐errors in histogram density estimation
- About the asymptotic accuracy of Barron density estimates
- An inequality involving multinomial probabilities