A Characterization of All Single-Integral, Non-Kernel Divergence Estimators
Publication: 5211552
DOI: 10.1109/TIT.2019.2937527
zbMATH Open: 1433.62104
OpenAlex: W2970713469
MaRDI QID: Q5211552
FDO: Q5211552
Authors: Soham Jana, Ayanendranath Basu
Publication date: 28 January 2020
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/tit.2019.2937527
Recommendations
- Nonparametric estimation of Kullback-Leibler divergence
- scientific article; zbMATH DE number 5959348
- Kernel estimators and the Dvoretzky–Kiefer–Wolfowitz inequality
- Characterizing variation of nonparametric random probability measures using the Kullback–Leibler divergence
- A generalization of the Kullback-Leibler divergence and its properties
- A note on the universal consistency of the kernel distribution function estimator
- Optimal convergence properties of kernel density estimators without differentiability conditions
- On deviations between kernel-type estimators of a distribution density in \(p\geq2\) independent samples
Cited In (4)