Estimation of KL Divergence: Optimal Minimax Rate
From MaRDI portal
Publication: 4569211
DOI: 10.1109/TIT.2018.2805844
zbMATH Open: 1390.94614
arXiv: 1607.02653
OpenAlex: W2963305780
MaRDI QID: Q4569211
FDO: Q4569211
Authors: Yuheng Bu, Shaofeng Zou, Yingbin Liang, Venugopal V. Veeravalli
Publication date: 27 June 2018
Published in: IEEE Transactions on Information Theory
Abstract: The problem of estimating the Kullback-Leibler divergence between two unknown distributions \(P\) and \(Q\) is studied, under the assumption that the alphabet size \(k\) of the distributions can scale to infinity. The estimation is based on \(m\) independent samples drawn from \(P\) and \(n\) independent samples drawn from \(Q\). It is first shown that there does not exist any consistent estimator that guarantees asymptotically small worst-case quadratic risk over the set of all pairs of distributions. A restricted set that contains pairs of distributions, with density ratio bounded by a function \(f(k)\), is further considered. An augmented plug-in estimator is proposed, and its worst-case quadratic risk is shown to be within a constant factor of \(\left(\frac{k}{m}+\frac{kf(k)}{n}\right)^2+\frac{\log^2 f(k)}{m}+\frac{f(k)}{n}\), if \(m\) and \(n\) exceed a constant factor of \(k\) and \(kf(k)\), respectively. Moreover, the minimax quadratic risk is characterized to be within a constant factor of \(\left(\frac{k}{m\log k}+\frac{kf(k)}{n\log k}\right)^2+\frac{\log^2 f(k)}{m}+\frac{f(k)}{n}\), if \(m\) and \(n\) exceed a constant factor of \(\frac{k}{\log k}\) and \(\frac{kf(k)}{\log k}\), respectively. The lower bound on the minimax quadratic risk is established by employing a generalized Le Cam's method. A minimax optimal estimator is then constructed by employing both the polynomial approximation and the plug-in approaches.
Full work available at URL: https://arxiv.org/abs/1607.02653
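The augmented plug-in estimator described in the abstract can be sketched as follows: plug the empirical distribution of \(P\) against an add-one-smoothed estimate of \(Q\), so that symbols unseen among the samples from \(Q\) do not produce a division by zero. This is a minimal illustrative sketch assuming that form of augmentation; the function name and variable names are our own, not from the paper.

```python
import numpy as np

def augmented_plugin_kl(x_samples, y_samples, k):
    """Sketch of an augmented plug-in estimator of D(P||Q).

    x_samples: m i.i.d. draws from P, coded as integers in {0, ..., k-1}.
    y_samples: n i.i.d. draws from Q, coded the same way.
    k: alphabet size.
    """
    m = len(x_samples)
    n = len(y_samples)
    # Empirical distribution of P from the m samples.
    p_hat = np.bincount(x_samples, minlength=k) / m
    # Augmented (add-one smoothed) estimate of Q: every symbol gets a
    # pseudo-count of 1, so q_tilde is strictly positive on the alphabet.
    q_tilde = (np.bincount(y_samples, minlength=k) + 1) / (n + k)
    # Plug into D(P||Q) = sum_i p_i log(p_i / q_i), with 0 log 0 := 0.
    mask = p_hat > 0
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / q_tilde[mask])))
```

For example, with ten samples of symbol 0 from both \(P\) and \(Q\) and \(k=2\), the smoothing gives \(\tilde q = (11/12, 1/12)\) and the estimate is \(\log(12/11)\), illustrating the estimator's bias even when the two sample sets agree.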
Recommendations
- Minimax Optimal Estimation of KL Divergence for Continuous Distributions
- Minimization of the Kullback-Leibler divergence over a log-normal exponential arc
- Minimum \(f\)-divergence estimators and quasi-likelihood functions
- Optimal rates and constants in \(L_2\)-minimax estimation of probability density functions
- Parameter estimation based on cumulative Kullback-Leibler divergence
- Nonparametric estimation of Kullback-Leibler divergence
- Minimum \(K_\phi\)-divergence estimator.
- Optimal robust estimates using the Kullback-Leibler divergence
Cited In (5)
- Minimax Optimal Estimation of KL Divergence for Continuous Distributions
- Optimal rates of entropy estimation over Lipschitz balls
- Optimal Estimation of Wasserstein Distance on a Tree With an Application to Microbiome Studies
- Comparing and Weighting Imperfect Models Using D-Probabilities
- Approximate profile maximum likelihood