Robust Kullback-Leibler Divergence and Universal Hypothesis Testing for Continuous Distributions
Publication: 5223937
DOI: 10.1109/TIT.2018.2879057
zbMATH Open: 1432.62133
arXiv: 1711.04238
OpenAlex: W2962952126
Wikidata: Q128953017 (Scholia: Q128953017)
MaRDI QID: Q5223937
FDO: Q5223937
Authors: Biao Chen, Peng-fei Yang
Publication date: 19 July 2019
Published in: IEEE Transactions on Information Theory
Abstract: Universal hypothesis testing refers to the problem of deciding whether samples come from a nominal distribution or from an unknown distribution different from the nominal one. Hoeffding's test, whose test statistic is equivalent to the empirical Kullback-Leibler divergence (KLD), is known to be asymptotically optimal for distributions defined on finite alphabets. With continuous observations, however, the discontinuity of the KLD in the distribution function results in significant complications for universal hypothesis testing. This paper introduces a robust version of the classical KLD, defined as the KLD from a distribution to the Lévy ball of a known distribution. This robust KLD is shown to be continuous in the underlying distribution function with respect to weak convergence. The continuity property enables the development of a universal hypothesis test for continuous observations that is asymptotically optimal for continuous distributions in the same sense as Hoeffding's test is for discrete distributions.
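A plausible formalization of the robust KLD described in the abstract, with notation ($B_\epsilon$, $\epsilon$) assumed here rather than taken from this page: writing $B_\epsilon(P_0)$ for the Lévy ball of radius $\epsilon$ around the nominal distribution $P_0$, the robust KLD of a distribution $Q$ would be

$$ D\big(Q \,\|\, B_\epsilon(P_0)\big) \;=\; \inf_{P \in B_\epsilon(P_0)} D(Q \,\|\, P). $$

For the finite-alphabet setting the abstract contrasts against, a minimal sketch of Hoeffding's test, whose statistic is the empirical KLD between the empirical distribution and the nominal one, might look as follows. The function names and the threshold argument are illustrative; this is the classical discrete test, not the paper's continuous-case construction.

import numpy as np

def empirical_kld(samples, p0, alphabet):
    """D(P_hat || P0): KL divergence of the empirical distribution from the nominal one."""
    samples = np.asarray(samples)
    # Empirical frequency of each symbol in the alphabet
    p_hat = np.array([np.mean(samples == a) for a in alphabet], dtype=float)
    p0 = np.asarray(p0, dtype=float)
    mask = p_hat > 0  # convention: 0 * log(0/x) = 0
    return float(np.sum(p_hat[mask] * np.log(p_hat[mask] / p0[mask])))

def hoeffding_test(samples, p0, alphabet, threshold):
    """Reject the nominal hypothesis when the empirical KLD exceeds the threshold."""
    return empirical_kld(samples, p0, alphabet) > threshold

# Example: nominal fair coin versus a clearly biased sample
rng = np.random.default_rng(0)
x = rng.choice([0, 1], size=200, p=[0.2, 0.8])
print(hoeffding_test(x, [0.5, 0.5], [0, 1], threshold=0.05))  # True

The paper's contribution is precisely that this statistic does not extend directly to continuous observations, which is what motivates the continuous, Lévy-ball-based robust KLD above.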
Full work available at URL: https://arxiv.org/abs/1711.04238
Cited in 2 documents.