Entropy and the Law of Small Numbers

From MaRDI portal
Publication:3547166

DOI: 10.1109/TIT.2004.840861 · zbMATH Open: 1297.94016 · arXiv: math/0211020 · OpenAlex: W3106311672 · Wikidata: Q60522099 · Scholia: Q60522099 · MaRDI QID: Q3547166 · FDO: Q3547166

Ioannis Kontoyiannis, Peter Harremoës, Oliver Johnson

Publication date: 21 December 2008

Published in: IEEE Transactions on Information Theory

Abstract: Two new information-theoretic methods are introduced for establishing Poisson approximation inequalities. First, using only elementary information-theoretic techniques it is shown that, when $S_n=\sum_{i=1}^n X_i$ is the sum of the (possibly dependent) binary random variables $X_1,X_2,\ldots,X_n$, with $E(X_i)=p_i$ and $E(S_n)=\lambda$, then \[ D(P_{S_n}\|\mathrm{Po}(\lambda)) \leq \sum_{i=1}^n p_i^2 + \Big[\sum_{i=1}^n H(X_i) - H(X_1,X_2,\ldots,X_n)\Big], \] where $D(P_{S_n}\|\mathrm{Po}(\lambda))$ is the relative entropy between the distribution of $S_n$ and the Poisson($\lambda$) distribution. The first term in this bound measures the individual smallness of the $X_i$ and the second term measures their dependence. A general method is outlined for obtaining corresponding bounds when approximating the distribution of a sum of general discrete random variables by an infinitely divisible distribution. Second, in the particular case when the $X_i$ are independent, the following sharper bound is established: \[ D(P_{S_n}\|\mathrm{Po}(\lambda)) \leq \frac{1}{\lambda}\sum_{i=1}^n \frac{p_i^3}{1-p_i}, \] and it is also generalized to the case when the $X_i$ are general integer-valued random variables. Its proof is based on the derivation of a subadditivity property for a new discrete version of the Fisher information, and uses a recent logarithmic Sobolev inequality for the Poisson distribution.
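As an illustration, the independent-case bound can be checked numerically: the sketch below (an assumption-laden example, not code from the paper) builds the exact distribution of a sum of independent Bernoulli($p_i$) variables by convolution, computes its relative entropy to the Poisson($\lambda$) distribution with $\lambda = \sum_i p_i$, and compares against $\frac{1}{\lambda}\sum_i p_i^3/(1-p_i)$. The function names and the sample values of $p_i$ are illustrative choices, not from the paper.

```python
import math

def bernoulli_sum_pmf(ps):
    """Exact pmf of S_n = sum of independent Bernoulli(p_i), by convolution."""
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1 - p)      # X_i = 0
            new[k + 1] += q * p        # X_i = 1
        pmf = new
    return pmf

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def relative_entropy(ps):
    """D(P_{S_n} || Po(lambda)) in nats, with lambda = sum of p_i."""
    lam = sum(ps)
    pmf = bernoulli_sum_pmf(ps)
    return sum(q * math.log(q / poisson_pmf(lam, k))
               for k, q in enumerate(pmf) if q > 0)

# Illustrative parameters (not from the paper).
ps = [0.1, 0.05, 0.2, 0.15]
lam = sum(ps)
d = relative_entropy(ps)
bound = sum(p ** 3 / (1 - p) for p in ps) / lam
print("D =", d, " bound =", bound, " bound holds:", d <= bound)
```

Since the $X_i$ here are independent, the theorem guarantees that the printed relative entropy does not exceed the bound.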


Full work available at URL: https://arxiv.org/abs/math/0211020






Cited In (20)







