Entropy and the Law of Small Numbers

From MaRDI portal
Publication:3547166




Abstract: Two new information-theoretic methods are introduced for establishing Poisson approximation inequalities. First, using only elementary information-theoretic techniques, it is shown that when \(S_n = \sum_{i=1}^n X_i\) is the sum of the (possibly dependent) binary random variables \(X_1, X_2, \ldots, X_n\), with \(E(X_i) = p_i\) and \(E(S_n) = \lambda\), then
\[
D(P_{S_n} \,\|\, \mathrm{Po}(\lambda)) \leq \sum_{i=1}^n p_i^2 + \Big[\sum_{i=1}^n H(X_i) - H(X_1, X_2, \ldots, X_n)\Big],
\]
where \(D(P_{S_n} \,\|\, \mathrm{Po}(\lambda))\) is the relative entropy between the distribution of \(S_n\) and the Poisson(\(\lambda\)) distribution. The first term in this bound measures the individual smallness of the \(X_i\), and the second term measures their dependence. A general method is outlined for obtaining corresponding bounds when approximating the distribution of a sum of general discrete random variables by an infinitely divisible distribution. Second, in the particular case when the \(X_i\) are independent, the following sharper bound is established:
\[
D(P_{S_n} \,\|\, \mathrm{Po}(\lambda)) \leq \frac{1}{\lambda} \sum_{i=1}^n \frac{p_i^3}{1-p_i},
\]
and it is also generalized to the case when the \(X_i\) are general integer-valued random variables. Its proof is based on the derivation of a subadditivity property for a new discrete version of the Fisher information, and uses a recent logarithmic Sobolev inequality for the Poisson distribution.
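As an illustrative check (not part of the paper): when the \(X_i\) are i.i.d. Bernoulli(\(p\)), the sum \(S_n\) is Binomial(\(n, p\)), \(\lambda = np\), and the sharper bound reduces to \(p^2/(1-p)\). The Python sketch below computes the relative entropy \(D(\mathrm{Bin}(n,p) \,\|\, \mathrm{Po}(np))\) directly (the binomial has no mass beyond \(n\), so the sum over \(k = 0, \ldots, n\) is exact) and verifies it against the bound; the function name and parameter values are illustrative choices, not from the source.

```python
import math

def kl_binomial_poisson(n, p):
    """D(Bin(n,p) || Po(np)) in nats.

    Terms with zero binomial mass contribute nothing, and the
    binomial is supported on 0..n, so this finite sum is exact.
    """
    lam = n * p
    d = 0.0
    for k in range(n + 1):
        b = math.comb(n, k) * p**k * (1 - p)**(n - k)  # Bin(n,p) pmf
        po = math.exp(-lam) * lam**k / math.factorial(k)  # Po(lam) pmf
        if b > 0:
            d += b * math.log(b / po)
    return d

# Example parameters (illustrative): n = 50 Bernoulli(0.05) summands.
n, p = 50, 0.05
lam = n * p
kl = kl_binomial_poisson(n, p)
# The paper's independent-case bound: (1/lam) * n * p^3/(1-p) = p^2/(1-p).
bound = p**2 / (1 - p)
assert 0 <= kl <= bound
```

Because the relative entropy grows like \(p^2\) while the naive term-by-term comparison of the two pmfs gives only a bound of order \(np^2\), the sharper inequality captures the cancellation that makes Poisson approximation accurate even for moderately large \(\lambda\).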




Cited in 25 documents






