Compound Poisson approximation via information functionals

From MaRDI portal

Abstract: An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments of Gaussian and Poisson approximation. Let \(P_{S_n}\) be the distribution of a sum \(S_n = \sum_{i=1}^n Y_i\) of independent integer-valued random variables \(Y_i\). Nonasymptotic bounds are derived for the distance between \(P_{S_n}\) and an appropriately chosen compound Poisson law. In the case where all \(Y_i\) have the same conditional distribution given \(Y_i \neq 0\), a bound on the relative entropy distance between \(P_{S_n}\) and the compound Poisson distribution is derived, based on the data-processing property of relative entropy and earlier Poisson approximation results. When the \(Y_i\) have arbitrary distributions, corresponding bounds are derived in terms of the total variation distance. The main technical ingredient is the introduction of two "information functionals," together with an analysis of their properties. These information functionals play a role analogous to that of the classical Fisher information in normal approximation. Detailed comparisons are made between the resulting inequalities and related bounds.
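The setup in the abstract can be illustrated numerically. The sketch below (an illustration assuming a concrete example, not taken from the paper) takes \(Y_i = B_i X_i\) with \(B_i\) Bernoulli(\(p\)) and \(X_i\) supported on \(\{1,2\}\), computes the exact law of \(S_n\) by convolution, builds the approximating compound Poisson law CP\((\lambda, Q)\) with \(\lambda = np\) and \(Q\) the common conditional distribution of \(Y_i\) given \(Y_i \neq 0\), and evaluates the total variation distance between the two. The function names and parameter values are illustrative choices.

```python
import math
import numpy as np

def compound_poisson_pmf(lam, q, support):
    """pmf of CP(lam, Q) on {0, ..., support}: a Poisson(lam) mixture
    of the convolution powers Q^{*k} of the jump distribution q."""
    pmf = np.zeros(support + 1)
    qk = np.array([1.0])  # Q^{*0} is a point mass at 0
    k = 0
    while True:
        w = math.exp(-lam) * lam**k / math.factorial(k)  # Poisson weight
        m = min(len(qk), support + 1)
        pmf[:m] += w * qk[:m]
        if w < 1e-15 and k > lam:  # remaining Poisson tail is negligible
            break
        qk = np.convolve(qk, q)  # next convolution power of Q
        k += 1
    return pmf

# Illustrative example: Y_i = B_i * X_i with B_i ~ Bernoulli(p)
# and X_i taking values 1, 2 with probabilities 0.6, 0.4.
n, p = 50, 0.05
q = np.array([0.0, 0.6, 0.4])                 # conditional law Q of Y_i given Y_i != 0
y_pmf = np.array([1 - p, p * 0.6, p * 0.4])   # law of a single Y_i

# Exact pmf of S_n = Y_1 + ... + Y_n by repeated convolution.
s_pmf = np.array([1.0])
for _ in range(n):
    s_pmf = np.convolve(s_pmf, y_pmf)

support = len(s_pmf) - 1
cp_pmf = compound_poisson_pmf(n * p, q, support)

# Total variation distance: d_TV = (1/2) * sum of absolute differences.
tv = 0.5 * np.abs(s_pmf - cp_pmf[: len(s_pmf)]).sum()
print(f"d_TV(P_Sn, CP(np, Q)) = {tv:.6f}")
```

For small \(p\) the computed distance is small, consistent with Le Cam-type bounds of order \(np^2\) for this iid special case; the paper's bounds are nonasymptotic refinements derived via the information functionals.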

MaRDI item: Q638324