Compound Poisson approximation via information functionals
From MaRDI portal
Abstract: An information-theoretic development is given for the problem of compound Poisson approximation, which parallels earlier treatments for Gaussian and Poisson approximation. Let \(P_{S_n}\) be the distribution of a sum \(S_n = Y_1 + \cdots + Y_n\) of independent integer-valued random variables \(Y_i\). Nonasymptotic bounds are derived for the distance between \(P_{S_n}\) and an appropriately chosen compound Poisson law. In the case where all the \(Y_i\) have the same conditional distribution given \(\{Y_i \neq 0\}\), a bound on the relative entropy distance between \(P_{S_n}\) and the compound Poisson distribution is derived, based on the data-processing property of relative entropy and earlier Poisson approximation results. When the \(Y_i\) have arbitrary distributions, corresponding bounds are derived in terms of the total variation distance. The main technical ingredient is the introduction of two "information functionals," and the analysis of their properties. These information functionals play a role analogous to that of the classical Fisher information in normal approximation. Detailed comparisons are made between the resulting inequalities and related bounds.
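The setup in the abstract can be illustrated numerically: a minimal sketch (not from the paper) that computes the exact distribution of a sum of i.i.d. small integer-valued variables and compares it, in total variation, with the standard matching compound Poisson law — rate equal to the expected number of nonzero summands and jump distribution equal to the conditional law of a summand given that it is nonzero. All function names below are illustrative, not the paper's.

```python
import numpy as np

def pmf_of_sum(p, n, support_max):
    """Exact pmf of S_n = Y_1 + ... + Y_n for i.i.d. Y_i with pmf p on
    {0, 1, ..., len(p)-1}, computed by repeated convolution (truncated)."""
    dist = np.zeros(support_max + 1)
    dist[0] = 1.0
    for _ in range(n):
        dist = np.convolve(dist, p)[: support_max + 1]
    return dist

def compound_poisson_pmf(lam, q, support_max):
    """pmf of a compound Poisson law CP(lam, q): a Poisson(lam) number of
    i.i.d. jumps with pmf q on {1, ..., len(q)}."""
    jump = np.zeros(support_max + 1)
    jump[1 : 1 + len(q)] = q          # q[k] = P(jump = k + 1)
    dist = np.zeros(support_max + 1)
    conv = np.zeros(support_max + 1)
    conv[0] = 1.0                     # zero jumps: point mass at 0
    weight = np.exp(-lam)             # Poisson weight e^{-lam} lam^k / k!
    k = 0
    while weight > 1e-15 or k <= lam:
        dist += weight * conv
        conv = np.convolve(conv, jump)[: support_max + 1]
        k += 1
        weight *= lam / k
        if k > 200:                   # safety cap for the truncated series
            break
    return dist

# Example: Y takes values 0, 1, 2 with small nonzero probabilities.
p = np.array([0.95, 0.04, 0.01])
n = 50
support_max = 60
lam = n * (1 - p[0])                  # expected number of nonzero summands
q = p[1:] / (1 - p[0])                # conditional jump law given Y != 0

exact = pmf_of_sum(p, n, support_max)
approx = compound_poisson_pmf(lam, q, support_max)
tv = 0.5 * np.sum(np.abs(exact - approx))
print(f"total variation distance: {tv:.6f}")
```

With rare nonzero summands, the computed distance is small, consistent with the nonasymptotic bounds the abstract describes (here of the order of \(n p^2\) for nonzero-probability \(p\)).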
Cited in (16)
- Compound Poisson approximation for unbounded functions on a group, with application to large deviations
- Improved lower bounds on the total variation distance for the Poisson approximation
- Entropy inequalities for sums in prime cyclic groups
- Majorization and Rényi entropy inequalities via Sperner theory
- Polars and subgradients of mixtures of information functions
- An information-theoretic proof of a finite de Finetti theorem
- Bounds on extropy with variational distance constraint
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Log-Hessian and deviation bounds for Markov semi-groups, and regularization effect in \(\mathbb{L}^1 \)
- Compound Poisson approximation to convolutions of compound negative binomial variables
- Compound Poisson approximation
- Entropy and the fourth moment phenomenon
- Measure concentration for compound Poisson distributions
- Entropy and the discrete central limit theorem
- Entropy and the Law of Small Numbers
(MaRDI item Q638324)