Entropy and the Law of Small Numbers
Publication:3547166
Abstract: Two new information-theoretic methods are introduced for establishing Poisson approximation inequalities. First, using only elementary information-theoretic techniques it is shown that, when \(S_n = \sum_{i=1}^n X_i\) is the sum of the (possibly dependent) binary random variables \(X_1, X_2, \ldots, X_n\), with \(E(X_i) = p_i\) and \(E(S_n) = \lambda\), then
\[
D(P_{S_n} \,\|\, \mathrm{Po}(\lambda)) \le \sum_{i=1}^n p_i^2 + \Big[\sum_{i=1}^n H(X_i) - H(X_1, X_2, \ldots, X_n)\Big],
\]
where \(D(P_{S_n} \,\|\, \mathrm{Po}(\lambda))\) is the relative entropy between the distribution of \(S_n\) and the Poisson(\(\lambda\)) distribution. The first term in this bound measures the individual smallness of the \(X_i\) and the second term measures their dependence. A general method is outlined for obtaining corresponding bounds when approximating the distribution of a sum of general discrete random variables by an infinitely divisible distribution. Second, in the particular case when the \(X_i\) are independent, the following sharper bound is established:
\[
D(P_{S_n} \,\|\, \mathrm{Po}(\lambda)) \le \frac{1}{\lambda} \sum_{i=1}^n \frac{p_i^3}{1-p_i},
\]
and it is also generalized to the case when the \(X_i\) are general integer-valued random variables. Its proof is based on the derivation of a subadditivity property for a new discrete version of the Fisher information, and it uses a recent logarithmic Sobolev inequality for the Poisson distribution.
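Both bounds can be checked numerically in the independent case, where the dependence term \(\sum_{i=1}^n H(X_i) - H(X_1,\ldots,X_n)\) vanishes and the first bound reduces to \(\sum_i p_i^2\). The following minimal Python sketch (not part of the publication record; the chosen \(p_i\) values and function names are illustrative assumptions) computes \(D(P_{S_n}\|\mathrm{Po}(\lambda))\) exactly for a Poisson binomial \(S_n\) and compares it with both bounds:

```python
import math

def poisson_binomial_pmf(ps):
    # PMF of S_n = X_1 + ... + X_n for independent Bernoulli(p_i),
    # built by convolving in one Bernoulli at a time.
    pmf = [1.0]
    for p in ps:
        new = [0.0] * (len(pmf) + 1)
        for k, q in enumerate(pmf):
            new[k] += q * (1.0 - p)
            new[k + 1] += q * p
        pmf = new
    return pmf

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def relative_entropy_to_poisson(ps):
    # D(P_{S_n} || Po(lambda)) in nats, with lambda = E(S_n) = sum(ps).
    lam = sum(ps)
    pmf = poisson_binomial_pmf(ps)
    return sum(q * math.log(q / poisson_pmf(lam, k))
               for k, q in enumerate(pmf) if q > 0.0)

ps = [0.05, 0.10, 0.02, 0.08]  # illustrative p_i values (assumption)
lam = sum(ps)
d = relative_entropy_to_poisson(ps)
bound1 = sum(p ** 2 for p in ps)                  # first bound; dependence term is 0 here
bound2 = sum(p ** 3 / (1 - p) for p in ps) / lam  # sharper bound for independent X_i
print(f"D = {d:.6f} <= {bound1:.6f} (first bound), {bound2:.6f} (sharper bound)")
```

Running this for small \(p_i\) shows the relative entropy sitting well below both bounds, with the second bound tighter by roughly a factor of \(\max_i p_i\), consistent with the abstract's claim that it is sharper.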
Cited in (25):
- Compound Poisson approximation via information functionals
- Discrete versions of the transport equation and the Shepp-Olkin conjecture
- Log-concavity and the maximum entropy property of the Poisson distribution
- Poisson approximation
- scientific article; zbMATH DE number 4007340
- Entropy inequalities for sums in prime cyclic groups
- Compact Proofs of Retrievability
- Improved lower bounds on the total variation distance for the Poisson approximation
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- A Charlier-Parseval approach to Poisson approximation and its applications
- Majorization and Rényi entropy inequalities via Sperner theory
- Information-theoretic convergence of extreme values to the Gumbel distribution
- Log-concavity and strong log-concavity: a review
- Entropy and thinning of discrete random variables
- Non-uniform bounds in the Poisson approximation with applications to informational distances. I
- The entropic Erdős-Kac limit theorem
- A discrete log-Sobolev inequality under a Bakry-Émery type condition
- Nonuniform bounds in the Poisson approximation with applications to informational distances. II
- Convergence of Markov chains in information divergence
- An information-theoretic proof of a finite de Finetti theorem
- Stam inequality on Z_n
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Strong log-concavity is preserved by convolution
- Entropy and the discrete central limit theorem
- Estimation on reliability models of bearing failure data