Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
Publication: 385153
DOI: 10.1016/J.DAM.2011.08.025
zbMATH Open: 1282.60016
DBLP: journals/dam/JohnsonKM13
arXiv: 0912.0581
OpenAlex: W2217094038
Wikidata: Q60522087 (Scholia: Q60522087)
MaRDI QID: Q385153
FDO: Q385153
Authors: Mokshay Madiman, Ioannis Kontoyiannis, Oliver Johnson
Publication date: 29 November 2013
Published in: Discrete Applied Mathematics
Abstract: Sufficient conditions are developed under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. Recently, one of the authors [O. Johnson, Stoch. Proc. Appl., 2007] used a semigroup approach to show that the Poisson has maximal entropy among all ultra-log-concave distributions with fixed mean. We show via a non-trivial extension of this semigroup approach that the natural analog of the Poisson maximum entropy property remains valid if the compound Poisson distributions under consideration are log-concave, but that it fails in general. A parallel maximum entropy result is established for the family of compound binomial measures. Sufficient conditions for compound distributions to be log-concave are discussed and applications to combinatorics are examined; new bounds are derived on the entropy of the cardinality of a random independent set in a claw-free graph, and a connection is drawn to Mason's conjecture for matroids. The present results are primarily motivated by the desire to provide an information-theoretic foundation for compound Poisson approximation and associated limit theorems, analogous to the corresponding developments for the central limit theorem and for Poisson approximation. Our results also demonstrate new links between some probabilistic methods and the combinatorial notions of log-concavity and ultra-log-concavity, and they add to the growing body of work exploring the applications of maximum entropy characterizations to problems in discrete mathematics.
Full work available at URL: https://arxiv.org/abs/0912.0581
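For orientation, the sketch below records the standard definitions behind the abstract's terminology. It is an illustrative summary under the usual conventions, not a restatement of the paper's precise hypotheses; the exact class of measures and the sufficient conditions are given in the paper itself.

```latex
% A sketch of the standard notions used in the abstract
% (assumed conventions, not the paper's exact statements).
% A probability mass function (p_k) on the nonnegative integers is called
% log-concave, respectively ultra-log-concave, if
\[
  p_k^2 \ \ge\ p_{k-1}\,p_{k+1}
  \qquad\text{resp.}\qquad
  k\,p_k^2 \ \ge\ (k+1)\,p_{k-1}\,p_{k+1},
  \qquad k \ge 1;
\]
% ultra-log-concavity is equivalent to log-concavity of the ratio
% p_k / \Pi_\lambda(k), where \Pi_\lambda(k) = e^{-\lambda}\lambda^k/k!
% is the Poisson pmf.
% The compound Poisson law CP(\lambda, Q) is the distribution of
\[
  S \ =\ \sum_{i=1}^{N} X_i,
  \qquad N \sim \mathrm{Poisson}(\lambda),
  \quad X_1, X_2, \dots \overset{\mathrm{iid}}{\sim} Q
  \ \text{(independent of } N\text{)}.
\]
% The earlier result extended here (Johnson, 2007) states that \Pi_\lambda
% maximizes the Shannon entropy H(p) = -\sum_k p_k \log p_k among
% ultra-log-concave pmfs with mean \lambda.
```

Per the abstract, the paper establishes a compound analogue of this statement: the corresponding maximum entropy property holds for CP(λ, Q) when the compound Poisson measures in question are log-concave, but fails in general.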
Recommendations
- Log-concavity and the maximum entropy property of the Poisson distribution
- Concentration functions and entropy bounds for discrete log-concave distributions
- Scientific article (zbMATH DE number 7295891)
- On the Poincaré constant of log-concave measures
- Ultra log-concavity of discrete order statistics
- Concentration between Lévy's inequality and the Poincaré inequality for log-concave densities
- Concentration inequalities for ultra log-concave distributions
- Poisson processes and a log-concave Bernstein theorem
- Concentration of measure principle and entropy-inequalities
- On some inequalities for entropies of discrete probability distributions
Cites Work
- Elements of Information Theory
- Discrete analogues of self-decomposability and stability
- Random Geometric Graphs
- Infinite Divisibility of Integer-Valued Random Variables
- Title not available
- Title not available
- Title not available
- Preservation of log-concavity on summation
- Title not available
- Ultra logconcave sequences and negative dependence
- Towards a theory of negative dependence.
- Title not available
- Entropy and the Law of Small Numbers
- On Dedekind's Problem: The Number of Isotone Boolean Functions. II
- Title not available
- An entropy proof of Bregman's theorem
- Log-concavity and the maximum entropy property of the Poisson distribution
- Correlation inequalities on some partially ordered sets
- On the numbers of independent \(k\)-sets in a claw free graph
- Negative dependence and the geometry of polynomials
- Unimodal, log-concave and Pólya frequency sequences in combinatorics
- Title not available
- The roots of the independence polynomial of a clawfree graph
- Integral representations and asymptotic expansions for Shannon and Renyi entropies
- Entropy computations via analytic depoissonization
- Solution of Shannon’s problem on the monotonicity of entropy
- Entropy and the central limit theorem
- Binomial and Poisson distributions as maximum entropy distributions
- The number of linear extensions of the Boolean lattice
- Log-concavity of characteristic polynomials and the Bergman fan of matroids
- Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof
- Generalized Entropy Power Inequalities and Monotonicity Properties of Information
- The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions
- Log-concavity and LC-positivity
- Singularity analysis and asymptotics of Bernoulli sums
- Compound Poisson approximation for nonnegative random variables via Stein's method
- A short proof, based on mixed volumes, of Liggett's theorem on the convolution of ultra-logconcave sequences
- The number of linear extensions of subset ordering
- On log-concave and log-convex infinitely divisible sequences and densities
- Monotonicity and aging properties of random sums
- Entropy, independent sets and antichains: A new approach to Dedekind's problem
- A conjecture on matroids
- Entropy and set cardinality inequalities for partition-determined functions
- Negative correlation and log-concavity
- Uniform stochastic ordering and related inequalities
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- On the maximum entropy of the sum of two dependent random variables
- On the Entropy of Compound Distributions on Nonnegative Integers
- Thinning, Entropy, and the Law of Thin Numbers
- Binomial-Poisson entropic inequalities and the M/M/∞ queue
- A strong log-concavity property for measures on Boolean algebras
- Families of Non-disjoint subsets
- Some Results for Discrete Unimodality
- Compound Poisson approximation via information functionals
- Negatively correlated random variables and Mason's conjecture for independent sets in matroids
Cited In (21)
- Clinical site selection problems with probabilistic constraints
- Relative log-concavity and a pair of triangle inequalities
- Geometric and functional inequalities for log-concave probability sequences
- Strong Log-concavity is Preserved by Convolution
- Log-concavity and the maximum entropy property of the Poisson distribution
- Efron's monotonicity property for measures on \(\mathbb{R}^2\)
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- Negative dependence and stochastic orderings
- The Discrete Moment Problem with Nonconvex Shape Constraints
- Entropy-variance inequalities for discrete log-concave random variables via degree of freedom
- Majorization and Rényi entropy inequalities via Sperner theory
- Bernoulli sums and Rényi entropy inequalities
- Top-heavy phenomena for transformations
- Log-concavity and strong log-concavity: a review
- Tight Revenue Gaps among Multiunit Mechanisms
- Log-Hessian and deviation bounds for Markov semi-groups, and regularization effect in \(\mathbb{L}^1 \)
- Log-concavity of compound distributions with applications in stochastic optimization
- Concentration inequalities for ultra log-concave distributions
- Entropy Inequalities for Sums in Prime Cyclic Groups
- Entropy and the discrete central limit theorem
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures