Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
From MaRDI portal
Abstract: Sufficient conditions are developed, under which the compound Poisson distribution has maximal entropy within a natural class of probability measures on the nonnegative integers. Recently, one of the authors [O. Johnson, Stoch. Proc. Appl., 2007] used a semigroup approach to show that the Poisson distribution has maximal entropy among all ultra-log-concave distributions with fixed mean. We show via a non-trivial extension of this semigroup approach that the natural analog of the Poisson maximum entropy property remains valid if the compound Poisson distributions under consideration are log-concave, but that it fails in general. A parallel maximum entropy result is established for the family of compound binomial measures. Sufficient conditions for compound distributions to be log-concave are discussed and applications to combinatorics are examined; new bounds are derived on the entropy of the cardinality of a random independent set in a claw-free graph, and a connection is drawn to Mason's conjecture for matroids. The present results are primarily motivated by the desire to provide an information-theoretic foundation for compound Poisson approximation and associated limit theorems, analogous to the corresponding developments for the central limit theorem and for Poisson approximation. Our results also demonstrate new links between some probabilistic methods and the combinatorial notions of log-concavity and ultra-log-concavity, and they add to the growing body of work exploring the applications of maximum entropy characterizations to problems in discrete mathematics.
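The abstract's central distinction, that some compound Poisson distributions are log-concave and some are not, can be checked numerically. The sketch below (an illustration, not part of the paper) computes the compound Poisson probability mass function via the standard Panjer recursion and tests the discrete log-concavity condition p(n)^2 >= p(n-1) p(n+1); the particular parameter choices are assumptions made for the example.

```python
import math

def compound_poisson_pmf(lam, q, n_max):
    """PMF of a compound Poisson distribution: N ~ Poisson(lam) summands,
    each i.i.d. with pmf q on {1, 2, ...}, via the Panjer recursion
    p(n) = (lam / n) * sum_j j * q(j) * p(n - j)."""
    p = [math.exp(-lam)] + [0.0] * n_max
    for n in range(1, n_max + 1):
        p[n] = (lam / n) * sum(j * q.get(j, 0.0) * p[n - j]
                               for j in range(1, n + 1))
    return p

def is_log_concave(p, tol=1e-12):
    """A nonnegative sequence is log-concave iff p[n]^2 >= p[n-1]*p[n+1]."""
    return all(p[n] ** 2 + tol >= p[n - 1] * p[n + 1]
               for n in range(1, len(p) - 1))

def entropy(p):
    """Shannon entropy (nats) of a pmf given as a list of probabilities."""
    return -sum(x * math.log(x) for x in p if x > 0)

# Degenerate summand q(1) = 1 recovers the plain Poisson, which is
# ultra-log-concave and hence log-concave.
p_poisson = compound_poisson_pmf(2.0, {1: 1.0}, 40)
print(is_log_concave(p_poisson))   # True

# A summand pmf with most mass on 2 (chosen here for illustration)
# yields a compound Poisson law that is NOT log-concave, consistent
# with the abstract's remark that the property can fail in general.
p_cp = compound_poisson_pmf(1.0, {1: 0.1, 2: 0.9}, 60)
print(is_log_concave(p_cp))        # False
```

The failure in the second case is already visible at n = 1: log-concavity there amounts to lam * q(1)^2 >= 2 * q(2), which these parameters violate.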
Recommendations
- Log-concavity and the maximum entropy property of the Poisson distribution
- Concentration functions and entropy bounds for discrete log-concave distributions
- scientific article; zbMATH DE number 7295891
- On the Poincaré constant of log-concave measures
- Ultra log-concavity of discrete order statistics
- Concentration between Lévy's inequality and the Poincaré inequality for log-concave densities
- Concentration inequalities for ultra log-concave distributions
- Poisson processes and a log-concave Bernstein theorem
- Concentration of measure principle and entropy-inequalities
- On some inequalities for entropies of discrete probability distributions
Cites work
- scientific article; zbMATH DE number 2131215
- scientific article; zbMATH DE number 3848310
- scientific article; zbMATH DE number 3604069
- scientific article; zbMATH DE number 568836
- scientific article; zbMATH DE number 893785
- scientific article; zbMATH DE number 3348831
- scientific article; zbMATH DE number 3198427
- A conjecture on matroids
- A short proof, based on mixed volumes, of Liggett's theorem on the convolution of ultra-logconcave sequences
- A strong log-concavity property for measures on Boolean algebras
- An entropy proof of Bregman's theorem
- Binomial and Poisson distributions as maximum entropy distributions
- Binomial-Poisson entropic inequalities and the M/M/∞ queue
- Compound Poisson approximation for nonnegative random variables via Stein's method
- Compound Poisson approximation via information functionals
- Correlation inequalities on some partially ordered sets
- Discrete analogues of self-decomposability and stability
- Elements of Information Theory
- Entropy and set cardinality inequalities for partition-determined functions
- Entropy and the Law of Small Numbers
- Entropy and the central limit theorem
- Entropy computations via analytic depoissonization
- Entropy, independent sets and antichains: A new approach to Dedekind's problem
- Families of Non-disjoint subsets
- Generalized Entropy Power Inequalities and Monotonicity Properties of Information
- Infinite Divisibility of Integer-Valued Random Variables
- Integral representations and asymptotic expansions for Shannon and Rényi entropies
- Log-concavity and LC-positivity
- Log-concavity and the maximum entropy property of the Poisson distribution
- Log-concavity of characteristic polynomials and the Bergman fan of matroids
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Monotonic Decrease of the Non-Gaussianness of the Sum of Independent Random Variables: A Simple Proof
- Monotonicity and aging properties of random sums
- Negative correlation and log-concavity
- Negative dependence and the geometry of polynomials
- Negatively correlated random variables and Mason's conjecture for independent sets in matroids
- On Dedekind's Problem: The Number of Isotone Boolean Functions. II
- On log-concave and log-convex infinitely divisible sequences and densities
- On the Entropy of Compound Distributions on Nonnegative Integers
- On the maximum entropy of the sum of two dependent random variables
- On the numbers of independent \(k\)-sets in a claw free graph
- Preservation of log-concavity on summation
- Random Geometric Graphs
- Singularity analysis and asymptotics of Bernoulli sums
- Solution of Shannon’s problem on the monotonicity of entropy
- Some Results for Discrete Unimodality
- The Entropy Per Coordinate of a Random Vector is Highly Constrained Under Convexity Conditions
- The number of linear extensions of subset ordering
- The number of linear extensions of the Boolean lattice
- The roots of the independence polynomial of a clawfree graph
- Thinning, Entropy, and the Law of Thin Numbers
- Towards a theory of negative dependence.
- Ultra logconcave sequences and negative dependence
- Uniform stochastic ordering and related inequalities
- Unimodal, log-concave and Pólya frequency sequences in combinatorics
Cited in (23)
- Geometric and functional inequalities for log-concave probability sequences
- Log-concavity and discrete degrees of freedom
- Bernoulli sums and Rényi entropy inequalities
- Entropy and thinning of discrete random variables
- Entropy inequalities for sums in prime cyclic groups
- Majorization and Rényi entropy inequalities via Sperner theory
- Top-heavy phenomena for transformations
- Negative dependence and stochastic orderings
- Strong log-concavity is preserved by convolution
- Information in Probability: Another Information-Theoretic Proof of a Finite de Finetti Theorem
- Log-concavity and the maximum entropy property of the Poisson distribution
- Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
- Log-Hessian and deviation bounds for Markov semi-groups, and regularization effect in \(\mathbb{L}^1 \)
- Log-concavity of compound distributions with applications in stochastic optimization
- Concentration inequalities for ultra log-concave distributions
- Relative log-concavity and a pair of triangle inequalities
- Entropy-variance inequalities for discrete log-concave random variables via degree of freedom
- Clinical site selection problems with probabilistic constraints
- Log-concavity and strong log-concavity: a review
- The Discrete Moment Problem with Nonconvex Shape Constraints
- Tight revenue gaps among multiunit mechanisms
- Efron's monotonicity property for measures on \(\mathbb{R}^2\)
- Entropy and the discrete central limit theorem
This page was built for publication: Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures