On the Maximum Entropy of a Sum of Independent Discrete Random Variables
Publication: 5163530
DOI: 10.1137/S0040585X97T99054X
zbMATH Open: 1479.60029
arXiv: 2008.01138
OpenAlex: W3209910216
MaRDI QID: Q5163530
FDO: Q5163530
Author name not available
Publication date: 4 November 2021
Published in: Theory of Probability & Its Applications
Abstract: Let $X_1, \ldots, X_n$ be independent random variables taking values in the alphabet $\{0, 1, \ldots, r\}$, and let $S_n = \sum_{i=1}^{n} X_i$. The Shepp--Olkin theorem states that, in the binary case ($r = 1$), the Shannon entropy of $S_n$ is maximized when all the $X_i$'s are uniformly distributed, i.e., Bernoulli(1/2). In an attempt to generalize this theorem to arbitrary finite alphabets, we obtain a lower bound on the maximum entropy of $S_n$ and prove that it is tight in several special cases. In addition to these special cases, an argument is presented supporting the conjecture that the bound represents the optimal value for all $n$ and $r$, i.e., that $H(S_n)$ is maximized when $X_2, \ldots, X_n$ are uniformly distributed over $\{0, r\}$, while the probability mass function of $X_1$ is a mixture (with explicitly defined non-zero weights) of the uniform distributions over $\{0, r\}$ and $\{0, 1, \ldots, r\}$.
Full work available at URL: https://arxiv.org/abs/2008.01138
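A minimal numerical sketch of the conjecture stated in the abstract (illustrative code, not from the paper): for a small instance it compares the entropy of $S_n$ when all $X_i$ are uniform on the full alphabet $\{0, \ldots, r\}$ against the conjectured maximizing family. The helper names (`entropy_bits`, `sum_pmf`) are hypothetical, and the mixture weight for $X_1$ is found by a brute-force sweep rather than by the paper's explicit formula, which is not reproduced on this page.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy (in bits) of a probability vector, ignoring zeros."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def sum_pmf(pmfs):
    """PMF of a sum of independent variables via repeated convolution."""
    out = np.array([1.0])
    for p in pmfs:
        out = np.convolve(out, p)
    return out

n, r = 4, 2                                   # small example instance
unif_full = np.ones(r + 1) / (r + 1)          # uniform over {0, 1, ..., r}
unif_ends = np.zeros(r + 1)
unif_ends[[0, r]] = 0.5                       # uniform over {0, r}

# (a) all X_i uniform over the full alphabet {0, 1, ..., r}
h_all_uniform = entropy_bits(sum_pmf([unif_full] * n))

# (b) conjectured family: X_2, ..., X_n uniform over {0, r}, and X_1 a
# mixture of the two uniform distributions; the weight is swept
# numerically here instead of using the paper's explicit value.
h_family = max(
    entropy_bits(sum_pmf([w * unif_ends + (1 - w) * unif_full]
                         + [unif_ends] * (n - 1)))
    for w in np.linspace(0.0, 1.0, 1001)
)

print(f"H(S_n), all X_i uniform on full alphabet: {h_all_uniform:.4f} bits")
print(f"H(S_n), conjectured family (best weight): {h_family:.4f} bits")
print(f"log2(n*r + 1) support-size upper bound:   {np.log2(n*r + 1):.4f} bits")
```

On this instance ($n = 4$, $r = 2$) the conjectured family already yields a strictly larger entropy than the all-uniform assignment, consistent with the abstract's point that the Shepp--Olkin maximizer does not carry over verbatim to alphabets with $r \geq 2$.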
Recommendations
- On the maximum entropy of the sum of two dependent random variables
- Maximizing the entropy of a sum of independent bounded random variables
- Mutual dependence of random variables and maximum discretized entropy
- Entropy Bounds for Discrete Random Variables via Maximal Coupling
- scientific article; zbMATH DE number 3848310
- Maximum Entropy for Sums of Symmetric and Bounded Random Variables: A Short Derivation
- scientific article; zbMATH DE number 1772494
- On some inequalities for entropies of discrete probability distributions
- Discrete approximations of continuous distributions by maximum entropy
Mathematics Subject Classification
- Probability distributions: general theory (60E05)
- Inequalities; stochastic orderings (60E15)
- Measures of information, entropy (94A17)
Cites Work
- Discrete versions of the transport equation and the Shepp-Olkin conjecture
- Title not available
- On the Maximum Entropy Properties of the Binomial Distribution
- Title not available
- Log-concavity and the maximum entropy property of the Poisson distribution
- Binomial and Poisson distributions as maximum entropy distributions
- Maximum Entropy for Sums of Symmetric and Bounded Random Variables: A Short Derivation
- On the Entropy of the Multinomial Distribution
- A proof of the Shepp-Olkin entropy monotonicity conjecture
- A proof of the Shepp-Olkin entropy concavity conjecture
- Maximizing the entropy of a sum of independent bounded random variables
Cited In (8)
- Maximum Entropy for Sums of Symmetric and Bounded Random Variables: A Short Derivation
- Statistical estimation of the entropy of discrete random variables with a large number of outcomes
- Maximum entropy and integer partitions
- Maximum-entropy distributions of nonnegative random variables with conditional-additive structure
- Title not available
- An upper bound for entropy of discrete distributions having assigned moments
- Remarks on the Rényi Entropy of a Sum of IID Random Variables
- Maximal correlation and monotonicity of free entropy and of Stein discrepancy