On the Maximum Entropy of a Sum of Independent Discrete Random Variables
Publication:5163530
Abstract: Let $X_1, \dots, X_n$ be independent random variables taking values in the alphabet $\{0, 1, \dots, r\}$, and let $S_n = X_1 + \dots + X_n$. The Shepp-Olkin theorem states that, in the binary case ($r = 1$), the Shannon entropy of $S_n$ is maximized when all the $X_i$'s are uniformly distributed, i.e., Bernoulli(1/2). In an attempt to generalize this theorem to arbitrary finite alphabets, we obtain a lower bound on the maximum entropy of $S_n$ and prove that it is tight in several special cases. In addition to these special cases, an argument is presented supporting the conjecture that the bound represents the optimal value for all $n, r$, i.e., that $H(S_n)$ is maximized when $X_1, \dots, X_{n-1}$ are uniformly distributed over $\{0, r\}$, while the probability mass function of $X_n$ is a mixture (with explicitly defined non-zero weights) of the uniform distributions over $\{0, r\}$ and $\{0, 1, \dots, r\}$.
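As a small numerical illustration of the binary (Shepp-Olkin) case described in the abstract, the following Python sketch computes $H(S_n)$ by convolving the component pmfs and compares Bernoulli(1/2) components against a skewed alternative. The helper names `entropy_bits` and `pmf_of_sum`, and the choice of Bernoulli(0.3) as the comparison point, are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy (in bits) of a pmf given as a 1-D numpy array."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def pmf_of_sum(pmfs):
    """pmf of S_n = X_1 + ... + X_n for independent X_i, via repeated convolution.
    Each pmf is indexed by the values 0, 1, ..., r of the corresponding X_i."""
    s = np.array([1.0])  # pmf of the empty sum: point mass at 0
    for p in pmfs:
        s = np.convolve(s, p)
    return s

n = 10
fair   = [np.array([0.5, 0.5])] * n  # Bernoulli(1/2) components (r = 1)
skewed = [np.array([0.7, 0.3])] * n  # Bernoulli(0.3) components, for comparison

print(entropy_bits(pmf_of_sum(fair)))    # ~2.706 bits: S_n is Binomial(10, 1/2)
print(entropy_bits(pmf_of_sum(skewed)))  # strictly smaller, as Shepp-Olkin predicts
```

The same `pmf_of_sum` routine applies verbatim to larger alphabets ($r > 1$), so it can also be used to compare candidate maximizers numerically against the conjectured mixture construction from the abstract.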
Recommendations
- On the maximum entropy of the sum of two dependent random variables
- Maximizing the entropy of a sum of independent bounded random variables
- Mutual dependence of random variables and maximum discretized entropy
- Entropy Bounds for Discrete Random Variables via Maximal Coupling
- Maximum Entropy for Sums of Symmetric and Bounded Random Variables: A Short Derivation
- On some inequalities for entropies of discrete probability distributions
- Discrete approximations of continuous distributions by maximum entropy
Cites work
- scientific article; zbMATH DE number 3848310
- scientific article; zbMATH DE number 51089
- A proof of the Shepp-Olkin entropy concavity conjecture
- A proof of the Shepp-Olkin entropy monotonicity conjecture
- Binomial and Poisson distributions as maximum entropy distributions
- Discrete versions of the transport equation and the Shepp-Olkin conjecture
- Log-concavity and the maximum entropy property of the Poisson distribution
- Maximizing the entropy of a sum of independent bounded random variables
- Maximum Entropy for Sums of Symmetric and Bounded Random Variables: A Short Derivation
- On the Entropy of the Multinomial Distribution
- On the Maximum Entropy Properties of the Binomial Distribution
Cited in (11 documents)
- Maximum Entropy for Sums of Symmetric and Bounded Random Variables: A Short Derivation
- Statistical estimation of the entropy of discrete random variables with a large number of outcomes
- Maximum entropy and integer partitions
- Concavity of entropy along binomial convolutions
- Maximum-entropy distributions of nonnegative random variables with conditional-additive structure
- scientific article; zbMATH DE number 1103094
- An upper bound for entropy of discrete distributions having assigned moments
- Remarks on the Rényi Entropy of a Sum of IID Random Variables
- A proof of the Shepp-Olkin entropy concavity conjecture
- Maximal correlation and monotonicity of free entropy and of Stein discrepancy
- Maximizing the entropy of a sum of independent bounded random variables