On the Maximum Entropy of a Sum of Independent Discrete Random Variables

From MaRDI portal
Publication:5163530




Abstract: Let $X_1, \ldots, X_n$ be independent random variables taking values in the alphabet $\{0, 1, \ldots, r\}$, and let $S_n = \sum_{i=1}^{n} X_i$. The Shepp–Olkin theorem states that, in the binary case ($r = 1$), the Shannon entropy of $S_n$ is maximized when all the $X_i$'s are uniformly distributed, i.e., Bernoulli(1/2). In an attempt to generalize this theorem to arbitrary finite alphabets, we obtain a lower bound on the maximum entropy of $S_n$ and prove that it is tight in several special cases. In addition to these special cases, an argument is presented supporting the conjecture that the bound represents the optimal value for all $n, r$, i.e., that $H(S_n)$ is maximized when $X_1, \ldots, X_{n-1}$ are uniformly distributed over $\{0, r\}$, while the probability mass function of $X_n$ is a mixture (with explicitly defined non-zero weights) of the uniform distributions over $\{0, r\}$ and $\{1, \ldots, r-1\}$.
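Since $S_n$ is a sum of independent discrete random variables, its probability mass function is the convolution of the individual pmfs, and $H(S_n)$ can be computed directly. The sketch below (the function names `entropy` and `sum_entropy` are illustrative, not from the paper) numerically checks the binary Shepp–Olkin case: with $r = 1$, the uniform (Bernoulli(1/2)) choice yields a larger $H(S_n)$ than a biased one.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a pmf given as a NumPy array."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def sum_entropy(pmfs):
    """Entropy of S_n = X_1 + ... + X_n for independent X_i.

    Each element of `pmfs` is the pmf of one X_i over {0, ..., r};
    the pmf of the sum is the convolution of the individual pmfs.
    """
    s = np.array([1.0])
    for p in pmfs:
        s = np.convolve(s, p)
    return entropy(s)

# Binary case (r = 1), n = 5: uniform Bernoulli(1/2) versus a biased choice.
n = 5
h_uniform = sum_entropy([np.array([0.5, 0.5])] * n)  # S_n ~ Binomial(5, 1/2)
h_biased = sum_entropy([np.array([0.7, 0.3])] * n)
assert h_biased < h_uniform  # consistent with the Shepp--Olkin theorem
```

The same routine applies verbatim for $r > 1$ (longer pmf arrays), so it can be used to compare the all-uniform configuration against the conjectured maximizer once the mixture weights for $X_n$ are specified.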