On the Maximum Entropy of a Sum of Independent Discrete Random Variables


DOI: 10.1137/S0040585X97T99054X
zbMATH Open: 1479.60029
arXiv: 2008.01138
OpenAlex: W3209910216
MaRDI QID: Q5163530

Author name not available

Publication date: 4 November 2021

Published in: Theory of Probability & Its Applications

Abstract: Let $X_1,\ldots,X_n$ be independent random variables taking values in the alphabet $\{0,1,\ldots,r\}$, and let $S_n=\sum_{i=1}^{n}X_i$. The Shepp--Olkin theorem states that, in the binary case ($r=1$), the Shannon entropy of $S_n$ is maximized when all the $X_i$'s are uniformly distributed, i.e., Bernoulli(1/2). In an attempt to generalize this theorem to arbitrary finite alphabets, we obtain a lower bound on the maximum entropy of $S_n$ and prove that it is tight in several special cases. In addition to these special cases, an argument is presented supporting the conjecture that the bound represents the optimal value for all $n,r$, i.e., that $H(S_n)$ is maximized when $X_1,\ldots,X_{n-1}$ are uniformly distributed over $\{0,r\}$, while the probability mass function of $X_n$ is a mixture (with explicitly defined non-zero weights) of the uniform distributions over $\{0,r\}$ and $\{1,\ldots,r-1\}$.
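
The conjectured maximizer is easy to probe numerically. The following is a minimal sketch, not code from the paper: for small $n$ and $r$ it computes $H(S_n)$ by convolving the individual probability mass functions, grid-searches the mixture weight of $X_n$ (the paper gives explicit weights, which are not reproduced here), and compares the result with the all-uniform baseline. The helper names `entropy` and `sum_pmf` are illustrative, not from the source.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a pmf; zero entries contribute nothing."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def sum_pmf(pmfs):
    """pmf of S_n = X_1 + ... + X_n, by iterated convolution of the X_i pmfs."""
    s = np.array([1.0])
    for p in pmfs:
        s = np.convolve(s, p)
    return s

n, r = 3, 2  # small example: alphabet {0, 1, 2}

# X_1, ..., X_{n-1}: uniform over the two-point set {0, r}, as in the conjecture.
two_point = np.zeros(r + 1)
two_point[0] = two_point[r] = 0.5

# Uniform over the "interior" {1, ..., r-1}; requires r >= 2.
middle = np.zeros(r + 1)
middle[1:r] = 1.0 / (r - 1)

# X_n: mixture w * Unif{0, r} + (1 - w) * Unif{1, ..., r-1}.
# The paper derives explicit weights; here we simply grid-search over w.
best = max(
    entropy(sum_pmf([two_point] * (n - 1) + [w * two_point + (1 - w) * middle]))
    for w in np.linspace(0.0, 1.0, 1001)
)

# Baseline: every X_i uniform over the full alphabet {0, ..., r}.
uniform = np.full(r + 1, 1.0 / (r + 1))
baseline = entropy(sum_pmf([uniform] * n))

print(f"conjectured form (weight grid-searched): H(S_n) ~ {best:.3f} bits")
print(f"all X_i uniform on the full alphabet:    H(S_n) = {baseline:.3f} bits")
```

For $n=3$, $r=2$ the grid search finds a mixture weight near $w\approx 0.56$ giving roughly 2.66 bits, against roughly 2.53 bits for the all-uniform choice, illustrating why the Shepp--Olkin maximizer does not carry over verbatim once $r>1$.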


Full work available at URL: https://arxiv.org/abs/2008.01138






Cited In (8)





