On the maximum entropy of the sum of two dependent random variables
DOI: 10.1109/18.335945
zbMath: 0811.94016
OpenAlex: W2071728535
MaRDI QID: Q4324193
Publication date: 1 March 1995
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/18.335945
Related Items (7)
Log-concavity, ultra-log-concavity, and a maximum entropy property of discrete compound Poisson measures
Stability of Cramer’s Characterization of Normal Laws in Information Distances
A discrete complement of Lyapunov's inequality and its information theoretic consequences
Further Investigations of Rényi Entropy Power Inequalities and an Entropic Characterization of s-Concave Densities
Rényi entropy power inequality and a reverse
Hodge index inequality in geometry and arithmetic: a probabilistic approach
A reverse entropy power inequality for log-concave random vectors