An Extremal Inequality Motivated by Multiterminal Information-Theoretic Problems
From MaRDI portal
Publication:3548895
Abstract: We prove a new extremal inequality, motivated by the vector Gaussian broadcast channel problem and the problem of distributed source coding with a single quadratic distortion constraint. As one corollary, this inequality yields a generalization of the classical entropy-power inequality (EPI). As another corollary, it sheds light on maximizing the differential entropy of the sum of two dependent random variables.
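The classical EPI that the abstract generalizes states that for independent random variables, N(X + Y) ≥ N(X) + N(Y), where N(X) = exp(2h(X))/(2πe) is the entropy power. The sketch below (function names are our own, not from the paper) checks this numerically for independent Gaussians, the case in which the inequality holds with equality:

```python
import math

def gaussian_entropy(var):
    # Differential entropy of a Gaussian with variance `var`:
    # h(X) = 0.5 * ln(2 * pi * e * var)
    return 0.5 * math.log(2 * math.pi * math.e * var)

def entropy_power(h):
    # Entropy power N(X) = exp(2 * h(X)) / (2 * pi * e);
    # for a Gaussian this recovers the variance.
    return math.exp(2 * h) / (2 * math.pi * math.e)

# Independent Gaussians X ~ N(0, s1) and Y ~ N(0, s2),
# so X + Y ~ N(0, s1 + s2).
s1, s2 = 2.0, 3.0
n_sum = entropy_power(gaussian_entropy(s1 + s2))
n_x = entropy_power(gaussian_entropy(s1))
n_y = entropy_power(gaussian_entropy(s2))

# EPI: N(X + Y) >= N(X) + N(Y); Gaussians achieve equality.
assert n_sum >= n_x + n_y - 1e-9
print(n_sum, n_x + n_y)  # both equal s1 + s2 = 5.0
```

For dependent random variables, as studied in the paper and in several of the citing works below, this simple additivity can fail, which is what motivates the extremal inequality.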
Cited in (6):
- An extension of entropy power inequality for dependent random variables
- On the maximum entropy of the sum of two dependent random variables
- A new entropy power inequality
- On the tightness of the Zhang-Yeung inequality for Gaussian vectors
- A de Bruijn's identity for dependent random variables based on copula theory
- The conditional entropy power inequality for Gaussian quantum states