An Extremal Inequality Motivated by Multiterminal Information-Theoretic Problems
DOI: 10.1109/TIT.2007.894680 · zbMATH Open: 1319.94026 · arXiv: cs/0604025 · MaRDI QID: Q3548895
Publication date: 21 December 2008
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://arxiv.org/abs/cs/0604025
Keywords: Fisher information; distributed source coding; differential entropy; entropy power inequality (EPI); vector Gaussian broadcast channel
MSC classifications: Information theory (general) (94A15); Channel models (including quantum) in information and communication theory (94A40)
Cited In (6)
- The conditional entropy power inequality for Gaussian quantum states
- On the tightness of the Zhang-Yeung inequality for Gaussian vectors
- A de Bruijn's identity for dependent random variables based on copula theory
- On the maximum entropy of the sum of two dependent random variables
- An extension of entropy power inequality for dependent random variables
- A new entropy power inequality
Recommendations
- A new entropy power inequality
- On characterization of entropy function via information inequalities
- A generalization of the entropy power inequality with applications
- A Conditional Entropy Power Inequality for Dependent Variables
- On a new non-Shannon type information inequality