The Lossy Common Information of Correlated Sources
From MaRDI portal
Publication:2986320
DOI: 10.1109/TIT.2014.2315805 · zbMATH Open: 1360.94173 · arXiv: 1403.8093 · OpenAlex: W2117256830 · MaRDI QID: Q2986320 · FDO: Q2986320
Authors: Kumar B. Viswanatha, Emrah Akyol, Kenneth Rose
Publication date: 16 May 2017
Published in: IEEE Transactions on Information Theory
Abstract: The two most prevalent notions of common information (CI) are due to Wyner and Gács-Körner, and both can be stated as two distinct characteristic points in the lossless Gray-Wyner region. Although the information-theoretic characterizations of these two CI quantities can be easily evaluated for random variables with infinite entropy (e.g., continuous random variables), their operational significance applies only to the lossless framework. The primary objective of this paper is to generalize these two CI notions to the lossy Gray-Wyner network, thereby extending their theoretical foundation to general sources and distortion measures. We begin by deriving a single-letter characterization for the lossy generalization of Wyner's CI, defined as the minimum rate on the shared branch of the Gray-Wyner network that maintains the minimum sum transmit rate when the two decoders reconstruct the sources subject to individual distortion constraints. To demonstrate its use, we compute the CI of bivariate Gaussian random variables for the entire regime of distortions. We then similarly generalize Gács and Körner's definition to the lossy framework. The latter half of the paper studies the tradeoff between the total transmit rate and the receive rate in the Gray-Wyner network. We show that this tradeoff yields a contour of points on the surface of the Gray-Wyner region which passes through both the Wyner and Gács-Körner operating points, thereby providing a unified framework for understanding the different notions of CI. We further show that this tradeoff generalizes the two notions of CI to the excess sum transmit rate and receive rate regimes, respectively.
Full work available at URL: https://arxiv.org/abs/1403.8093
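For context on the bivariate Gaussian example mentioned in the abstract, the following is a minimal sketch of the standard lossless quantities: Wyner's common information for a bivariate Gaussian pair with correlation coefficient rho has the known closed form C_W = (1/2) log((1+|rho|)/(1-|rho|)), which always dominates the mutual information I(X;Y) = -(1/2) log(1-rho^2). These are the classical lossless formulas only; the paper's contribution is the lossy generalization over the full distortion regime, which is not reproduced here.

```python
import math

def gaussian_mutual_information(rho):
    """Mutual information I(X;Y) = -1/2 * ln(1 - rho^2), in nats,
    for a bivariate Gaussian pair with correlation coefficient rho."""
    return -0.5 * math.log(1.0 - rho * rho)

def wyner_common_information(rho):
    """Wyner's (lossless) common information for a bivariate Gaussian pair:
    C_W = 1/2 * ln((1 + |rho|) / (1 - |rho|)), in nats.
    This is the classical result that the paper's lossy characterization
    extends to individual distortion constraints."""
    r = abs(rho)
    return 0.5 * math.log((1.0 + r) / (1.0 - r))

# For any 0 < |rho| < 1, C_W strictly exceeds I(X;Y): common information
# is the more demanding notion, matching the ordering of the Wyner point
# on the Gray-Wyner region.
rho = 0.5
print(wyner_common_information(rho), gaussian_mutual_information(rho))
```

The gap between the two quantities widens as |rho| → 1, since C_W diverges like (1/2) ln(2/(1-|rho|)) while I(X;Y) diverges only half as fast.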
Recommendations
- A Lossy Source Coding Interpretation of Wyner’s Common Information
- Total correlations and mutual information
- Lossy Coding of Correlated Sources Over a Multiple Access Channel: Necessary Conditions and Separation Results
- Common Information, Noise Stability, and Their Extensions
- A counterexample in rate-distortion theory for correlated sources
- Informationally optimal correlation
- Correlation distance and bounds for mutual information
- scientific article; zbMATH DE number 4006142
- Correlation between concurrence and mutual information
Cited In (3)