The common information of two dependent random variables

From MaRDI portal
Publication:4054914

DOI: 10.1109/TIT.1975.1055346 · zbMath: 0299.94014 · MaRDI QID: Q4054914

Aaron D. Wyner

Publication date: 1975

Published in: IEEE Transactions on Information Theory




Related Items (21)

Self-scaled bounds for atomic cone ranks: applications to nonnegative rank and cp-rank
Common randomness and distributed control: A counterexample
Common information and unique disjointness
Common, Correlated, and Private Information in Control of Decentralized Systems
Secure non-interactive simulation: feasibility and rate
Secure non-interactive reduction and spectral analysis of correlations
Optimizing quantum models of classical channels: the reverse Holevo problem
Nonlocal Games with Noisy Maximally Entangled States are Decidable
Secure non-interactive simulation from arbitrary joint distributions
Universal Features for High-Dimensional Learning and Inference
Multiple-user communication
Secure non-interactive reducibility is decidable
Arbitrarily small amounts of correlation for arbitrarily varying quantum channels
Information-theoretic approximations of the nonnegative rank
Maximal rectangular subsets contained in the set of partially jointly typical sequences for dependent random variables
Information theories with adversaries, intrinsic information, and entanglement
Dimension Reduction for Polynomials over Gaussian Space and Applications
An Upper Bound on the Sizes of Multiset-Union-Free Families
Tropical probability theory and an application to the entropic cone
Common Information, Noise Stability, and Their Extensions
Over-the-Air computation for distributed machine learning and consensus in large wireless networks




This page was built for publication: The common information of two dependent random variables