An Operational Characterization of Mutual Information in Algorithmic Information Theory

From MaRDI portal
Publication:5215467

DOI: 10.1145/3356867
zbMATH Open: 1473.68100
arXiv: 1710.05984
OpenAlex: W2963148337
MaRDI QID: Q5215467
FDO: Q5215467

Andrei Romashchenko, Marius Zimand

Publication date: 11 February 2020

Published in: Journal of the ACM

Abstract: We show that the mutual information, in the sense of Kolmogorov complexity, of any pair of strings $x$ and $y$ is equal, up to logarithmic precision, to the length of the longest shared secret key that two parties, one having $x$ and the complexity profile of the pair and the other one having $y$ and the same complexity profile, can establish via a probabilistic protocol with interaction on a public channel. For $\ell > 2$, the longest shared secret that can be established from a tuple of strings $(x_1, \ldots, x_\ell)$ by $\ell$ parties, each one having one component of the tuple and the complexity profile of the tuple, is equal, up to logarithmic precision, to the complexity of the tuple minus the minimum communication necessary for distributing the tuple to all parties. We establish the communication complexity of secret key agreement protocols that produce a secret key of maximal length, for protocols with public randomness. We also show that if the communication complexity drops below the established threshold, then only very short secret keys can be obtained.
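In standard algorithmic-information notation (a sketch, not taken from this page: $K$ denotes Kolmogorov complexity, the $O(\log n)$ terms stand for the "logarithmic precision" of the abstract, and the shorthand $\mathrm{CO}$ for the minimum communication needed to distribute the tuple to all parties is introduced here only for readability), the stated equalities read roughly as:

    $I(x:y) = K(x) + K(y) - K(x,y)$   (algorithmic mutual information of the pair)

    two parties: longest key length $= I(x:y) \pm O(\log n)$, where $n = |x| + |y|$

    $\ell > 2$ parties: longest key length $= K(x_1,\ldots,x_\ell) - \mathrm{CO}(x_1,\ldots,x_\ell) \pm O(\log n)$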


Full work available at URL: https://arxiv.org/abs/1710.05984






Cited In (4)






