Distributed Compression through the Lens of Algorithmic Information Theory: A Primer

From MaRDI portal
Publication: 5222997

DOI: 10.1142/9789813237315_0004
zbMATH Open: 1415.68114
arXiv: 1706.08468
OpenAlex: W2687652740
MaRDI QID: Q5222997

Marius Zimand

Publication date: 4 July 2019

Published in: Mathematics Almost Everywhere

Abstract: Distributed compression is the task of compressing correlated data by several parties, each one possessing one piece of the data and acting separately. The classical Slepian-Wolf theorem (D. Slepian, J. K. Wolf, IEEE Transactions on Inf. Theory, 1973) shows that if the data is generated by independent draws from a joint distribution, that is, by a memoryless stochastic process, then distributed compression can achieve the same compression rates as centralized compression, in which the parties act together. Recently, the author (M. Zimand, STOC 2017) obtained an analogue of the Slepian-Wolf theorem in the framework of Algorithmic Information Theory (also known as Kolmogorov complexity). The advantage over the classical theorem is that the AIT version works for individual strings, without any assumption regarding the generative process. The only requirement is that the parties know the complexity profile of the input strings, which is a simple quantitative measure of the data correlation. The goal of this paper is to present, in an accessible form that omits some technical details, the main ideas from the reference (M. Zimand, STOC 2017).
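The "complexity profile" mentioned in the abstract is the triple (C(x), C(y), C(x,y)) of Kolmogorov complexities of the two inputs and of the pair. Kolmogorov complexity is uncomputable, but as a rough illustration one can use a real compressor as a proxy, as in the normalized compression distance literature. The sketch below (an illustration, not the paper's method; zlib is an arbitrary choice of compressor) estimates a profile for two correlated strings and checks the Slepian-Wolf-style observation that compressing jointly beats compressing separately:

```python
import zlib

def C(s: bytes) -> int:
    """Proxy for Kolmogorov complexity C(s): length of s after zlib compression."""
    return len(zlib.compress(s, 9))

# Two highly correlated individual strings (no generative-process assumption needed).
x = b"the quick brown fox jumps over the lazy dog " * 20
y = b"the quick brown fox jumps over the lazy cat " * 20

cx, cy = C(x), C(y)
cxy = C(x + y)          # proxy for the joint complexity C(x, y)

# Proxies for conditional complexities via the chain rule C(x, y) ~ C(y) + C(x | y).
cx_given_y = cxy - cy
cy_given_x = cxy - cx

print("complexity profile (C(x), C(y), C(x,y)):", (cx, cy, cxy))
print("C(x|y) proxy:", cx_given_y, " C(y|x) proxy:", cy_given_x)

# Correlation shows up as C(x, y) < C(x) + C(y): the joint description is cheaper.
assert cxy < cx + cy
```

In the Slepian-Wolf setting, the senders can compress down to total rate about C(x,y), with each sender using at least its conditional complexity, even though each acts separately; the proxy numbers above show why the joint description length is the natural target.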


Full work available at URL: https://arxiv.org/abs/1706.08468




Cited In (4)
Cited In (4)





