Network Compression: Worst Case Analysis


DOI: 10.1109/TIT.2015.2434829
zbMATH Open: 1359.94986
arXiv: 1304.1828
Wikidata: Q49195937
MaRDI QID: Q2977385


Authors: Himanshu Asnani, Ilan Shomorony, A. Salman Avestimehr, Tsachy Weissman


Publication date: 28 April 2017

Published in: IEEE Transactions on Information Theory

Abstract: We study the problem of communicating a distributed correlated memoryless source over a memoryless network, from source nodes to destination nodes, under quadratic distortion constraints. We establish the following two complementary results: (a) for an arbitrary memoryless network, among all distributed memoryless sources of a given correlation, Gaussian sources are least compressible, that is, they admit the smallest set of achievable distortion tuples, and (b) for any memoryless source to be communicated over a memoryless additive-noise network, among all noise processes of a given correlation, Gaussian noise admits the smallest achievable set of distortion tuples. We establish these results constructively by showing how schemes for the corresponding Gaussian problems can be applied to achieve similar performance for (source or noise) distributions that are not necessarily Gaussian but have the same covariance.
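The worst-case role of the Gaussian distribution in results like (a) is closely tied to its standard extremal property: among all distributions with a fixed variance, the Gaussian maximizes differential entropy. As a minimal numerical sketch (an illustration of that property, not code from the paper), one can compare the differential entropy of a Gaussian against a uniform distribution matched to the same variance:

```python
import math

def gaussian_entropy(sigma2):
    # Differential entropy (in nats) of a Gaussian with variance sigma2:
    # h = 0.5 * ln(2 * pi * e * sigma2)
    return 0.5 * math.log(2 * math.pi * math.e * sigma2)

def uniform_entropy_matched(sigma2):
    # Uniform[-a, a] has entropy ln(2a) and variance a^2 / 3,
    # so matching a target variance sigma2 requires a = sqrt(3 * sigma2).
    a = math.sqrt(3 * sigma2)
    return math.log(2 * a)

sigma2 = 1.0
h_gauss = gaussian_entropy(sigma2)        # ~1.4189 nats
h_unif = uniform_entropy_matched(sigma2)  # ~1.2425 nats
assert h_gauss > h_unif  # Gaussian maximizes entropy at fixed variance
```

The same comparison holds for any variance, since both entropies shift by the same additive constant under scaling; the paper's contribution is to lift this pointwise extremality to full achievable-distortion regions over networks.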


Full work available at URL: https://arxiv.org/abs/1304.1828







Cited In (1)





