Network Coding Capacity Regions via Entropy Functions


DOI: 10.1109/TIT.2014.2334291
zbMATH Open: 1360.94194
arXiv: 1201.1062
OpenAlex: W2963283980
MaRDI QID: Q2986152
FDO: Q2986152

Alex Grant, Terence H. Chan

Publication date: 16 May 2017

Published in: IEEE Transactions on Information Theory

Abstract: In this paper, we use entropy functions to characterise the set of rate-capacity tuples achievable with either zero decoding error, or vanishing decoding error, for general network coding problems. We show that when sources are colocated, the outer bound obtained by Yeung, A First Course in Information Theory, Section 15.5 (2002) is tight and the sets of zero-error achievable and vanishing-error achievable rate-capacity tuples are the same. We also characterise the set of zero-error and vanishing-error achievable rate-capacity tuples for network coding problems subject to linear encoding constraints, routing constraints (where some or all nodes can only perform routing) and secrecy constraints. Finally, we show that even for apparently simple networks, design of optimal codes may be difficult. In particular, we prove that for the incremental multicast problem and for the single-source secure network coding problem, characterisation of the achievable set is very hard and linear network codes may not be optimal.
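
A rough sketch of the general shape of such entropy-function characterisations (illustrative notation only; the precise constraint sets, closures, and projections are as defined in the paper and in Yeung (2002)): outer bounds of this type intersect the closure of the region of entropic vectors with linear constraints expressing source independence, encoding/link-capacity constraints, and decoding requirements, and then project onto the rate-capacity coordinates,

\[
  \mathcal{R}_{\mathrm{out}}
  \;=\;
  \Lambda\!\Bigl(
    \overline{\Gamma^{*}_{N}}
    \,\cap\,
    \mathcal{C}_{\mathrm{ind}}
    \,\cap\,
    \mathcal{C}_{\mathrm{enc}}
    \,\cap\,
    \mathcal{C}_{\mathrm{dec}}
  \Bigr),
\]

where \(\overline{\Gamma^{*}_{N}}\) is the closure of the set of entropy functions of the \(N\) random variables associated with the network, the \(\mathcal{C}\) sets are the constraint sets just listed, and \(\Lambda\) projects onto rate-capacity tuples. The symbols \(\mathcal{C}_{\mathrm{ind}}\), \(\mathcal{C}_{\mathrm{enc}}\), \(\mathcal{C}_{\mathrm{dec}}\) and \(\Lambda\) are placeholders for this sketch, not necessarily the paper's own notation.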


Full work available at URL: https://arxiv.org/abs/1201.1062