The Log-Volume of Optimal Codes for Memoryless Channels, Asymptotically Within a Few Nats
From MaRDI portal
Publication: 5280896
DOI: 10.1109/TIT.2016.2643681
zbMATH Open: 1366.94202
arXiv: 1311.0181
OpenAlex: W2962912704
MaRDI QID: Q5280896
FDO: Q5280896
Authors: Pierre Moulin
Publication date: 27 July 2017
Published in: IEEE Transactions on Information Theory
Abstract: Shannon's analysis of the fundamental capacity limits for memoryless communication channels has been refined over time. In this paper, the maximum volume \(M^*_{\mathrm{avg}}(n,\epsilon)\) of length-\(n\) codes subject to an average decoding error probability \(\epsilon\) is shown to satisfy the following tight asymptotic lower and upper bounds as \(n \to \infty\): \[ \underline{A}_\epsilon + o(1) \le \log M^*_{\mathrm{avg}}(n,\epsilon) - \left[nC - \sqrt{nV_\epsilon}\, Q^{-1}(\epsilon) + \tfrac{1}{2}\log n\right] \le \overline{A}_\epsilon + o(1) \] where \(C\) is the Shannon capacity, \(V_\epsilon\) the \(\epsilon\)-channel dispersion (or second-order coding rate), \(Q\) the tail probability of the standard normal distribution, and the constants \(\underline{A}_\epsilon\) and \(\overline{A}_\epsilon\) are explicitly identified. This expression holds under mild regularity assumptions on the channel, including nonsingularity. The gap \(\overline{A}_\epsilon - \underline{A}_\epsilon\) is one nat for weakly symmetric channels in the Cover-Thomas sense, and typically a few nats for other symmetric channels, for the binary symmetric channel, and for the Z channel. The derivation is based on strong large-deviations analysis and refined central limit asymptotics. A random coding scheme that achieves the lower bound is presented. The codewords are drawn from a capacity-achieving input distribution modified by an \(O(1/\sqrt{n})\) correction term.
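The bracketed term in the bound above is the familiar normal approximation to the optimal log-volume. As an illustrative sketch (not code from the paper), the snippet below evaluates that approximation, \(nC - \sqrt{nV_\epsilon}\,Q^{-1}(\epsilon) + \tfrac{1}{2}\log n\), for a binary symmetric channel with crossover probability p, using the standard closed forms \(C = \log 2 - h(p)\) and \(V = p(1-p)\log^2\frac{1-p}{p}\) in nats; the constants \(\underline{A}_\epsilon, \overline{A}_\epsilon\) from the paper are omitted, and the function names are my own.

```python
from math import log, sqrt
from statistics import NormalDist


def bsc_capacity_dispersion(p):
    """Capacity C (nats) and dispersion V of a BSC with crossover probability p."""
    h = -p * log(p) - (1 - p) * log(1 - p)          # binary entropy in nats
    C = log(2) - h
    V = p * (1 - p) * log((1 - p) / p) ** 2
    return C, V


def log_m_normal_approx(n, eps, p):
    """Normal approximation nC - sqrt(nV) * Q^{-1}(eps) + (1/2) log n (nats).

    The exact log M*_avg(n, eps) differs from this by a bounded constant
    (between the paper's A-underbar and A-overbar), plus o(1).
    """
    C, V = bsc_capacity_dispersion(p)
    q_inv = NormalDist().inv_cdf(1 - eps)           # Q^{-1}(eps) = Phi^{-1}(1 - eps)
    return n * C - sqrt(n * V) * q_inv + 0.5 * log(n)


if __name__ == "__main__":
    # Example: BSC(0.11), blocklength 1000, average error probability 1e-3.
    n, eps, p = 1000, 1e-3, 0.11
    approx = log_m_normal_approx(n, eps, p)
    print(f"log M ~ {approx:.1f} nats, rate ~ {approx / n:.4f} nats/use")
```

Since the second-order term is negative for \(\epsilon < 1/2\), the resulting rate sits strictly below capacity at finite blocklength and approaches it as \(n\) grows.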
Full work available at URL: https://arxiv.org/abs/1311.0181
Measures of information, entropy (94A17) Channel models (including quantum) in information and communication theory (94A40)
Cited In (2)