The Dispersion of Nearest-Neighbor Decoding for Additive Non-Gaussian Channels


DOI: 10.1109/TIT.2016.2620161 · zbMATH Open: 1359.94869 · arXiv: 1512.06618 · OpenAlex: W2596284900 · MaRDI QID: Q2979078


Authors: Jonathan Scarlett, Vincent Y. F. Tan, Giuseppe Durisi


Publication date: 2 May 2017

Published in: IEEE Transactions on Information Theory

Abstract: We study the second-order asymptotics of information transmission using random Gaussian codebooks and nearest-neighbor (NN) decoding over a power-limited stationary memoryless additive non-Gaussian noise channel. We show that the dispersion term depends on the non-Gaussian noise only through its second and fourth moments, thus complementing the capacity result (Lapidoth, 1996), which depends only on the second moment. Furthermore, we characterize the second-order asymptotics of point-to-point codes over K-sender interference networks with non-Gaussian additive noise. Specifically, we assume that each user's codebook is Gaussian and that NN decoding is employed, i.e., that interference from the K − 1 unintended users (Gaussian interfering signals) is treated as noise at each decoder. We show that while the first-order term in the asymptotic expansion of the maximum number of messages depends on the powers of the interfering codewords only through their sum, this does not hold for the second-order term.
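As an illustration of the second-order framework the abstract refers to, the sketch below evaluates the normal approximation log M*(n, ε) ≈ nC − √(nV)·Q⁻¹(ε). The capacity expression matches Lapidoth's first-order result (NN decoding over noise of variance σ² achieves the AWGN rate ½log(1 + P/σ²)). Since the paper's non-Gaussian dispersion formula (which involves the noise's second and fourth moments) is not reproduced here, the code plugs in the standard AWGN dispersion purely as a placeholder; all function names are ours, not the paper's.

```python
import math
from statistics import NormalDist

def awgn_capacity(snr):
    """First-order term: C = 0.5 * log(1 + P/sigma^2), in nats per channel use."""
    return 0.5 * math.log(1.0 + snr)

def awgn_dispersion(snr):
    """Placeholder dispersion: the known AWGN value V = snr*(snr+2) / (2*(snr+1)^2).
    The paper's non-Gaussian V also involves the noise's fourth moment."""
    return snr * (snr + 2.0) / (2.0 * (snr + 1.0) ** 2)

def normal_approx_log_m(n, snr, eps):
    """Second-order (normal) approximation to the log of the maximum
    number of messages at blocklength n and error probability eps."""
    q_inv = NormalDist().inv_cdf(1.0 - eps)  # Q^{-1}(eps)
    return n * awgn_capacity(snr) - math.sqrt(n * awgn_dispersion(snr)) * q_inv

# Example: blocklength 1000, SNR = P/sigma^2 = 1 (0 dB), error probability 1e-3.
rate = normal_approx_log_m(1000, 1.0, 1e-3) / 1000  # nats per channel use
```

The √n back-off term is why the second-order analysis matters: at moderate blocklengths the achievable rate sits visibly below capacity, and the size of that gap is governed by the dispersion V.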


Full work available at URL: https://arxiv.org/abs/1512.06618




Cited In (1)





