Sparse regression codes


DOI: 10.1561/0100000092
zbMATH Open: 1429.94047
arXiv: 1911.00771
OpenAlex: W2951974280
MaRDI QID: Q5228469


Authors: Ramji Venkataramanan, Sekhar Tatikonda, Andrew R. Barron


Publication date: 12 August 2019

Published in: Foundations and Trends™ in Communications and Information Theory

Abstract: Developing computationally efficient codes that approach the Shannon-theoretic limits for communication and compression has long been one of the major goals of information and coding theory. There have been significant advances towards this goal in the last couple of decades, with the emergence of turbo codes, sparse-graph codes, and polar codes. These codes are designed primarily for discrete-alphabet channels and sources. For Gaussian channels and sources, where the alphabet is inherently continuous, Sparse Superposition Codes or Sparse Regression Codes (SPARCs) are a promising class of codes for achieving the Shannon limits. This survey provides a unified and comprehensive overview of sparse regression codes, covering theory, algorithms, and practical implementation aspects. The first part of the monograph focuses on SPARCs for AWGN channel coding, and the second part on SPARCs for lossy compression (with a squared-error distortion criterion). In the third part, SPARCs are used to construct codes for Gaussian multi-terminal channel and source coding models such as broadcast channels, multiple-access channels, and source and channel coding with side information. The survey concludes with a discussion of open problems and directions for future work.


Full work available at URL: https://arxiv.org/abs/1911.00771
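The construction the abstract refers to is simple enough to sketch. A SPARC codeword is x = Aβ, where A is an n × ML Gaussian design matrix and the message is carried by β, which has exactly one nonzero entry in each of L sections of M columns, so the rate is R = L·log2(M)/n. The Python sketch below is a hand-rolled illustration of this idea, not code from the monograph; all parameter values are arbitrary choices for the example, and the greedy one-shot decoder is only a stand-in for the efficient iterative decoders (adaptive successive decoding, approximate message passing) that the survey actually covers.

import numpy as np

# Minimal SPARC encoding sketch over an AWGN channel.
# L, M, R, P, sigma are illustrative values, not taken from the monograph.
L, M = 32, 64                    # L sections, M columns per section
R = 0.5                          # rate in bits per channel use
n = int(L * np.log2(M) / R)      # codeword length, since R = L*log2(M)/n
P = 2.0                          # average codeword power constraint
sigma = 1.0                      # AWGN noise standard deviation

rng = np.random.default_rng(0)
A = rng.normal(0.0, 1.0 / np.sqrt(n), size=(n, L * M))  # i.i.d. Gaussian design matrix

msg = rng.integers(0, M, size=L)  # message: one column index per section

# beta has exactly one nonzero entry per section; the flat power
# allocation sqrt(n*P/L) makes E[||A @ beta||^2] approximately n*P.
beta = np.zeros(L * M)
beta[np.arange(L) * M + msg] = np.sqrt(n * P / L)

x = A @ beta                          # SPARC codeword
y = x + sigma * rng.normal(size=n)    # AWGN channel output

# Naive section-wise decoder: pick the most correlated column in each
# section. Undecoded sections act as extra noise, so this greedy pass
# makes section errors that the survey's iterative decoders avoid.
corr = A.T @ y
msg_hat = np.array([np.argmax(corr[l * M:(l + 1) * M]) for l in range(L)])
print("section errors:", np.count_nonzero(msg_hat != msg), "of", L)

At nontrivial rates this single maximum-correlation pass is unreliable because the L − 1 other sections contribute interference of total power roughly P; the decoders analyzed in the monograph iterate, progressively cancelling decoded sections, which is what lets SPARCs approach the AWGN capacity.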



