On information rates for mismatched decoders
DOI: 10.1109/18.340469
zbMATH Open: 0826.94017
OpenAlex: W2154433856
MaRDI QID: Q4850296
Authors: Neri Merhav, Gideon Kaplan, Amos Lapidoth, S. Shamai
Publication date: 29 November 1995
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://semanticscholar.org/paper/42a2d21000717ce7285221a3fa586831222da897
Recommendations
- Information-Theoretic Foundations of Mismatched Decoding
- A converse coding theorem for mismatched decoding at the output of binary-input memoryless channels
- Mismatched decoding revisited: General alphabets, channels with memory, and the wide-band limit
- The coding capacity of mismatched Gaussian channels (Corresp.)
Keywords: exponential families; sphere packing; generalized mutual information; random coding; mismatched decoding; generalized cutoff rate
MSC classifications: Statistical aspects of information-theoretic topics (62B10); Information theory (general) (94A15); Coding theorems (Shannon theory) (94A24); Channel models (including quantum) in information and communication theory (94A40)
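Among the keywords, the generalized mutual information (GMI) is the achievable-rate functional central to mismatched decoding: for an input distribution P, channel W, and decoding metric q, one evaluates E[log(q(X,Y)^s / Σ_{x'} P(x') q(x',Y)^s)] and maximizes over s ≥ 0. A minimal numerical sketch, assuming finite alphabets and a grid search over s (the function name, arrays, and grid are illustrative, not from the paper):

```python
import numpy as np

def gmi(P, W, q, s_grid=None):
    """Generalized mutual information (in nats) for input distribution P (|X|,),
    DMC transition matrix W[x, y], and positive decoding metric q[x, y],
    maximized over the parameter s on a grid (s = 1 is always included)."""
    if s_grid is None:
        s_grid = np.concatenate([np.linspace(0.05, 3.0, 60), [1.0]])
    joint = P[:, None] * W                       # joint law P(x) W(y|x)
    best = -np.inf
    for s in s_grid:
        log_qs = s * np.log(q)                   # log of q(x, y)^s
        denom = P @ q ** s                       # sum_x' P(x') q(x', y)^s, per y
        val = np.sum(joint * (log_qs - np.log(denom)[None, :]))
        best = max(best, val)
    return best

# Matched metric (q = W): the maximum is attained at s = 1 and recovers
# the mutual information of the channel.
P = np.array([0.5, 0.5])
W = np.array([[0.9, 0.1], [0.1, 0.9]])           # BSC, crossover 0.1
print(gmi(P, W, W))                              # I(X;Y) = ln 2 - H_b(0.1) ≈ 0.368 nats
```

With a metric q different from W, the returned value lower-bounds the mutual information, consistent with the GMI being an achievable rate below the matched one.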
Cited In (9)
- Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques
- Mismatched decoding revisited: General alphabets, channels with memory, and the wide-band limit
- Converse Theorems for the DMC With Mismatched Decoding
- Optimal Mutual Information for Coders and Jammers in Mismatched Communication Channels
- On the Information Loss of the Max-Log Approximation in BICM Systems
- A Causal Perspective on the Analysis of Signal and Noise Correlations and Their Role in Population Coding
- Channel capacity for a given decoding metric
- Information-Theoretic Foundations of Mismatched Decoding
- On the relation between encoding and decoding of neuronal spikes