On information rates for mismatched decoders
From MaRDI portal
Publication:4850296
Recommendations
- Information-Theoretic Foundations of Mismatched Decoding
- A converse coding theorem for mismatched decoding at the output of binary-input memoryless channels
- Mismatched decoding revisited: General alphabets, channels with memory, and the wide-band limit
- The coding capacity of mismatched Gaussian channels (Corresp.)
Description: scientific article; zbMATH DE number 4213904
Cited in (10):
- Mismatched decoding revisited: General alphabets, channels with memory, and the wide-band limit
- Information-Theoretic Foundations of Mismatched Decoding
- A distance between channels: the average error of mismatched channels
- Optimal Mutual Information for Coders and Jammers in Mismatched Communication Channels
- Channel capacity for a given decoding metric
- Converse Theorems for the DMC With Mismatched Decoding
- Tight Data-Robust Bounds to Mutual Information Combining Shuffling and Model Selection Techniques
- On the relation between encoding and decoding of neuronal spikes
- A causal perspective on the analysis of signal and noise correlations and their role in population coding
- On the Information Loss of the Max-Log Approximation in BICM Systems