A general formula for channel capacity
DOI: 10.1109/18.335960 · zbMATH Open: 0819.94016 · OpenAlex: W2020347709 · MaRDI QID: Q4324176 · FDO: Q4324176
Publication date: 1 March 1995
Published in: IEEE Transactions on Information Theory
Full work available at URL: https://doi.org/10.1109/18.335960
Keywords: error probability; channel capacity; Shannon theory; strong converse; channel coding theorem; channels with memory; \(m\)-ary hypothesis tests
MSC classification: Parametric hypothesis testing (62F03); Coding theorems (Shannon theory) (94A24); Channel models (including quantum) in information and communication theory (94A40)
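For context, the paper's central result can be sketched in standard information-spectrum notation (the symbols below follow common usage and are not quoted from this record): the capacity of a general channel, with no assumptions of memorylessness, stationarity, or ergodicity, is

\[
C \;=\; \sup_{\mathbf{X}} \, \underline{I}(\mathbf{X};\mathbf{Y}),
\qquad
\underline{I}(\mathbf{X};\mathbf{Y}) \;=\; \operatorname*{p\text{-}liminf}_{n \to \infty} \; \frac{1}{n} \log \frac{P_{Y^n \mid X^n}(Y^n \mid X^n)}{P_{Y^n}(Y^n)},
\]

where the supremum is over all input processes \(\mathbf{X} = \{X^n\}\) and \(\underline{I}\) is the inf-information rate, i.e. the limit inferior in probability of the normalized information density.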
Cited In (17)
- Asymptotic convertibility of entanglement: An information-spectrum approach to entanglement concentration and dilution
- Three-terminal communication channels
- The weak capacity of averaged channels
- Extension of some results for channel capacity using a generalized information measure
- Design of Information Channels for Optimization and Stabilization in Networked Control
- Distributions and channel capacities in generalized statistical mechanics
- Title not available
- Universal coding for classical-quantum channel
- Efficient information transfer by Poisson neurons
- A General Formula for the Mismatch Capacity
- New viewpoint on communication channel capabilities
- Second-order asymptotics for quantum hypothesis testing
- Title not available
- Second-order converses via reverse hypercontractivity
- The law of large numbers for the capacity of memoryless channels with a random transition matrix
- Common Information, Noise Stability, and Their Extensions
- A general formula for the capacity of stationary nonanticipatory channels