On Shannon's formula and Hartley's rule: beyond the mathematical coincidence (Q296309)

From MaRDI portal
 
Cites work:
    A Mathematical Theory of Communication
    Multiplex signal transmission and the development of sampling techniques: the work of Herbert Raabe in contrast to that of Claude Shannon
    A history of the theory of information
    The early days of information theory
    Claude E. Shannon: a retrospective on his life, work, and impact
    Fifty years of Shannon theory
    Information theory in the fifties
    Information Theoretic Proofs of Entropy Power Inequalities
    On the asymptotic convergence of B-spline wavelets to Gabor functions
    Entropy and the central limit theorem

Latest revision as of 04:36, 12 July 2024

scientific article
Language: English
Label: On Shannon's formula and Hartley's rule: beyond the mathematical coincidence

    Statements

    On Shannon's formula and Hartley's rule: beyond the mathematical coincidence (English)
    15 June 2016
    Summary: In the information theory community, the following "historical" statements are generally well accepted: (1) Hartley put forth his rule twenty years before Shannon; (2) Shannon's formula, as a fundamental tradeoff between transmission rate, bandwidth, and signal-to-noise ratio, came as a surprise in 1948; (3) Hartley's rule is inexact while Shannon's formula is characteristic of the additive white Gaussian noise channel; (4) Hartley's rule is an imprecise relation that is not an appropriate formula for the capacity of a communication channel. We show that each of these four statements is somewhat wrong. In fact, a careful calculation shows that "Hartley's rule" coincides with Shannon's formula. We explain this mathematical coincidence by deriving the necessary and sufficient conditions on an additive noise channel for its capacity to be given by Shannon's formula, and we construct a sequence of such channels that links the uniform (Hartley) channel to the Gaussian (Shannon) channel.
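    The coincidence referred to in the summary can be illustrated numerically. As a sketch (function names and sample values are mine, not from the paper): Hartley-style counting of distinguishable amplitude levels gives log2(1 + A/Δ) bits per sample, while Shannon's formula gives ½ log2(1 + SNR) per channel use; the two agree exactly when SNR = (A/Δ)² + 2(A/Δ), since (1 + A/Δ)² = 1 + SNR.

    ```python
    import math

    def hartley(A, delta):
        """Hartley-style rate: log2 of the number of distinguishable
        amplitude levels, 1 + A/delta (A: signal amplitude, delta: noise amplitude)."""
        return math.log2(1 + A / delta)

    def shannon(snr):
        """Shannon's formula per channel use: (1/2) * log2(1 + SNR)."""
        return 0.5 * math.log2(1 + snr)

    # The two expressions coincide when SNR = (A/delta)^2 + 2*(A/delta),
    # because then 1 + SNR = (1 + A/delta)^2.
    A, delta = 7.0, 1.5
    snr = (A / delta) ** 2 + 2 * (A / delta)
    print(hartley(A, delta), shannon(snr))  # agree to floating-point precision
    ```

    This only demonstrates the algebraic identity; the paper's contribution is characterizing exactly which additive noise channels have a capacity of Shannon's form.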