On Shannon's formula and Hartley's rule: Beyond the mathematical coincidence

Olivier Rioul, José Carlos Magossi

Research output: Contribution to journal › Article › peer-review

Abstract

In the information theory community, the following “historical” statements are generally well accepted: (1) Hartley put forth his rule twenty years before Shannon; (2) Shannon's formula, as a fundamental tradeoff between transmission rate, bandwidth, and signal-to-noise ratio, came as an unexpected result in 1948; (3) Hartley's rule is inexact, while Shannon's formula is characteristic of the additive white Gaussian noise channel; (4) Hartley's rule is an imprecise relation that is not an appropriate formula for the capacity of a communication channel. We show that each of these four statements is, to some extent, wrong. A careful calculation shows that “Hartley's rule” in fact coincides with Shannon's formula. We explain this mathematical coincidence by deriving the necessary and sufficient conditions on an additive noise channel for its capacity to be given by Shannon's formula, and we construct a sequence of such channels linking the uniform (Hartley) and Gaussian (Shannon) channels.
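As an illustrative sketch of the coincidence the abstract refers to (notation assumed here, not taken from the paper): Hartley's rule counts amplitude levels spaced by the noise amplitude, Shannon's formula is the AWGN capacity per sample, and the two expressions agree under a particular definition of the signal-to-noise ratio, by elementary algebra.

```latex
% Hartley's rule: amplitudes confined to [-A, A], noise of amplitude \Delta,
% hence M = 1 + A/\Delta distinguishable levels per symbol:
C' = \log_2 M = \log_2\!\left(1 + \frac{A}{\Delta}\right)
\quad \text{bits per symbol.}

% Shannon's formula for the AWGN channel, signal power P, noise power N:
C = \tfrac{1}{2}\log_2\!\left(1 + \frac{P}{N}\right)
\quad \text{bits per sample.}

% The "coincidence": squaring inside the logarithm,
\log_2\!\left(1 + \frac{A}{\Delta}\right)
= \tfrac{1}{2}\log_2\!\left(1 + \frac{A}{\Delta}\right)^{\!2}
= \tfrac{1}{2}\log_2\!\left(1 + \mathrm{SNR}\right),
\qquad \mathrm{SNR} := \frac{2A}{\Delta} + \frac{A^2}{\Delta^2},
% so Hartley's rule takes Shannon's form once the SNR is defined this way.
```

The paper's contribution goes beyond this identity: it characterizes exactly which additive noise channels have capacity of the Shannon form.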

Original language: English
Pages (from-to): 4892-4910
Number of pages: 19
Journal: Entropy
Volume: 16
Issue number: 9
DOIs
Publication status: Published - 1 Jan 2014
Externally published: Yes

Keywords

  • Additive noise channel
  • Additive white Gaussian noise (AWGN) channel
  • Central limit theorem
  • Channel capacity
  • Characteristic function
  • Differential entropy
  • Hartley's rule
  • Pulse-amplitude modulation (PAM)
  • Shannon's formula
  • Signal-to-noise ratio
  • Uniform B-spline function
  • Uniform noise channel
  • Uniform sum distribution
