Abstract
In the information theory community, the following “historical” statements are generally well accepted: (1) Hartley put forth his rule twenty years before Shannon; (2) Shannon's formula, as a fundamental tradeoff between transmission rate, bandwidth, and signal-to-noise ratio, came as an unexpected result in 1948; (3) Hartley's rule is inexact, while Shannon's formula is characteristic of the additive white Gaussian noise channel; (4) Hartley's rule is an imprecise relation that does not give the capacity of a communication channel. We show that all four of these statements are somewhat wrong. In fact, a careful calculation shows that “Hartley's rule” coincides with Shannon's formula. We explain this mathematical coincidence by deriving the necessary and sufficient conditions on an additive noise channel such that its capacity is given by Shannon's formula, and we construct a sequence of such channels that makes the link between the uniform (Hartley) and Gaussian (Shannon) channels.
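The coincidence claimed in the abstract can be checked numerically. The sketch below assumes the standard setup usually attributed to Hartley (an assumption on my part; the abstract itself does not spell out the model): M = 1 + A/Δ equiprobable signal levels spaced 2Δ apart in [−A, A], against uniform noise on [−Δ, Δ], with signal and noise powers taken as the variances of those uniform distributions.

```python
# Hedged illustration of the abstract's claim, under assumed definitions:
# Hartley's rule  C = log2(1 + A/Delta)  versus
# Shannon's formula  C = (1/2) log2(1 + P/N).
import math

def hartley_rate(A, Delta):
    """Hartley's rule: log2 of the number of distinguishable levels."""
    return math.log2(1 + A / Delta)

def shannon_formula(A, Delta):
    """Shannon's formula (1/2) log2(1 + SNR), with P and N computed as the
    powers of a uniform discrete signal and uniform noise (assumed model)."""
    M = 1 + A / Delta                       # number of levels spaced 2*Delta in [-A, A]
    P = (M**2 - 1) * (2 * Delta)**2 / 12    # variance of M equispaced equiprobable levels
    N = Delta**2 / 3                        # variance of uniform noise on [-Delta, Delta]
    return 0.5 * math.log2(1 + P / N)

# The two expressions agree exactly: 1 + P/N = (1 + A/Delta)^2.
for A, Delta in [(1.0, 0.25), (3.0, 0.5), (10.0, 1.0)]:
    assert math.isclose(hartley_rate(A, Delta), shannon_formula(A, Delta))
```

Algebraically, P/N = M² − 1, so ½ log₂(1 + P/N) = log₂ M = log₂(1 + A/Δ), which is the identity behind the "mathematical coincidence" the paper investigates.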
| Original language | English |
|---|---|
| Pages (from-to) | 4892-4910 |
| Number of pages | 19 |
| Journal | Entropy |
| Volume | 16 |
| Issue number | 9 |
| DOIs | |
| Publication status | Published - 1 Jan 2014 |
| Externally published | Yes |
Keywords
- Additive noise channel
- Additive white Gaussian noise (AWGN) channel
- Central limit theorem
- Channel capacity
- Characteristic function
- Differential entropy
- Hartley's rule
- Pulse-amplitude modulation (PAM)
- Shannon's formula
- Signal-to-noise ratio
- Uniform B-spline function
- Uniform noise channel
- Uniform sum distribution