TY - GEN
T1 - Shannon's formula and Hartley's rule
T2 - 34th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering, MaxEnt 2014
AU - Rioul, Olivier
AU - Magossi, José Carlos
N1 - Publisher Copyright:
© 2015 AIP Publishing LLC.
PY - 2015/1/1
Y1 - 2015/1/1
N2 - Shannon's formula C = (1/2) log(1+P/N) is the emblematic expression for the information capacity of a communication channel. Hartley's name is often associated with it, owing to Hartley's rule: counting the highest possible number of distinguishable values for a given amplitude A and precision ±Δ yields a similar expression C′ = log(1+A/Δ). In the information theory community, the following "historical" statements are generally well accepted: (1) Hartley put forth his rule twenty years before Shannon; (2) Shannon's formula as a fundamental tradeoff between transmission rate, bandwidth, and signal-to-noise ratio came unexpectedly in 1948; (3) Hartley's rule is an imprecise relation while Shannon's formula is exact; (4) Hartley's expression is not an appropriate formula for the capacity of a communication channel. We show that all four of these statements are questionable, if not wrong.
AB - Shannon's formula C = (1/2) log(1+P/N) is the emblematic expression for the information capacity of a communication channel. Hartley's name is often associated with it, owing to Hartley's rule: counting the highest possible number of distinguishable values for a given amplitude A and precision ±Δ yields a similar expression C′ = log(1+A/Δ). In the information theory community, the following "historical" statements are generally well accepted: (1) Hartley put forth his rule twenty years before Shannon; (2) Shannon's formula as a fundamental tradeoff between transmission rate, bandwidth, and signal-to-noise ratio came unexpectedly in 1948; (3) Hartley's rule is an imprecise relation while Shannon's formula is exact; (4) Hartley's expression is not an appropriate formula for the capacity of a communication channel. We show that all four of these statements are questionable, if not wrong.
KW - Hartley's rule
KW - Shannon's formula
KW - additive noise channel
KW - additive white Gaussian noise (AWGN)
KW - channel capacity
KW - differential entropy
KW - signal-to-noise ratio
KW - uniform noise channel
U2 - 10.1063/1.4905969
DO - 10.1063/1.4905969
M3 - Conference contribution
AN - SCOPUS:85063848769
T3 - AIP Conference Proceedings
SP - 105
EP - 112
BT - Bayesian Inference and Maximum Entropy Methods in Science and Engineering, MaxEnt 2014
A2 - Mohammad-Djafari, Ali
A2 - Barbaresco, Frederic
PB - American Institute of Physics Inc.
Y2 - 21 September 2014 through 26 September 2014
ER -