Rioul, Olivier; Magossi, José Carlos. Shannon's formula and Hartley's rule: a mathematical coincidence? (English) Zbl 1483.62065
In: Mohammad-Djafari, Ali (ed.) et al., Bayesian inference and maximum entropy methods in science and engineering (MaxEnt 2014), Clos Lucé, Amboise, France, September 21–26, 2014. Melville, NY: American Institute of Physics (AIP). AIP Conf. Proc. 1641, 105-112 (2015).

Summary: Shannon's formula \(C = \frac{1}{2} \log(1+P/N)\) is the emblematic expression for the information capacity of a communication channel. Hartley's name is often associated with it, owing to Hartley's rule: counting the highest possible number of distinguishable values for a given amplitude \(A\) and precision \(\pm \Delta\) yields a similar expression \(C' = \log(1+A/\Delta)\). In the information theory community, the following "historical" statements are generally well accepted: (1) Hartley put forth his rule twenty years before Shannon; (2) Shannon's formula, as a fundamental tradeoff between transmission rate, bandwidth, and signal-to-noise ratio, was unexpected in 1948; (3) Hartley's rule is an imprecise relation while Shannon's formula is exact; (4) Hartley's expression is not an appropriate formula for the capacity of a communication channel. We show that all four of these statements are questionable, if not wrong.

For the entire collection see [Zbl 1470.00021].

MSC:
62F15 Bayesian inference
62-03 History of statistics
94A24 Coding theorems (Shannon theory)
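The two expressions in the summary can be compared numerically. The following is a minimal sketch (not from the paper) using base-2 logarithms and hypothetical sample values \(P/N = 15\) and \(A/\Delta = 3\); simple algebra shows the two expressions agree exactly whenever \(1 + A/\Delta = \sqrt{1 + P/N}\):

```python
import math

def shannon_capacity(snr: float) -> float:
    """Shannon's formula: C = (1/2) * log2(1 + P/N), in bits per sample."""
    return 0.5 * math.log2(1.0 + snr)

def hartley_rule(amplitude_ratio: float) -> float:
    """Hartley's rule: C' = log2(1 + A/Delta), counting distinguishable levels."""
    return math.log2(1.0 + amplitude_ratio)

# Hypothetical example values: P/N = 15 and A/Delta = 3 satisfy
# 1 + A/Delta = sqrt(1 + P/N), so the two expressions coincide here.
print(shannon_capacity(15.0))  # 2.0 bits
print(hartley_rule(3.0))       # 2.0 bits
```

The function names and chosen values are illustrative only; the coincidence of the two formulas under the stated condition is what motivates the paper's question.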