TY - GEN
T1 - A Historical Perspective on Schützenberger-Pinsker Inequalities
AU - Rioul, Olivier
N1 - Publisher Copyright:
© 2023, The Author(s), under exclusive license to Springer Nature Switzerland AG.
PY - 2023/1/1
Y1 - 2023/1/1
N2 - This paper presents a tutorial overview of the so-called Pinsker inequalities, which establish a precise relationship between information and statistics and whose use has become ubiquitous in many information-theoretic applications. According to Stigler’s law of eponymy, no scientific discovery is named after its original discoverer. Pinsker’s inequality is no exception: years before the publication of Pinsker’s book in 1960, the French medical doctor, geneticist, epidemiologist, and mathematician Marcel-Paul (Marco) Schützenberger, in his 1953 doctoral thesis, proved not only what is now called Pinsker’s inequality (with the optimal constant that Pinsker himself did not establish) but also the optimal second-order improvement, more than a decade before Kullback’s derivation of the same inequality. We review the contributions of Schützenberger and Pinsker, as well as those of Volkonskii & Rozanov, Sakaguchi, McKean, Csiszár, Kullback, Kemperman, Vajda, Bretagnolle & Huber, Krafft & Schmitz, Toussaint, Reid & Williamson, and Gilardoni, and the optimal derivation of Fedotov, Harremoës, & Topsøe.
AB - This paper presents a tutorial overview of the so-called Pinsker inequalities, which establish a precise relationship between information and statistics and whose use has become ubiquitous in many information-theoretic applications. According to Stigler’s law of eponymy, no scientific discovery is named after its original discoverer. Pinsker’s inequality is no exception: years before the publication of Pinsker’s book in 1960, the French medical doctor, geneticist, epidemiologist, and mathematician Marcel-Paul (Marco) Schützenberger, in his 1953 doctoral thesis, proved not only what is now called Pinsker’s inequality (with the optimal constant that Pinsker himself did not establish) but also the optimal second-order improvement, more than a decade before Kullback’s derivation of the same inequality. We review the contributions of Schützenberger and Pinsker, as well as those of Volkonskii & Rozanov, Sakaguchi, McKean, Csiszár, Kullback, Kemperman, Vajda, Bretagnolle & Huber, Krafft & Schmitz, Toussaint, Reid & Williamson, and Gilardoni, and the optimal derivation of Fedotov, Harremoës, & Topsøe.
KW - Data processing inequality
KW - Kullback-Leibler divergence
KW - Mutual information
KW - Pinsker inequality
KW - Statistical distance
KW - Total variation
U2 - 10.1007/978-3-031-38271-0_29
DO - 10.1007/978-3-031-38271-0_29
M3 - Conference contribution
AN - SCOPUS:85172251582
SN - 9783031382703
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 291
EP - 306
BT - Geometric Science of Information - 6th International Conference, GSI 2023, Proceedings
A2 - Nielsen, Frank
A2 - Barbaresco, Frédéric
PB - Springer Science and Business Media Deutschland GmbH
T2 - The 6th International Conference on Geometric Science of Information, GSI 2023
Y2 - 30 August 2023 through 1 September 2023
ER -