
Entropy and its many avatars. (English) Zbl 1357.94050

Summary: Entropy was first introduced in 1865 by Rudolf Clausius in his study of the connection between work and heat. A mathematical definition was given by Boltzmann as the logarithm of the number of microstates corresponding to a macrostate. Entropy plays important roles in statistical mechanics, in the theory of large deviations in probability, as an invariant in ergodic theory, and as a useful tool in communication theory. This article explores some of the connections between these different contexts.
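For orientation, the classical formulas behind these definitions (standard, though not written out in the summary) are Boltzmann's

\[ S = k_B \log W, \]

where \(W\) is the number of microstates compatible with a given macrostate and \(k_B\) is Boltzmann's constant, and Shannon's entropy of a discrete probability distribution \(p = (p_1, \dots, p_n)\),

\[ H(p) = -\sum_{i=1}^{n} p_i \log p_i, \]

which is the quantity that reappears as the Kolmogorov-Sinai invariant in ergodic theory and, in the form of the relative entropy \(H(q \mid p) = \sum_i q_i \log (q_i/p_i)\) of Sanov's theorem, as the rate function in large deviation theory.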

MSC:

94A17 Measures of information, entropy
37A35 Entropy and other invariants, isomorphism, classification in ergodic theory
74A15 Thermodynamics in solid mechanics
60F10 Large deviations

References:

[1] J. Aczél and Z. Daróczy, On Measures of Information and Their Characterizations, Academic Press, New York, 1975.
[2] L. Boltzmann, Über die Mechanische Bedeutung des Zweiten Hauptsatzes der Wärmetheorie, Wiener Berichte, 53 (1866), 195-220.
[3] R. Clausius, Théorie mécanique de la chaleur, 1ère partie, Paris: Lacroix, 1868.
[4] H. Cramér, On a new limit theorem in the theory of probability, Colloquium on the Theory of Probability, Hermann, Paris, 1937. · JFM 53.0063.05
[5] J. D. Deuschel and D. W. Stroock, Large deviations, Pure and Appl. Math., 137 , Academic Press, Inc., Boston, MA, 1989, xiv+307 pp. · Zbl 0705.60029
[6] M. D. Donsker and S. R. S. Varadhan, Asymptotic evaluation of certain Markov process expectations for large time, IV, Comm. Pure Appl. Math., 36 (1983), 183-212. · Zbl 0512.60068 · doi:10.1002/cpa.3160360204
[7] A. Feinstein, A new basic theorem of information theory, IRE Trans. Information Theory PGIT-4 (1954), 2-22.
[8] L. Gross, Logarithmic Sobolev inequalities, Amer. J. Math., 97 (1975), 1061-1083. · Zbl 0318.46049 · doi:10.2307/2373688
[9] M. Z. Guo, G. C. Papanicolaou and S. R. S. Varadhan, Nonlinear diffusion limit for a system with nearest neighbor interactions, Comm. Math. Phys., 118 (1988), 31-59. · Zbl 0652.60107 · doi:10.1007/BF01218476
[10] A. I. Khinchin, On the fundamental theorems of information theory, Translated by Morris D. Friedman, 572 California St., Newtonville MA 02460, 1956, 84 pp. · Zbl 0075.14202
[11] A. N. Kolmogorov, A new metric invariant of transitive dynamical systems and automorphisms of Lebesgue spaces, (Russian) Topology, ordinary differential equations, dynamical systems, Trudy Mat. Inst. Steklov., 169 (1985), 94-98, 254.
[12] O. Lanford, Entropy and equilibrium states in classical statistical mechanics, Statistical Mechanics and Mathematical Problems, Lecture Notes in Physics, 20 , Springer-Verlag, Berlin and New York, 1971, 1-113.
[13] D. S. Ornstein, Ergodic theory, randomness, and dynamical systems, James K. Whittemore Lectures in Mathematics given at Yale University, Yale Mathematical Monographs, No. 5, Yale University Press, New Haven, Conn.-London, 1974, vii+141 pp. · Zbl 0296.28016
[14] I. N. Sanov, On the probability of large deviations of random magnitudes, (Russian) Mat. Sb. (N. S.), 42 (84) (1957), 11-44.
[15] C. E. Shannon, A mathematical theory of communication, Bell System Tech. J., 27 (1948), 379-423, 623-656. · Zbl 1154.94303 · doi:10.1002/j.1538-7305.1948.tb01338.x
[16] Y. G. Sinai, On a weak isomorphism of transformations with invariant measure, (Russian) Mat. Sb. (N.S.), 63 (105) (1964), 23-42.
[17] H. T. Yau, Relative entropy and hydrodynamics of Ginzburg-Landau models, Lett. Math. Phys., 22 (1991), 63-80. · Zbl 0725.60120 · doi:10.1007/BF00400379
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases, these data have been complemented or enhanced with data from zbMATH Open. The list attempts to reflect the references in the original paper as accurately as possible, without claiming completeness or a perfect match.