
Concentration theorems for entropy and free energy. (English) Zbl 1090.94011
Probl. Inf. Transm. 41, No. 2, 134-149 (2005); translation from Probl. Peredachi Inf. 2005, No. 2, 72-88 (2005).
Summary: Jaynes’s entropy concentration theorem states that, for most words \(\omega_1 \cdots \omega_N\) of length \(N\) such that \(\sum_{i=1}^{N} f(\omega_i) \approx vN\), the empirical frequencies of the values of a function \(f\) are close to the probabilities that maximize the Shannon entropy given a value \(v\) of the mathematical expectation of \(f\). Using the notion of algorithmic entropy, we define entropies for the Bose and Fermi statistical models of unordered data. New variants of Jaynes’s concentration theorem for these models are proved. We also present some concentration properties of the free energy in the case of a nonisolated isothermal system. Exact relations for the algorithmic entropy and free energy at extreme points are obtained. These relations are used to derive tight bounds on fluctuations of energy levels at equilibrium points.
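
For orientation, the maximizing probabilities mentioned above are given by the classical maximum-entropy (Gibbs) solution; the following sketch uses notation not taken from the paper, with \(a_1, \dots, a_M\) denoting the possible values of the letters \(\omega_i\) and \(p_j\) the probability assigned to \(a_j\):

\[
\max_{p_1,\dots,p_M} \Bigl( -\sum_{j=1}^{M} p_j \log p_j \Bigr)
\quad \text{subject to} \quad
\sum_{j=1}^{M} p_j = 1, \qquad \sum_{j=1}^{M} p_j f(a_j) = v,
\]

which, by the Lagrange multiplier method, is attained at

\[
p_j^{*} = \frac{e^{-\beta f(a_j)}}{Z(\beta)}, \qquad
Z(\beta) = \sum_{j=1}^{M} e^{-\beta f(a_j)},
\]

where the multiplier \(\beta\) is chosen so that \(\sum_j p_j^{*} f(a_j) = v\). The concentration statement then says that, for most words satisfying the constraint, the empirical frequency of each value \(a_j\) is close to \(p_j^{*}\).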

MSC:
94A17 Measures of information, entropy
82B03 Foundations of equilibrium statistical mechanics
References:
[1] Jaynes, E.T., Papers on Probability, Statistics, and Statistical Physics, Dordrecht: Kluwer, 1989. · Zbl 0687.60085
[2] Cover, T.M. and Thomas, J.A., Elements of Information Theory, New York: Wiley, 1991. · Zbl 0762.94001
[3] Li, M. and Vitanyi, P., An Introduction to Kolmogorov Complexity and Its Applications, New York: Springer, 1997, 2nd ed.
[4] Stratonovich, R.L., Teoriya informatsii (Information Theory), Moscow: Sov. Radio, 1975.
[5] Landau, L.D. and Lifshitz, E.M., Statisticheskaya fizika, Part 1, Moscow: Nauka, 1976. Translated under the title Statistical Physics, vol. 1, Oxford, New York: Pergamon, 1980.
[6] Jaynes, E.T., How Should We Use Entropy in Economics? (Some Half-Baked Ideas in Need of Criticism), unpublished manuscript. Available from http://bayes.wustl.edu/etj/articles/entropy.in.economics.pdf.
[7] Maslov, V.P., Integral Equations and Phase Transitions in Probability Games. Analogy with Statistical Physics, Teor. Veroyatn. Primen., 2003, vol. 48, no. 2, pp. 403–410 [Theory Probab. Appl. (Engl. Transl.), 2003, vol. 48, no. 2, pp. 359–367].
[8] Kolmogorov, A.N., Three Approaches to the Quantitative Definition of Information, Probl. Peredachi Inf., 1965, vol. 1, no. 1, pp. 3–11 [Probl. Inf. Trans. (Engl. Transl.), 1965, vol. 1, no. 1, pp. 1–7]. · Zbl 0271.94018
[9] Kolmogorov, A.N., The Logical Basis for Information Theory and Probability Theory, IEEE Trans. Inform. Theory, 1968, vol. 14, no. 3, pp. 662–664. · Zbl 0167.47601
[10] Zurek, W.H., Algorithmic Randomness and Physical Entropy, Phys. Rev. A, 1989, vol. 40, no. 8, pp. 4731–4751.
[11] Rissanen, J., Minimum Description Length Principle, Encyclopaedia of Statistical Sciences, vol. 5, Kotz, S. and Johnson, N.L., Eds., New York: Wiley, 1986, pp. 523–527.
[12] Gacs, P., Tromp, J., and Vitanyi, P., Algorithmic Statistics, IEEE Trans. Inform. Theory, 2001, vol. 47, no. 6, pp. 2443–2463. · Zbl 1021.94004
[13] V’yugin, V.V. and Maslov, V.P., Extremal Relations between Additive Loss Functions and the Kolmogorov Complexity, Probl. Peredachi Inf., 2003, vol. 39, no. 4, pp. 71–87 [Probl. Inf. Trans. (Engl. Transl.), 2003, vol. 39, no. 4, pp. 380–394].
[14] Bogolyubov, N.N., Energy Levels of the Non-Ideal Bose-Einstein Gas, Vestnik Moskov. Univ., 1947, vol. 7, pp. 43–56.
[15] Uspensky, V.A., Semenov, A.L., and Shen’, A.Kh., Can an Individual Sequence of Zeros and Ones Be Random?, Uspekhi Mat. Nauk, 1990, vol. 45, no. 1, pp. 105–162 [Russian Math. Surveys (Engl. Transl.), 1990, vol. 45, no. 1, pp. 121–189].
[16] Kolmogorov, A.N. and Uspensky, V.A., Algorithms and Randomness, Teor. Veroyatn. Primen., 1987, vol. 32, no. 3, pp. 425–455 [Theory Probab. Appl. (Engl. Transl.), 1987, vol. 32, no. 3, pp. 389–412].
[17] V’yugin, V.V., Algorithmic Complexity and Stochastic Properties of Finite Binary Sequences, Computer J., 1999, vol. 42, no. 4, pp. 294–317. · Zbl 0937.68062
[18] Kolmogorov, A.N., Combinatorial Foundations of Information Theory and the Calculus of Probabilities, Uspekhi Mat. Nauk, 1983, vol. 38, no. 4, pp. 27–36 [Russian Math. Surveys (Engl. Transl.), 1983, vol. 38, no. 4, pp. 29–40]. · Zbl 0597.60002