A measure concentration inequality for contracting Markov chains. (English) Zbl 0856.60072

Summary: The concentration of measure phenomenon in product spaces means the following: if a subset \(A\) of the \(n\)th power of a probability space \(\mathcal X\) does not have too small a probability, then very large probability is concentrated in a small neighborhood of \(A\). The neighborhood is in many cases understood in the sense of Hamming distance, and then measure concentration is known to occur for product probability measures, and also for the distribution of some processes with very fast and uniform decay of memory. Recently, Talagrand introduced another notion of neighborhood of sets for which he proved a similar measure concentration inequality which in many cases allows more efficient applications than the one for a Hamming neighborhood. So far this inequality has only been proved for product distributions. The aim of this paper is to give a new proof of Talagrand’s inequality, which admits an extension to contracting Markov chains. The proof is based on a new asymmetric notion of distance between probability measures, and bounding this distance by informational divergence. As an application, we analyze the bin packing problem for Markov chains.
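The proof strategy sketched above, bounding a distance between probability measures by informational divergence, has the classical Pinsker inequality \(\text{TV}(P,Q) \le \sqrt{D(P\|Q)/2}\) as its prototype (cf. Csiszár and Pinsker in the references). As a minimal numerical illustration, the following Python sketch checks this classical inequality in the Bernoulli case; it is not the asymmetric distance or the Markov-chain inequality of the paper, and the function names are chosen here for illustration only.

```python
import math

def kl_bernoulli(p, q):
    """Informational divergence D(P || Q) in nats between Bernoulli(p) and Bernoulli(q)."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def tv_bernoulli(p, q):
    """Total variation distance between Bernoulli(p) and Bernoulli(q)."""
    return abs(p - q)

# Pinsker's inequality: TV(P, Q) <= sqrt(D(P || Q) / 2), with D measured in nats.
for p, q in [(0.1, 0.2), (0.3, 0.7), (0.5, 0.9)]:
    tv = tv_bernoulli(p, q)
    bound = math.sqrt(kl_bernoulli(p, q) / 2)
    assert tv <= bound, (p, q, tv, bound)
```

The pair \((0.3, 0.7)\) shows the inequality is nearly tight: the total variation distance is \(0.4\) while the divergence bound evaluates to about \(0.412\).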


60J10 Markov chains (discrete-time Markov processes on discrete state spaces)
60E15 Inequalities; stochastic orderings


[1] L. Breiman, Probability, Addison-Wesley, Reading, Massachusetts, 1968.
[2] R. Ahlswede, P. Gács, J. Körner, Bounds on conditional probabilities with applications in multi-user communication, Zeitschrift f. Wahrscheinlichkeitstheorie u. verw. Geb. 34 (1976), 157–177. · Zbl 0349.94038
[3] D. Amir, V. Milman, Unconditional and symmetric sets in \(n\)-dimensional normed spaces, Israel J. of Math. 37 (1980), 3–20. · Zbl 0445.46011
[4] I. Csiszár, Information-type measures of difference of probability distributions and indirect observations, Studia Sci. Math. Hung. 2 (1967), 297–318.
[5] I. Csiszár, J. Körner, Information Theory: Coding Theorems for Discrete Memoryless Systems, Academic Press, Inc., New York, London etc. 1981. · Zbl 0568.94012
[6] S. Kullback, Information Theory and Statistics, John Wiley & Sons, New York, London etc. 1959. · Zbl 0088.10406
[7] K. Marton, A simple proof of the blowing-up lemma, IEEE Trans. on Information Theory IT-32 (1986), 445–446. · Zbl 0594.94003
[8] K. Marton, Bounding \(\bar d\)-distance by informational divergence: a way to prove measure concentration, Annals of Probability, to appear. · Zbl 0865.60017
[9] K. Marton, Measure concentration for a class of random processes, manuscript. · Zbl 0927.60050
[10] C. McDiarmid, On the method of bounded differences, in ”Surveys in Combinatorics” (J. Siemons, ed.), London Mathematical Society Lecture Notes 141, Cambridge University Press, London-New York (1989), 148–188.
[11] V. Milman, The heritage of P. Lévy in geometrical and functional analysis, Astérisque 157–158 (1988), 273–301.
[12] M.S. Pinsker, Information and Information Stability of Random Variables and Processes, Holden-Day, San Francisco, 1964. · Zbl 0125.09202
[13] W. Rhee, On the fluctuations of the stochastic traveling salesperson problem, Math. of Operations Research 16 (1991), 482–489. · Zbl 0751.90080
[14] W. Rhee, A matching problem and subadditive Euclidean functionals, Ann. Applied Probab. 3 (1993), 794–801. · Zbl 0784.60020
[15] W. Rhee, On the fluctuations of simple matching, manuscript (1992).
[16] M. Talagrand, An isoperimetric theorem on the cube and the Khintchine-Kahane inequalities, Proc. Amer. Math. Soc. 104 (1988), 905–909. · Zbl 0691.60015
[17] M. Talagrand, Concentration of measure and isoperimetric inequalities in product spaces, Publications of IHES 81 (1995), 73–205. · Zbl 0864.60013
[18] M. Talagrand, Private communication.