Bennett, M. A. (ed.) et al., Number theory for the millennium II. Proceedings of the millennial conference on number theory, Urbana-Champaign, IL, USA, May 21–26, 2000. Natick, MA: A K Peters (ISBN 1-56881-146-2/hbk). 149–166 (2002).

The paper is a very interesting survey of the one-hundred-year development of the theory of normal numbers. "Normal" is meant in the sense of Borel: a number $\alpha \in (0,1)$ is said to be normal if, for every base $q>1$, every block of $n$ digits appears in its $q$-adic expansion with the same asymptotic relative frequency $q^{-n}$. A classical result of Borel from the year 1909 states that all $\alpha$, with the exception of a set of measure 0, are normal. The paper starts by recalling Cantor's proof that the continuum is uncountable; it then presents Borel's result just mentioned, which is equivalent to the strong law of large numbers. Applications to the theory of Diophantine approximation follow, including results of Khintchine from the 1920s and 1930s. Although almost all $\alpha$ are normal, it was not easy to exhibit a single normal number. This was first done by Champernowne, who showed that the number obtained by concatenating the positive integers, written in the decimal system, $\alpha = 0.123456789101112\cdots$, is normal. Copeland and Erdős, and Davenport and Erdős, exhibited further classes of normal numbers. Several problems connected with probability theory are mentioned. The difficulties that had to be overcome arise from the fact that the random variables that occur are not independent. A few unsolved problems are mentioned as challenges.
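As a small numerical illustration of Borel's definition (not part of the paper under review), the following Python sketch builds a prefix of Champernowne's number and tabulates the empirical relative frequencies of its digit blocks; for a normal number these frequencies tend to $q^{-n}$ (here $q=10$), although the convergence for Champernowne's number is known to be slow, so a finite prefix only approximates the limit.

```python
from collections import Counter
from itertools import count

def champernowne_digits(num_digits):
    """First num_digits decimal digits of Champernowne's number 0.123456789101112..."""
    digits = []
    for k in count(1):
        digits.extend(str(k))
        if len(digits) >= num_digits:
            return "".join(digits[:num_digits])

def block_frequencies(digit_string, n):
    """Relative frequency of each n-digit block over all overlapping windows."""
    windows = len(digit_string) - n + 1
    counts = Counter(digit_string[i:i + n] for i in range(windows))
    return {block: c / windows for block, c in counts.items()}

# Single-digit (n = 1) frequencies in a 100,000-digit prefix; the normality
# of Champernowne's number says these tend to 10**-1 = 0.1 as the prefix grows.
freqs = block_frequencies(champernowne_digits(100_000), 1)
```

In a finite prefix the frequencies are visibly skewed (the prefix ends in the middle of a range of leading digits), which hints at why proving normality for explicit numbers is delicate.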