## Simulated annealing and Boltzmann machines. A stochastic approach to combinatorial optimization and neural computing.
*(English)*
Zbl 0674.90059

Wiley-Interscience Series in Discrete Mathematics and Optimization. Chichester (UK) etc.: Wiley. xii, 272 p. £24.95 (1989).

The book treats the two fashionable topics of its title in a unified way. While simulated annealing was introduced to solve combinatorial optimization problems, the Boltzmann machine is a probabilistic approach to neural computing, which can in turn be used, among other things, to tackle combinatorial optimization problems.

Basically, a combinatorial optimization problem consists in finding the minimum of a function *f* over a finite set *S*. Simulated annealing exploits an analogy with the physical process of annealing and can be viewed as a Markov chain over *S* governed by a slowly decreasing control parameter called, in keeping with the analogy, the temperature. At the beginning, transitions between states of *S* are relatively free, but as the temperature decreases, transitions that increase the value of *f* become less and less probable. The book treats conditions under which the annealing process converges to the optimal solution; however, these conditions make the algorithm prohibitively slow, so any practical implementation will only produce an approximation to the optimum. Such implementations are discussed in the book.

In the second part, Boltzmann machines are introduced as a generalization of simulated annealing to a model akin to neural networks, thereby making the speedup of massive parallelism accessible to simulated annealing. Applications to combinatorial optimization, as well as to learning and pattern recognition, are treated.
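As a rough illustration (not taken from the book), the annealing scheme just described might be sketched in Python as follows; the geometric cooling schedule, the parameter values, and the toy neighbourhood structure are assumptions made for this sketch:

```python
import math
import random

def simulated_annealing(f, neighbor, s0, t0=10.0, alpha=0.95, steps=2000):
    """Approximately minimize f over a finite set via simulated annealing.

    f        -- objective function to minimize
    neighbor -- returns a random neighbor of a state
    s0       -- initial state
    t0       -- initial temperature (assumed geometric cooling by alpha)
    """
    s, t = s0, t0
    best = s
    for _ in range(steps):
        cand = neighbor(s)
        delta = f(cand) - f(s)
        # Metropolis criterion: always accept improvements; accept
        # uphill moves with probability exp(-delta / t), which shrinks
        # as the temperature decreases.
        if delta <= 0 or random.random() < math.exp(-delta / t):
            s = cand
            if f(s) < f(best):
                best = s
        t *= alpha  # slowly decrease the temperature
    return best
```

For example, minimizing f(x) = x^2 over the integers -10, ..., 10 with a random ±1 move as the neighbourhood will typically drive the state toward 0 as the temperature falls, though, as the review notes, only an approximation to the optimum is guaranteed in practice.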

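In the same hedged spirit, a Boltzmann machine used for optimization can be sketched as a network of binary units with symmetric weights, where each unit flips with the Boltzmann acceptance probability as the temperature falls; the weight-matrix convention (diagonal entries as biases), the consensus function, and the cooling parameters below are assumptions of this sketch, not the book's notation:

```python
import math
import random

def boltzmann_machine(w, n, t0=5.0, alpha=0.9, sweeps=60):
    """Stochastically maximize the consensus
    C(s) = sum_{i<j} w[i][j]*s[i]*s[j] + sum_i w[i][i]*s[i]
    over binary states s in {0, 1}^n.

    w -- symmetric weight matrix; w[i][i] acts as the bias of unit i.
    """
    s = [random.randint(0, 1) for _ in range(n)]
    t = t0
    for _ in range(sweeps):
        for k in random.sample(range(n), n):  # visit units in random order
            # Change in consensus if unit k flips its state.
            field = w[k][k] + sum(w[k][j] * s[j] for j in range(n) if j != k)
            delta = (1 - 2 * s[k]) * field
            x = delta / t
            # Boltzmann acceptance probability, guarded against overflow
            # in exp() when the temperature becomes very small.
            p = 0.0 if x < -700 else 1.0 / (1.0 + math.exp(-x))
            if random.random() < p:
                s[k] = 1 - s[k]
        t *= alpha  # geometric cooling, as in sequential annealing
    return s
```

In a genuinely parallel implementation, non-adjacent units could update simultaneously, which is the source of the speedup the review mentions; the sequential sweep above only imitates this behaviour.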

Reviewer: Th. M. Liebling

### MSC:

| MSC code | Classification |
| --- | --- |
| 90C27 | Combinatorial optimization |
| 68N25 | Theory of operating systems |
| 68T05 | Learning and adaptive systems in artificial intelligence |
| 68P10 | Searching and sorting |
| 90-01 | Introductory exposition (textbooks, tutorial papers, etc.) pertaining to operations research and mathematical programming |
| 68T10 | Pattern recognition, speech recognition |