zbMATH — the first resource for mathematics

Phase transitions in machine learning. (English) Zbl 1246.68012
Cambridge: Cambridge University Press (ISBN 978-0-521-76391-2/hbk; 978-1-139-08903-6/ebook). xv, 383 p. (2011).
The book was inspired by the paper by P. Cheeseman, B. Kanefsky and W. Taylor [“Where the really hard problems are”, Proc. 12th Int. Conf., Sydney/Australia 1991, 331–337 (1991; Zbl 0747.68064)], in which the authors observed that the computational complexity (in terms of time and space) of an algorithm should be assessed on typical problem instances rather than on the worst case. They further observed that a problem may have regions of different computational complexity and that the borders between these regions may be sharp.
The idea of a phase transition is taken from physics. Physical systems are characterized by states called phases. For example, water may exist in solid, liquid, and gaseous phases, and thus undergoes phase transitions between them.
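The computational analogue of such a sharp transition can be illustrated with a small experiment on random 3-SAT, the setting in which the phenomenon is classically observed: the fraction of satisfiable random formulas drops abruptly as the clause-to-variable ratio passes the empirically observed threshold near 4.27. The sketch below is illustrative only (the helper names and parameters are not from the book) and uses brute-force checking, which is feasible only for small numbers of variables.

```python
import itertools
import random

def random_3sat(n_vars, n_clauses, rng):
    """A random 3-SAT formula: each clause has 3 distinct variables,
    each negated independently with probability 1/2."""
    return [[(v, rng.random() < 0.5) for v in rng.sample(range(n_vars), 3)]
            for _ in range(n_clauses)]

def satisfiable(n_vars, clauses):
    """Brute force over all 2^n assignments (small n only)."""
    for bits in itertools.product([False, True], repeat=n_vars):
        # A literal (v, neg) is satisfied when bits[v] differs from neg.
        if all(any(bits[v] != neg for v, neg in clause) for clause in clauses):
            return True
    return False

def sat_fraction(n_vars, ratio, trials=50, seed=0):
    """Estimated probability that a random formula at the given
    clause-to-variable ratio is satisfiable."""
    rng = random.Random(seed)
    m = int(ratio * n_vars)
    hits = sum(satisfiable(n_vars, random_3sat(n_vars, m, rng))
               for _ in range(trials))
    return hits / trials

# Below the threshold almost all formulas are satisfiable; above it,
# almost none are; the crossover near ratio ~4.27 is where the
# hardest instances concentrate.
for ratio in (2.0, 4.27, 7.0):
    print(ratio, sat_fraction(12, ratio))
```

Plotting `sat_fraction` against the ratio for larger `n_vars` makes the transition visibly sharper, mirroring the complexity peak discussed by Cheeseman, Kanefsky and Taylor.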
The book opens with a chapter on statistical physics and phase transitions. The authors then introduce basic concepts of the satisfiability of logical formulas in propositional calculus. In the next chapter they discuss constraint satisfaction problems, a class of NP-complete problems in which a phase transition has been found to occur. They then turn to the main topic of the book: machine learning, more precisely supervised learning, or learning from examples. The ideas of hypothesis and data representation are discussed thoroughly. The next chapter covers searching the hypothesis space, and the following chapter, “Statistical physics and machine learning”, covers artificial neural networks, support vector machines, and related topics.
The next chapter links the satisfiability and constraint satisfaction problems with learning. The authors show that the first-order logic covering test exhibits a phase transition associated with a complexity peak. Phase transitions arising when relations are learned with the learners FOIL, SMART+, G-Net and PROGOL are explained in the following chapter. Phase transitions in learning grammars and in complex networks are the next topics. The book ends with an overview of some open issues.
68-02 Research exposition (monographs, survey articles) pertaining to computer science
68T05 Learning and adaptive systems in artificial intelligence
68T20 Problem solving in the context of artificial intelligence (heuristics, search strategies, etc.)
82B26 Phase transitions (general) in equilibrium statistical mechanics
Full Text: DOI