Neural networks for pattern recognition. Repr. (English) Zbl 0868.68096
Oxford: Clarendon Press. xvii, 482 p., £55.00/hbk (1996).
At present, the role of neural computing in pattern recognition and in various related branches is widely acknowledged. This outstanding book contributes remarkably to a better statistical understanding of artificial neural networks, of how they function, and of their successful applicability to important practical problems. Its main feature is that it focuses on the most widely used type of neural network, namely feed-forward networks, instead of attempting to cover every interesting topic in the field of artificial neural networks.
The text is divided into ten chapters. Fundamental principles of pattern recognition are summarized in Chapter 1. Chapter 2 discusses various methods for modelling probability density functions. Chapter 3 is devoted to single-layer neural networks; this concept helps to explain several ideas and techniques that also apply to more general network structures. Multilayer perceptrons are introduced in Chapter 4. Chapter 5 deals with radial basis function networks. Various kinds of error functions for training neural networks, and their properties, are analyzed in Chapter 6. Chapter 7 treats some of the most important algorithms for network training. Chapter 8 presents a number of methods related to data pre-processing, dimensionality reduction and the use of prior knowledge. The important problem of generalization in neural networks is examined in Chapter 9. The concluding Chapter 10 provides a detailed treatment of Bayesian techniques applicable to neural networks. The appendices contain some useful mathematical results (related to the properties of symmetric matrices, Gaussian integration, Lagrange multipliers, the calculus of variations, and principal component analysis).
The great strength of this book is that it presents a comprehensive, self-contained survey of feed-forward networks from the point of view of statistical pattern recognition. In this context, an impressive part of the text is the extensive treatment of the latest developments in Bayesian techniques and their application to neural networks. Moreover, the properties and relative merits of all the described neural network models are discussed in detail, together with common techniques for improving the generalization abilities of trained networks. The author also motivates the use of different types of error functions, explains their meaning, and reviews various algorithms for error function minimization. Several methods for data processing and feature extraction are covered as well. Furthermore, the text provides the reader with a solid introduction to basic statistical concepts and ample references.
This text is intended for readers already familiar with the mathematical foundations required for an undergraduate science degree. Given this background, the author explains all the treated models step by step, starting from the simplest linear models and continuing to the latest Bayesian techniques. To reinforce understanding of the treated concepts, graded exercises are attached at the end of each chapter. The book can therefore be used as a textbook for a graduate-level or advanced undergraduate-level course on neural networks, and it can certainly be recommended to researchers in neural computing. Finally, although it deals primarily with principles, it will also be of great value to readers interested in practical applications, providing them with a deep understanding of the discussed technology.
Reviewer: I. Mrazova (Praha)

MSC:
68T10 Pattern recognition, speech recognition
68T05 Learning and adaptive systems in artificial intelligence
68-01 Introductory exposition (textbooks, tutorial papers, etc.) pertaining to computer science