zbMATH — the first resource for mathematics

Neural networks for pattern recognition. Repr. (English) Zbl 0868.68096
Oxford: Clarendon Press. xvii, 482 p. £ 25.00/pbk, £ 55.00/hbk (1996).
At present, the role of neural computing in pattern recognition and in various related branches is widely acknowledged. This outstanding book contributes remarkably to a better statistical understanding of artificial neural networks, their function, and their successful application to important practical problems. Its main feature is that it concentrates on the most frequently used type of neural network -- feed-forward networks -- rather than attempting to cover every interesting topic in the field of artificial neural networks.

The text is divided into ten chapters. Fundamental principles of pattern recognition are summarized in Chapter 1. Chapter 2 discusses various methods for modelling probability density functions. Chapter 3 is devoted to single-layer networks; this concept helps to explain several ideas and techniques that also apply to more general network structures. Multilayer perceptrons are introduced in Chapter 4, and Chapter 5 deals with radial basis function networks. Various kinds of error functions for training neural networks, and their properties, are analyzed in Chapter 6. Chapter 7 treats some of the most important algorithms for network training. Chapter 8 presents a number of methods for data pre-processing, dimensionality reduction, and the use of prior knowledge. The important problem of generalization in neural networks is examined in Chapter 9, and the concluding Chapter 10 provides a detailed treatment of Bayesian techniques applicable to neural networks. The appendices contain some useful mathematical results on the properties of symmetric matrices, Gaussian integration, Lagrange multipliers, the calculus of variations, and principal component analysis.

The great strength of this book is that it presents a comprehensive, self-contained survey of feed-forward networks from the point of view of statistical pattern recognition.
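To make the reviewed material concrete, the following is a minimal sketch of the kind of feed-forward network the book centres on: two inputs, two sigmoidal hidden units, and one sigmoidal output. The weights here are hand-chosen by this reviewer (they are not taken from the book) so that the network computes the XOR function -- the classic example of a problem that a single-layer network (Chapter 3) cannot represent but a two-layer network (Chapter 4) can.

```python
import math

def sigmoid(a):
    """Logistic activation function."""
    return 1.0 / (1.0 + math.exp(-a))

# Each layer is a list of (weight_vector, bias) pairs; the factor 10
# makes the sigmoids behave almost like hard threshold units.
HIDDEN = [([10.0, 10.0], -5.0),    # roughly: logical OR of the inputs
          ([10.0, 10.0], -15.0)]   # roughly: logical AND of the inputs
OUTPUT = ([10.0, -10.0], -5.0)     # "OR and not AND", i.e. XOR

def forward(x):
    """One forward pass through the two-layer network."""
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b) for ws, b in HIDDEN]
    ws, b = OUTPUT
    return sigmoid(sum(w * hi for w, hi in zip(ws, h)) + b)
```

In practice the weights would of course be learned by minimizing an error function rather than set by hand; the sketch only illustrates the network architecture itself.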
In this context, an impressive part of the text is its extensive treatment of recent developments in Bayesian techniques and their application to neural networks. Moreover, the properties and relative merits of all the described network models are discussed in detail, together with common techniques for improving the generalization ability of trained networks. The author motivates the use of different types of error functions, explains their meaning, and reviews various algorithms for error function minimization. Several methods for data processing and feature extraction are also covered. Furthermore, the text provides the reader with a solid introduction to basic statistical concepts and ample references. The book is intended for readers familiar with the mathematical foundations required for an undergraduate science degree; given this background, the author explains all the treated models step by step, starting from the simplest linear models and continuing to the latest Bayesian networks. To reinforce understanding of the treated concepts, graded exercises are included at the end of each chapter. The book can therefore be used as a textbook for a graduate-level or advanced undergraduate course on neural networks, and it is certainly recommendable to researchers in neural computing. Despite dealing mainly with principles, it will also be of great value to readers interested in practical applications, providing them with a deep understanding of the technology discussed.
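The error-function minimization mentioned above can be sketched in its simplest form. The toy setup below is this reviewer's own, not the book's: a single linear unit trained by batch gradient descent on the sum-of-squares error E(w, b) = ½ Σ (w·x + b − t)², the most elementary instance of the training schemes the book analyzes.

```python
def fit_linear(data, lr=0.05, epochs=2000):
    """Minimize the sum-of-squares error of y = w*x + b by gradient descent."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        dw = sum((w * x + b - t) * x for x, t in data)   # dE/dw
        db = sum((w * x + b - t) for x, t in data)       # dE/db
        w -= lr * dw
        b -= lr * db
    return w, b

# Noise-free targets generated from t = 2x + 1, so the minimizer is exact.
data = [(x, 2.0 * x + 1.0) for x in (-1.0, 0.0, 1.0, 2.0)]
w, b = fit_linear(data)
```

Because the error surface of a linear model is quadratic, gradient descent with a small enough learning rate converges to the unique minimum; the nonlinear networks treated in Chapters 4-7 require more careful optimization.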

MSC:
68T10 Pattern recognition, speech recognition
68T05 Learning and adaptive systems
68-01 Textbooks (computer science)