
Learning from data. Concepts, theory, and methods. (English) Zbl 0960.62002

Chichester: Wiley. xviii, 441 p. (1998).
This book is intended to provide a unified description of principles and methods for learning or estimating dependencies from data. Here a learning method is an algorithm, usually implemented in software, that estimates an unknown dependency between a system's inputs and outputs from the available data, namely from known input-output samples. Methods for estimating dependencies from data have traditionally been explored in statistics (multivariate regression and classification), engineering (pattern recognition), and computer science (artificial intelligence and machine learning). Recent interest in learning from data has resulted in the development of biologically motivated methodologies, such as artificial neural networks, as well as fuzzy systems.
The book can be seen as divided into two parts. The first is devoted to concepts and theory (Chapters 1-4). After an informal introduction (Ch. 1), a formal specification of the learning problem, including an inductive principle, is given in Ch. 2. Ch. 3 then presents the regularization (penalization) framework adopted in statistics. Vapnik's statistical learning theory is described in Ch. 4.
Constructive methods are the content of the second part of the book (Chs. 5-10). Ch. 5 is devoted to nonlinear optimization strategies. Methods for density approximation, including statistical, neural network, and signal-processing techniques for data and dimensionality reduction, are described in Ch. 6. Ch. 7 provides a description of statistical and neural network methods for regression. Classification methods are described in Ch. 8. The support vector machine model for classification and regression is formulated in Ch. 9. Finally, Ch. 10 describes fuzzy inference systems, in particular the so-called neuro-fuzzy systems, in which fuzzy rules are estimated from data using neural network heuristics.
Writing a comprehensive treatise on methods for learning from data is a rather hard task, and this book to a certain extent reflects the view and experience of the authors. For example, it does not cover methods of statistical description of learning, and the coverage of neural network learning methods is far from complete. Moreover, robust statistical methods are not mentioned in the book at all, and neither are methodologies of deterministic chaos for reconstructing dynamics or dynamical models from data. The book is heavily rooted in Vapnik's theory of statistical learning [V.N. Vapnik, The nature of statistical learning theory. 2nd ed. (2000; Zbl 0934.62009)]. Remarkably, Ch. 4 of this book is, in a certain sense, a copy of the cited Vapnik's book; e.g., figs. 4.1, 4.2, 4.3, 4.4, 4.5, and 4.9 are almost identical to figs. 2.1, 3.2, 3.1, 3.3, 4.4 and 4.2 given by Vapnik.
An important point is that the data used for learning throughout the book are mostly randomly generated by simulation rather than taken from real-life applications. Despite these shortcomings, the style of the book is appropriate: it is very readable and could be helpful to both beginning and advanced graduate students in engineering and statistics, as the authors claim.

MSC:

62-02 Research exposition (monographs, survey articles) pertaining to statistics
62M45 Neural nets and related approaches to inference from stochastic processes
93-02 Research exposition (monographs, survey articles) pertaining to systems and control theory
62-07 Data analysis (statistics) (MSC2010)
62B10 Statistical aspects of information-theoretic topics
93A30 Mathematical modelling of systems (MSC2010)
68-02 Research exposition (monographs, survey articles) pertaining to computer science
68T05 Learning and adaptive systems in artificial intelligence

Citations:

Zbl 0934.62009