Sensitivity analysis for neural networks. (English) Zbl 1189.68104

Natural Computing Series. Berlin: Springer (ISBN 978-3-642-02531-0/hbk; 978-3-642-02532-7/ebook). viii, 86 p. (2010).
Neural networks are an information processing paradigm inspired by the way the human brain processes information. They are applicable in virtually every situation in which a relationship between inputs and outputs exists. A major issue in practical applications of neural networks is the occurrence of perturbations in the weights, caused by machine imprecision and by noise in the data.
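To illustrate the kind of question the book addresses, the following minimal sketch (not taken from the book) numerically estimates how much the output of a small one-hidden-layer MLP deviates under small random weight perturbations. The network sizes, the sigmoid activation, and the perturbation level sigma are assumptions chosen purely for demonstration.

```python
# Minimal illustration (assumed setup, not the book's method): Monte Carlo
# estimate of the output sensitivity of a one-hidden-layer MLP to small
# random perturbations of its weights.
import numpy as np

rng = np.random.default_rng(0)

def mlp_output(x, W1, W2):
    """Forward pass of a one-hidden-layer MLP with sigmoid hidden units."""
    h = 1.0 / (1.0 + np.exp(-W1 @ x))   # hidden-layer activations
    return W2 @ h                        # linear output layer

# Toy network: 4 inputs, 6 hidden units, 1 output (sizes are assumptions).
W1 = rng.normal(size=(6, 4))
W2 = rng.normal(size=(1, 6))
x = rng.normal(size=4)

sigma = 0.01          # assumed magnitude of machine imprecision / data noise
deviations = []
for _ in range(1000):
    dW1 = sigma * rng.normal(size=W1.shape)
    dW2 = sigma * rng.normal(size=W2.shape)
    deviations.append(mlp_output(x, W1 + dW1, W2 + dW2) - mlp_output(x, W1, W2))

print("mean absolute output deviation:", np.mean(np.abs(deviations)))
```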
In this book, a sensitivity analysis is carried out for Multi-Layer Perceptrons (MLPs) and Radial Basis Function Neural Networks (RBFNNs). Concretely, Chapter 3 describes the hyper-rectangle model for MLPs, while Chapter 4 incorporates the activation function into the sensitivity analysis by parametrizing it. Chapters 5 and 6 deal with the sensitivity analysis of RBFNNs. Chapter 7 focuses on measuring sensitivity in order to encode prior knowledge into a neural network, and Chapter 8 applies sensitivity analysis to a number of problems, such as dimensionality reduction, network optimization, and selective learning.
The book may be used by researchers in diverse domains, such as neural networks, machine learning, and computer engineering, who face problems connected with the sensitivity analysis of neural networks.

MSC:

68T05 Learning and adaptive systems in artificial intelligence
68-02 Research exposition (monographs, survey articles) pertaining to computer science