zbMATH — the first resource for mathematics

Convergent decomposition techniques for training RBF neural networks. (English) Zbl 0986.68109
Summary: We define globally convergent decomposition algorithms for supervised training of generalized radial basis function neural networks. First, we consider training algorithms based on the two-block decomposition of the network parameters into the vector of weights and the vector of centers. Then we define a decomposition algorithm in which the selection of the center locations is split into sequential minimizations with respect to each center, and we give a suitable criterion for choosing the centers that must be updated at each step. We prove the global convergence of the proposed algorithms and report the computational results obtained for a set of test problems.
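The two-block scheme described above can be illustrated with a minimal sketch: alternate an exact minimization over the output weights (a linear least-squares problem for fixed centers) with gradient steps on the center locations. This is an illustrative reconstruction under simplifying assumptions (Gaussian basis functions, fixed width `sigma`, squared-error loss, plain gradient steps on centers), not the authors' exact algorithm or convergence-guaranteed update rules.

```python
import numpy as np

def rbf_features(X, centers, sigma=1.0):
    # Gaussian RBF activations: phi_ij = exp(-||x_i - c_j||^2 / (2 sigma^2))
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def train_two_block(X, y, n_centers=5, n_outer=50, lr=0.05, sigma=1.0, seed=0):
    """Two-block decomposition sketch: weights block solved exactly,
    centers block updated by a few gradient steps on the squared error."""
    rng = np.random.default_rng(seed)
    # initialize centers at a random subset of the training inputs
    centers = X[rng.choice(len(X), n_centers, replace=False)].copy()
    w = np.zeros(n_centers)
    for _ in range(n_outer):
        # Block 1: the loss is quadratic in the weights, so minimize exactly
        Phi = rbf_features(X, centers, sigma)
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        # Block 2: a few gradient steps on the centers with weights fixed
        for _ in range(5):
            Phi = rbf_features(X, centers, sigma)
            r = Phi @ w - y  # residuals, shape (n,)
            for j in range(n_centers):
                # d loss / d c_j = sum_i r_i * w_j * phi_ij * (x_i - c_j) / sigma^2
                g = (r * w[j] * Phi[:, j])[:, None] * (X - centers[j]) / sigma ** 2
                centers[j] -= lr * g.sum(axis=0)
    return w, centers

# usage: fit a 1-D sine curve
X = np.linspace(-3, 3, 60)[:, None]
y = np.sin(X).ravel()
w, C = train_two_block(X, y)
mse = np.mean((rbf_features(X, C) @ w - y) ** 2)
```

The per-center sequential variant of the paper would replace Block 2 with a loop that selects one center at a time (by the paper's selection criterion) and minimizes with respect to that center alone.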

68T05 Learning and adaptive systems in artificial intelligence