zbMATH — the first resource for mathematics

Mapping Bayesian networks to stochastic neural networks: a foundation for hybrid Bayesian-neural systems. (English) Zbl 0878.68100
Helsinki: Univ. Helsinki, Dept. of Comp. Sci. 93, 2 p. (1995).
Summary: We are interested in the problem of finding maximum a posteriori probability (MAP) value assignments for a set of discrete attributes, given the constraint that some of the attributes are permanently fixed to some values a priori. To build a system capable of this type of uncertain reasoning in practice, we first need to construct an accurate abstract representation of the problem domain, and then to establish an efficient search mechanism for finding MAP configurations within the constructed model. We propose a hybrid Bayesian network-neural network system for solving these two subtasks. The Bayesian network component can be used for constructing a compact, high-level representation of the problem domain probability distribution quickly and reliably, assuming that suitable expert knowledge is available. The neural network component then provides a computationally efficient, massively parallel platform for searching the model state space. The main application areas for these kinds of systems include configuration and design problems, medical diagnosis, and pattern recognition.
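The MAP task described above can be illustrated on a toy example. The following sketch is not from the paper: the two-node network, its probability tables, and the exhaustive search are all hypothetical, chosen only to show what "MAP assignment with some attributes permanently fixed" means; the paper's contribution is a scalable neural search, not enumeration.

```python
import itertools

# Hypothetical two-node Bayesian network A -> B over binary attributes.
# All names and numbers are illustrative, not taken from the paper.
p_a = {0: 0.3, 1: 0.7}                      # P(A)
p_b_given_a = {0: {0: 0.9, 1: 0.1},         # P(B | A)
               1: {0: 0.2, 1: 0.8}}

def joint(a, b):
    """Joint probability P(A=a, B=b) factorized along the network."""
    return p_a[a] * p_b_given_a[a][b]

def map_assignment(evidence):
    """Exhaustive MAP search with some attributes clamped (evidence)."""
    best, best_p = None, -1.0
    for a, b in itertools.product((0, 1), repeat=2):
        cand = {"A": a, "B": b}
        if any(cand[k] != v for k, v in evidence.items()):
            continue                         # respect the fixed attributes
        p = joint(a, b)
        if p > best_p:
            best, best_p = cand, p
    return best, best_p

# MAP configuration when B is permanently fixed to 0
print(map_assignment({"B": 0}))
```

Exhaustive search is exponential in the number of attributes, which is exactly why the summary turns to a stochastic search over the model state space instead.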
To implement a hybrid Bayesian-neural system as suggested above, we present methods for mapping a given Bayesian network to a stochastic neural network architecture, in the sense that the resulting neural network updating process provably converges to a state that can be projected to a MAP state of the probability distribution corresponding to the original Bayesian network. From the neural network point of view, these mappings can be seen as a method for incorporating high-level probabilistic a priori information directly into neural networks, without recourse to a time-consuming and unreliable learning process. From the Bayesian network point of view, the mappings offer a massively parallel implementation of simulated annealing in which all the variables can be updated at the same time. Our empirical simulations suggest that this type of massively parallel simulated annealing outperforms the traditional sequential Gibbs sampling/simulated annealing process, provided that suitable hardware is available.
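For context, the sequential baseline the summary compares against can be sketched as follows. This is a minimal, generic Gibbs sampling/simulated annealing loop on a tiny pairwise energy model; the energy function, biases, and cooling schedule are illustrative assumptions, not the paper's mapping. The paper's proposal replaces the inner one-variable-at-a-time loop with a simultaneous update of all variables on parallel hardware.

```python
import math
import random

# Illustrative pairwise energy model over spins in {-1, +1}.
# The coupling and biases are made up for this sketch.
J = {(0, 1): 0.5}          # coupling rewarding agreement of x0 and x1
h = [1.0, 0.5]             # per-variable biases

def energy(x):
    """Lower energy corresponds to higher probability."""
    e = -sum(h[i] * x[i] for i in range(len(x)))
    for (i, j), w in J.items():
        e -= w * x[i] * x[j]
    return e

def gibbs_anneal(n=2, sweeps=200, t0=2.0, cooling=0.98, seed=0):
    """Sequential Gibbs sampling with a geometric annealing schedule."""
    rng = random.Random(seed)
    x = [rng.choice((-1, 1)) for _ in range(n)]
    t = t0
    for _ in range(sweeps):
        for i in range(n):                  # one variable at a time
            x[i] = 1
            e_up = energy(x)
            x[i] = -1
            e_dn = energy(x)
            p_up = 1.0 / (1.0 + math.exp((e_up - e_dn) / t))
            x[i] = 1 if rng.random() < p_up else -1
        t *= cooling                        # lower the temperature
    return x

# As the temperature drops, the chain settles into a minimum-energy
# (i.e. MAP) configuration.
print(gibbs_anneal())
```

The sequential sweep touches variables one by one, so each sweep costs time linear in the number of variables; updating all variables simultaneously, as the mappings in the paper allow, removes this sequential bottleneck.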
MSC:
68T05 Learning and adaptive systems in artificial intelligence
68-02 Research exposition (monographs, survey articles) pertaining to computer science
Software:
BaRT