
Global convergence and asymptotic stability of asymmetric Hopfield neural networks. (English) Zbl 0819.68101

Summary: The global convergence and asymptotic stability of Hopfield neural networks are known to underlie the successful application of these networks to various computing and recognition tasks. However, all previous studies of the networks assumed that the interconnection matrix is symmetric or antisymmetric. In this paper the two fundamental properties of the network are studied without any symmetry assumption. It is proved that the networks converge globally to a stable state if the interconnection matrix is weakly diagonally dominant, in a sense defined in the paper. Furthermore, under any of the conditions ensuring global convergence, the maximal attraction radius of any stable state is shown to be half of the distribution distance of that state to the network. The results not only generalize existing results but also provide a theoretical foundation for performance analysis and new applications of Hopfield networks.
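The convergence claim can be illustrated numerically. The sketch below runs a standard discrete-time Hopfield network with asynchronous sign updates on an asymmetric interconnection matrix whose diagonal dominates its off-diagonal entries; the specific matrix, update rule, and dominance condition are illustrative assumptions for this example, not the paper's precise definition of "weakly diagonally dominant".

```python
import numpy as np

def hopfield_converge(W, x, max_sweeps=100):
    """Asynchronously update a discrete Hopfield network (states in {-1, +1})
    until a fixed point is reached or max_sweeps sweeps elapse."""
    x = x.copy()
    for _ in range(max_sweeps):
        changed = False
        for i in range(len(x)):  # one full asynchronous sweep
            new_state = 1 if W[i] @ x >= 0 else -1
            if new_state != x[i]:
                x[i] = new_state
                changed = True
        if not changed:
            return x, True  # fixed (stable) state reached
    return x, False

# Asymmetric but diagonally dominant matrix (illustrative choice):
# each diagonal entry exceeds the absolute row sum of the off-diagonal entries.
W = np.array([[ 2.0,  0.5, -0.3],
              [ 0.2,  2.0,  0.4],
              [-0.4,  0.3,  2.0]])
x0 = np.array([1, -1, 1])
x_star, converged = hopfield_converge(W, x0)
```

Here `x_star` is a stable state: recomputing the update at every neuron leaves it unchanged, which is the fixed-point property the paper's global-convergence result guarantees under its dominance condition.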

MSC:

68T05 Learning and adaptive systems in artificial intelligence