
Learning of Chua’s circuit attractors by locally recurrent neural networks. (English) Zbl 0981.68135
Summary: Many practical applications of neural networks require the identification of strongly nonlinear (e.g., chaotic) systems. In this paper, locally recurrent neural networks (LRNNs) are used to learn the attractors of Chua’s circuit, a paradigm for studying chaos. LRNNs have a feed-forward structure in which the synapses between adjacent layers contain delay taps and local feedback connections, so that each synapse acts as a digital filter. In general, the learning procedures of LRNNs are computationally simpler than those of globally recurrent networks. Results show that LRNNs can be trained to identify the underlying relationship among the state variables of Chua’s circuit, and that the trained networks exhibit chaotic attractors when operated autonomously.
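The record itself contains no equations or code, so the sketch below is purely illustrative: it generates state trajectories from the dimensionless Chua equations (using the standard double-scroll parameter values, assumed here since the summary does not give the paper’s values) and implements the tapped-delay (FIR) synapse that, combined with local feedback taps, characterizes an LRNN. All function names, parameters, and the training-data setup are assumptions for illustration, not the authors’ implementation.

```python
import numpy as np

# Standard double-scroll parameters for the dimensionless Chua equations
# (assumed for illustration; the summary does not give the paper's values).
ALPHA, BETA = 9.0, 100.0 / 7.0
M0, M1 = -8.0 / 7.0, -5.0 / 7.0

def chua_rhs(state):
    """Right-hand side of the dimensionless Chua's circuit equations."""
    x, y, z = state
    # Piecewise-linear characteristic of the Chua diode.
    f = M1 * x + 0.5 * (M0 - M1) * (abs(x + 1.0) - abs(x - 1.0))
    return np.array([ALPHA * (y - x - f), x - y + z, -BETA * y])

def rk4_step(state, dt):
    """One fourth-order Runge-Kutta integration step."""
    k1 = chua_rhs(state)
    k2 = chua_rhs(state + 0.5 * dt * k1)
    k3 = chua_rhs(state + 0.5 * dt * k2)
    k4 = chua_rhs(state + dt * k3)
    return state + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

def trajectory(state0, n_steps, dt=0.01):
    """Integrate the circuit and return sampled states (training data)."""
    traj = np.empty((n_steps, 3))
    s = np.asarray(state0, dtype=float)
    for i in range(n_steps):
        s = rk4_step(s, dt)
        traj[i] = s
    return traj

class FIRSynapse:
    """Synapse with a tapped delay line (FIR filter).

    This is the feed-forward half of an LRNN synapse; a full LRNN
    synapse would also carry local (IIR) feedback taps on its output.
    """
    def __init__(self, n_taps, rng):
        self.w = rng.normal(scale=0.1, size=n_taps)  # tap weights
        self.buf = np.zeros(n_taps)                  # delay-line state

    def __call__(self, u):
        self.buf = np.roll(self.buf, 1)  # shift the delay line
        self.buf[0] = u                  # push the newest input sample
        return float(self.w @ self.buf)  # weighted sum over past inputs

data = trajectory([0.1, 0.0, 0.0], n_steps=20000)       # double-scroll orbit
syn = FIRSynapse(n_taps=5, rng=np.random.default_rng(0))
filtered = [syn(u) for u in data[:100, 0]]              # synapse acting on x(t)
```

The appeal of this local-feedback structure, as the summary notes, is computational: because each recurrence is confined to a single synaptic filter, gradients can be computed with filter-local recursions rather than full backpropagation through time, which keeps LRNN training cheaper than training globally recurrent networks.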

MSC:
68T05 Learning and adaptive systems in artificial intelligence