Reservoir computing approaches to recurrent neural network training. (English) Zbl 1302.68235

Summary: Echo State Networks and Liquid State Machines introduced a new paradigm in artificial recurrent neural network (RNN) training, in which an RNN (the reservoir) is generated randomly and only a readout is trained. The paradigm, which became known as reservoir computing, greatly facilitated the practical application of RNNs and outperformed classical fully trained RNNs in many tasks. It has lately become a vibrant research field with numerous extensions of the basic idea, including reservoir adaptation, thus broadening the initial paradigm to using different methods for training the reservoir and the readout. This review systematically surveys both current ways of generating/adapting the reservoirs and training different types of readouts. It offers a natural conceptual classification of the techniques, which transcends the boundaries of the current “brand names” of reservoir methods, and thus aims to help unify the field and provide the reader with a detailed “map” of it.
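The core idea described above, a fixed random recurrent reservoir whose states feed a single trained linear readout, can be illustrated with a minimal NumPy sketch. All parameter choices here (reservoir size, spectral radius, washout length, ridge coefficient, and the toy sine-prediction task) are illustrative assumptions, not the exact recipes surveyed in the paper:

```python
import numpy as np

# Minimal echo state network sketch: a fixed random reservoir plus a
# linear readout trained by ridge regression. Parameter values are
# hypothetical choices for illustration.
rng = np.random.default_rng(42)
n_res, n_in = 100, 1

# Random input and reservoir weights; rescaling W to spectral radius < 1
# is a common heuristic aimed at the echo state property.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))

def run_reservoir(u_seq):
    """Drive the reservoir and collect states x(t) = tanh(W_in u(t) + W x(t-1))."""
    x = np.zeros(n_res)
    states = []
    for u in u_seq:
        x = np.tanh(W_in @ np.atleast_1d(u) + W @ x)
        states.append(x.copy())
    return np.array(states)

# Toy task: predict sin(t + dt) from sin(t).
t = np.arange(600) * 0.1
u, y = np.sin(t[:-1]), np.sin(t[1:])
X = run_reservoir(u)

# Only the readout weights are learned; here via ridge regression,
# discarding an initial washout of transient states.
washout, beta = 50, 1e-8
Xw, yw = X[washout:], y[washout:]
W_out = np.linalg.solve(Xw.T @ Xw + beta * np.eye(n_res), Xw.T @ yw)

pred = X @ W_out
mse = np.mean((pred[washout:] - yw) ** 2)
print(f"train MSE: {mse:.2e}")
```

Because the recurrent weights are never adapted, training reduces to a linear least-squares problem, which is exactly the practical simplification the summary credits for the paradigm's popularity.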

MSC:

68T05 Learning and adaptive systems in artificial intelligence
92B20 Neural networks for/in biological studies, artificial life and related topics
68-02 Research exposition (monographs, survey articles) pertaining to computer science

Software:

darch; Evolino