
zbMATH — the first resource for mathematics

ChaosNet: a chaos-based artificial neural network architecture for classification. (English) Zbl 1427.92005
Summary: Inspired by the chaotic firing of neurons in the brain, we propose ChaosNet, a novel chaos-based artificial neural network architecture for classification tasks. ChaosNet is built using layers of neurons, each of which is a 1D chaotic map known as the Generalized Luröth Series (GLS) that has been shown in earlier works to possess very useful properties for compression, cryptography, and for computing XOR and other logical operations. In this work, we design a novel learning algorithm on ChaosNet that exploits the topological transitivity property of the chaotic GLS neurons. The proposed learning algorithm gives consistently good performance accuracy in a number of classification tasks on well-known publicly available datasets with very limited training samples. Even with as few as seven (or fewer) training samples per class (which accounts for less than 0.05% of the total available data), ChaosNet yields performance accuracies in the range of \(73.89\%\) to \(98.33\%\). We demonstrate the robustness of ChaosNet to additive parameter noise and also provide an example implementation of a two-layer ChaosNet for enhancing classification accuracy. We envisage the development of several other novel learning algorithms on ChaosNet in the near future.
©2019 American Institute of Physics
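
The core computation is easy to sketch. The following is our own minimal illustration, not the authors' released code: the skew-tent form of the GLS map, the initial neural activity \(q = 0.34\), the skewness \(b = 0.465\), and the noise threshold \(\epsilon = 0.01\) are assumed placeholder values, and the per-class-mean/cosine-similarity classifier is a simplified reading of the topological-transitivity scheme the summary describes.

```python
import numpy as np

def gls_map(x, b=0.465):
    """One common form of the GLS (skew-tent) map on [0, 1)."""
    return x / b if x < b else (1.0 - x) / (1.0 - b)

def firing_feature(stimulus, q=0.34, b=0.465, eps=0.01, max_iter=10_000):
    """Iterate a GLS neuron from initial activity q until its trajectory
    enters the eps-neighborhood of the (normalized) stimulus; topological
    transitivity guarantees this happens for almost every q.  Return the
    fraction of iterates that landed in [b, 1) -- a symbolic feature."""
    x, upper, n = q, 0, 0
    while abs(x - stimulus) >= eps and n < max_iter:
        upper += x >= b
        x = gls_map(x, b)
        n += 1
    return upper / max(n, 1)

def class_means(X, y):
    """Mean representation vector per class (X: samples x features in [0, 1))."""
    F = np.vectorize(firing_feature)(X)
    return {c: F[y == c].mean(axis=0) for c in np.unique(y)}

def predict(x, means):
    """Assign the class whose mean representation is most cosine-similar."""
    f = np.vectorize(firing_feature)(np.asarray(x))
    score = {c: f @ m / (np.linalg.norm(f) * np.linalg.norm(m) + 1e-12)
             for c, m in means.items()}
    return max(score, key=score.get)
```

With only a handful of training rows per class feeding class_means, predict already exercises the low-training-sample regime the summary highlights.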
MSC:
92B20 Neural networks for/in biological studies, artificial life and related topics
68T05 Learning and adaptive systems in artificial intelligence
37D45 Strange attractors, chaotic dynamics of systems with hyperbolic behavior
Software:
Keras; MNIST; Scikit; UCI-ml
References:
[1] Faure, P.; Korn, H., Is there chaos in the brain? I. Concepts of nonlinear dynamics and methods of investigation, C. R. de l’Académie des Sci. Ser. III Sci. de la Vie, 324, 773-793 (2001)
[2] Korn, H.; Faure, P., Is there chaos in the brain? II. Experimental evidence and related models, C. R. Biol., 326, 787-840 (2003)
[3] Fan, Y.; Holden, A. V., Bifurcations, burstings, chaos and crises in the Rose-Hindmarsh model for neuronal activity, Chaos, Solitons Fractals, 3, 439-449 (1993) · Zbl 0777.92003
[4] Ding, Y.; Sohn, J. H.; Kawczynski, M. G.; Trivedi, H.; Harnish, R.; Jenkins, N. W.; Lituiev, D.; Copeland, T. P.; Aboian, M. S.; Mari Aparici, C., A deep learning model to predict a diagnosis of Alzheimer disease by using 18F-FDG PET of the brain, Radiology, 290, 456-464 (2018)
[5] Harikrishnan, N., Vinayakumar, R., and Soman, K., “A machine learning approach towards phishing email detection,” in Proceedings of the Anti-Phishing Pilot at ACM International Workshop on Security and Privacy Analytics (IWSPA AP) (2018), Vol. 2013, pp. 455-468.
[6] Graves, A., Mohamed, A.-R., and Hinton, G., “Speech recognition with deep recurrent neural networks,” in 2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP) (IEEE, 2013), pp. 6645-6649.
[7] Saxe, A. M., Bansal, Y., Dapello, J., Advani, M., Kolchinsky, A., Tracey, B. D., and Cox, D. D., “On the information bottleneck theory of deep learning,” in International Conference on Learning Representations (2018).
[8] Tishby, N. and Zaslavsky, N., “Deep learning and the information bottleneck principle,” in 2015 IEEE Information Theory Workshop (ITW) (IEEE, 2015), pp. 1-5.
[9] Delahunt, C. B.; Kutz, J. N.
[10] Aihara, K.; Takabe, T.; Toyoda, M., Chaotic neural networks, Phys. Lett. A, 144, 333-340 (1990)
[11] Crook, N. and Scheper, T. O., “A novel chaotic neural network architecture,” in ESANN’2001 Proceedings—European Symposium on Artificial Neural Networks Bruges (Belgium), 25-27 April 2001 (D-Facto, 2001), pp. 295-300.
[12] Freeman, W. J., Mass Action in the Nervous System (1975), Academic Press
[13] Chang, H.-J.; Freeman, W. J., Parameter optimization in models of the olfactory neural system, Neural Netw., 9, 1-14 (1996)
[14] Kozma, R. and Freeman, W. J., “A possible mechanism for intermittent oscillations in the KIII model of dynamic memories: the case study of olfaction,” in IJCNN’99 International Joint Conference on Neural Networks (Cat. No. 99CH36339) (IEEE, 1999), Vol. 1, pp. 52-57.
[15] Tsuda, I., Dynamic link of memory–chaotic memory map in nonequilibrium neural networks, Neural Netw., 5, 313-326 (1992)
[16] Nicolis, J. S.; Tsuda, I., Chaotic dynamics of information processing: The “magic number seven plus-minus two” revisited, Bull. Math. Biol., 47, 343-365 (1985) · Zbl 0575.92021
[17] Kaneko, K., Lyapunov analysis and information flow in coupled map lattices, Phys. D Nonlinear Phenom., 23, 436-447 (1986)
[18] Kaneko, K., Clustering, coding, switching, hierarchical ordering, and control in a network of chaotic elements, Phys. D Nonlinear Phenom., 41, 137-172 (1990) · Zbl 0709.58520
[19] Kathpalia, A. and Nagaraj, N., “A novel compression based neuronal architecture for memory encoding,” in Proceedings of the 20th International Conference on Distributed Computing and Networking (ACM, 2019), pp. 365-370.
[20] Aram, Z.; Jafari, S.; Ma, J.; Sprott, J. C.; Zendehrouh, S.; Pham, V.-T., Using chaotic artificial neural networks to model memory in the brain, Commun. Nonlinear Sci. Numerical Simul., 44, 449-459 (2017)
[21] Alligood, K. T.; Sauer, T. D.; Yorke, J. A., Chaos: An Introduction to Dynamical Systems (1996), Springer · Zbl 0867.58043
[22] Deterministic chaos is characterized by the “Butterfly Effect”: sensitive dependence of behavior on minute changes in initial conditions (a numeric illustration follows the reference list).
[23] Babloyantz, A.; Lourenço, C., Brain chaos and computation, Int. J. Neural Syst., 7, 461-471 (1996)
[24] Barras, C., Mind maths: Brainquakes on the edge of chaos, New Scientist, 217, 36 (2013)
[25] Elbert, T.; Rockstroh, B.; Kowalik, Z. J.; Hoke, M.; Molnar, M.; Skinner, J. E.; Birbaumer, N., Chaotic brain activity, Electroencephalogr. Clin. Neurophysiol./Suppl., 44, 441-449 (1995)
[26] Sprott, J., Is chaos good for learning?, Nonlinear Dyn. Psychol. Life Sci., 17, 223-232 (2013)
[27] Baghdadi, G.; Jafari, S.; Sprott, J.; Towhidkhah, F.; Golpayegani, M. H., A chaotic model of sustaining attention problem in attention deficit disorder, Commun. Nonlinear Sci. Numer. Simul., 20, 174-185 (2015) · Zbl 1304.37064
[28] Hodgkin, A. L.; Huxley, A. F., A quantitative description of membrane current and its application to conduction and excitation in nerve, J. Physiol., 117, 500-544 (1952)
[29] Hindmarsh, J. L.; Rose, R., A model of neuronal bursting using three coupled first order differential equations, Proc. R. Soc. Lond. B Biol. Sci., 221, 87-102 (1984)
[30] FitzHugh, R., Impulses and physiological states in theoretical models of nerve membrane, Biophys. J., 1, 445-466 (1961)
[31] Nagumo, J.; Arimoto, S.; Yoshizawa, S., An active pulse transmission line simulating nerve axon, Proc. IRE, 50, 2061-2070 (1962)
[32] Zerroug, A.; Terrissa, L.; Faure, A., Chaotic dynamical behavior of recurrent neural network, Annu. Rev. Chaos Theory Bifurc. Dyn. Syst., 4, 55-66 (2013)
[33] Harikrishnan, N. B.; Nagaraj, N.
[34] Staudemeyer, R. C.; Omlin, C. W., Extracting salient features for network intrusion detection using machine learning methods, S. Afr. Comput. J., 52, 82-96 (2014)
[35] Dajani, K.; Kraaikamp, C., Ergodic Theory of Numbers (2002), Cambridge University Press · Zbl 1033.11040
[36] Let \(X = \{x_0, x_1, x_2, \ldots\}\) be the trajectory of a chaotic map with initial condition \(x_0\), where \(x_i \in [U, V)\). The interval \([U, V)\) is partitioned into \(k\) sub-intervals denoted \(I_0, I_1, \ldots, I_{k-1}\). If \(x_i \in I_j\), then \(x_i\) is denoted by the symbol \(j \in \{0, 1, \ldots, k-1\}\). The resulting sequence of symbols \(\{j_0, j_1, j_2, \ldots\}\) is the symbolic sequence of the trajectory \(X\) (a short sketch follows the reference list).
[37] A Generating Markov Partition (GMP) splits the state space into a complete set of disjoint regions: the regions cover the entire state space and induce a one-to-one correspondence between trajectories and their itinerary sequences of symbols (L and R), so that no information is lost (a shift-map illustration follows the reference list).
[38] Nagaraj, N.
[39] Nagaraj, N.; Vaidya, P. G.; Bhat, K. G., Arithmetic coding as a non-linear dynamical system, Commun. Nonlinear Sci. Numer. Simul., 14, 1013-1020 (2009) · Zbl 1221.94037
[40] Nagaraj, N., Using Cantor sets for error detection, PeerJ Comput. Sci., 5, e171 (2019)
[41] Wong, K.-W.; Lin, Q.; Chen, J., Simultaneous arithmetic coding and encryption using chaotic maps, IEEE Trans. Circuits Syst. II Express Briefs, 57, 146-150 (2010)
[42] Gerstner, W.; Kistler, W. M.; Naud, R.; Paninski, L., Neuronal Dynamics: From Single Neurons to Networks and Models of Cognition (2014), Cambridge University Press
[43] For a nonconstant matrix \(X\), normalization is achieved by computing \(\frac{X - \min(X)}{\max(X) - \min(X)}\). A constant matrix \(X\) is normalized to all ones (a one-function snippet follows the reference list).
[44] Cybenko, G., Approximation by superpositions of a sigmoidal function, Math. Control Signals Syst., 2, 303-314 (1989) · Zbl 0679.94019
[45] LeCun, Y. and Cortes, C., “MNIST handwritten digit database,” http://yann.lecun.com/exdb/mnist/ (2010).
[46] KDD Cup, “KDD Cup 1999 data,” http://kdd.ics.uci.edu/databases/kddcup99/kddcup99.html (1999).
[47] Lippmann, R. P., Fried, D. J., Graf, I., Haines, J. W., Kendall, K. R., McClung, D., Weber, D., Webster, S. E., Wyschogrod, D., Cunningham, R. K., et al., “Evaluating intrusion detection systems: The 1998 DARPA off-line intrusion detection evaluation,” in Proceedings DARPA Information Survivability Conference and Exposition, DISCEX’00 (IEEE, 2000), Vol. 2, pp. 12-26.
[48] See the UCI Machine Learning Repository for the Iris data set.
[49] Blake, C. L. and Merz, C. J., “UCI Repository of Machine Learning Databases” (1998).
[50] See “The Habitable Exoplanets Catalog,” Planetary Habitability Laboratory, University of Puerto Rico at Arecibo.
[51] Méndez, A., “The night sky of exoplanets,” Hipparcos catalog (2011).
[52] Saha, S.; Nagaraj, N.; Mathur, A.; Yedida, R.
[53] Saha, S.; Basak, S.; Safonova, M.; Bora, K.; Agrawal, S.; Sarkar, P.; Murthy, J., Theoretical validation of potential habitability via analytical and boosted tree methods: An optimistic study on recently discovered exoplanets, Astron. Comput., 23, 141-150 (2018)
[54] Quinlan, J. R., Induction of decision trees, Mach. Learn., 1, 81-106 (1986)
[55] Cover, T. M.; Hart, P., Nearest neighbor pattern classification, IEEE Trans. Inf. Theory, 13, 21-27 (1967) · Zbl 0154.44505
[56] Hearst, M. A.; Dumais, S. T.; Osuna, E.; Platt, J.; Schölkopf, B., Support vector machines, IEEE Intell. Syst. Appl., 13, 18-28 (1998)
[57] LeCun, Y.; Bengio, Y.; Hinton, G., Deep learning, Nature, 521, 436-444 (2015)
[58] Pedregosa, F.; Varoquaux, G.; Gramfort, A.; Michel, V.; Thirion, B.; Grisel, O.; Blondel, M.; Prettenhofer, P.; Weiss, R.; Dubourg, V.; Vanderplas, J.; Passos, A.; Cournapeau, D.; Brucher, M.; Perrot, M.; Duchesnay, E., Scikit-learn: Machine learning in Python, J. Mach. Learn. Res., 12, 2825-2830 (2011) · Zbl 1280.68189
[59] Chollet, F. et al., “Keras,” https://keras.io (2015).
[60] Bora, K.; Saha, S.; Agrawal, S.; Safonova, M.; Routh, S.; Narasimhamurthy, A., CD-HPF: New habitability score via data analytic modeling, Astron. Comput., 17, 129-143 (2016)
[61] Hyperparameters are rarely subjected to noise and hence we ignore this scenario. It is always possible to protect the hyperparameters by using strong error correction codes.
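
Note [22]'s “Butterfly Effect” can be made concrete in a few lines; the logistic map and the \(10^{-8}\) perturbation below are our illustrative choices, not taken from the paper.

```python
# Two logistic-map trajectories started 1e-8 apart diverge to O(1)
# separation within a few dozen iterations (Lyapunov exponent ln 2).
x, y = 0.4, 0.4 + 1e-8
for n in range(1, 61):
    x, y = 4 * x * (1 - x), 4 * y * (1 - y)
    if abs(x - y) > 0.5:
        print(f"trajectories separated by more than 0.5 after {n} iterations")
        break
```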
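Note [36]'s construction transcribes directly into code; the tent map and the binary partition are assumed examples.

```python
import numpy as np

def symbolic_sequence(x0, f, left_endpoints, n):
    """Symbols j_i with x_i in sub-interval I_{j_i}; left_endpoints lists
    the left endpoints of I_0, ..., I_{k-1} inside [U, V)."""
    seq, x = [], x0
    for _ in range(n):
        seq.append(int(np.searchsorted(left_endpoints, x, side="right")) - 1)
        x = f(x)
    return seq

tent = lambda x: 2 * x if x < 0.5 else 2 * (1 - x)   # chaotic tent map on [0, 1)
print(symbolic_sequence(0.3, tent, [0.0, 0.5], 10))  # e.g. [0, 1, 1, 0, 1, ...]
```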
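Note [37]'s lossless correspondence is easiest to verify for the binary shift map, for which the partition L = [0, 0.5), R = [0.5, 1) is generating; the choice of map is ours, for illustration.

```python
# For x -> 2x mod 1 the L/R itinerary is the binary expansion of x0,
# so the initial condition is recoverable from the symbols alone.
x0 = 0.6123
x, itinerary = x0, []
for _ in range(40):
    itinerary.append("R" if x >= 0.5 else "L")
    x = (2.0 * x) % 1.0
recovered = sum(2.0 ** -(i + 1) for i, s in enumerate(itinerary) if s == "R")
print(abs(recovered - x0) < 2.0 ** -40)   # True: the itinerary determines x0
```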
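Note [43]'s normalization, as a one-function sketch (NumPy is our choice of vehicle):

```python
import numpy as np

def minmax_normalize(X):
    """Scale a matrix to [0, 1]; a constant matrix is normalized to all ones."""
    lo, hi = X.min(), X.max()
    return np.ones_like(X, dtype=float) if hi == lo else (X - lo) / (hi - lo)
```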
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.