
The universal approximation capabilities of double \(2\pi\)-periodic approximate identity neural networks. (English) Zbl 1359.68256

Summary: The purpose of this study is to investigate the universal approximation capabilities of a class of single-hidden-layer feedforward neural networks called double \(2\pi\)-periodic approximate identity neural networks. Using double \(2\pi\)-periodic approximate identities, several theorems concerning the universal approximation capabilities of these networks are proved. The proofs of these theorems are sketched using double convolution linear operators and the notion of an \(\epsilon\)-net. The results fall into two categories. First, the universal approximation capability of the networks is established in the space of continuous bivariate \(2\pi\)-periodic functions. This capability is then extended to the space of \(p\)th-order Lebesgue-integrable bivariate \(2\pi\)-periodic functions. These results can be interpreted as an extension of the universal approximation capabilities previously established for single-hidden-layer feedforward neural networks.
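
For orientation, the following is a standard formulation of a double \(2\pi\)-periodic approximate identity and its associated double convolution operators; the notation \(K_n\) and \(T_n\) is introduced here for illustration, and the paper's precise hypotheses may differ in detail. A family \(\{K_n\}_{n\ge 1}\) of bivariate \(2\pi\)-periodic functions is commonly called a double \(2\pi\)-periodic approximate identity if
\[
\int_{-\pi}^{\pi}\int_{-\pi}^{\pi} K_n(s,t)\,ds\,dt = 1 \quad\text{for all } n,
\qquad
\sup_{n}\int_{-\pi}^{\pi}\int_{-\pi}^{\pi} \lvert K_n(s,t)\rvert\,ds\,dt < \infty,
\]
and, for every \(\delta\in(0,\pi)\),
\[
\iint_{\delta\le\max(\lvert s\rvert,\lvert t\rvert)\le\pi} \lvert K_n(s,t)\rvert\,ds\,dt \longrightarrow 0 \quad (n\to\infty).
\]
The associated double convolution linear operators are
\[
(T_n f)(x,y) = \int_{-\pi}^{\pi}\int_{-\pi}^{\pi} f(s,t)\,K_n(x-s,\,y-t)\,ds\,dt .
\]
Under such conditions \(T_n f \to f\) uniformly when \(f\) is continuous and bivariate \(2\pi\)-periodic, and in the \(L^p\)-norm when \(f\) is \(p\)th-order Lebesgue integrable; replacing the integral by a finite Riemann sum yields a single-hidden-layer network whose hidden units are translates of \(K_n\), and an \(\epsilon\)-net argument keeps the discretization error uniformly small over the relevant class of functions.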

MSC:

68T05 Learning and adaptive systems in artificial intelligence
41A30 Approximation by other special function classes
92B20 Neural networks for/in biological studies, artificial life and related topics
