Estimation of approximating rate for neural network in \(L^p_w\) spaces. (English) Zbl 1244.93155

Summary: A class of Sobolev-type multivariate functions is approximated by a feedforward network with one hidden layer of sigmoidal units and a linear output. By adopting an orthogonal polynomial basis and under certain assumptions on the activation functions governing the network, an upper bound on the degree of approximation is obtained for this class of Sobolev functions. The results help in understanding the approximation capability and topology construction of sigmoidal neural networks.
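The architecture the summary describes — a single hidden layer of sigmoidal units feeding a linear output — can be illustrated with a small least-squares sketch. Every concrete choice below (the one-dimensional target \(\sin(\pi x)\), 12 units with fixed centers, steepness 4, a tiny ridge term) is an illustrative assumption and does not come from the paper, which derives theoretical approximation bounds rather than a fitting procedure.

```python
# Sketch: fit only the linear output layer of a one-hidden-layer
# sigmoidal network by least squares on a grid, and observe that
# more hidden units give a smaller uniform error on the grid.
import math

def sigmoid(t):
    return 1.0 / (1.0 + math.exp(-t))

def solve(A, b):
    """Gaussian elimination with partial pivoting for A x = b."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

def fit_network(f, n_units=12, steepness=4.0, n_samples=200):
    """Least-squares output weights for a_0 + sum_i a_i * sigmoid(steepness*(x - c_i))."""
    centers = [-1.2 + 2.4 * i / (n_units - 1) for i in range(n_units)]
    xs = [-1.0 + 2.0 * i / (n_samples - 1) for i in range(n_samples)]
    # design matrix: constant column + one column per sigmoidal unit
    Phi = [[1.0] + [sigmoid(steepness * (x - c)) for c in centers] for x in xs]
    y = [f(x) for x in xs]
    n = n_units + 1
    # normal equations Phi^T Phi a = Phi^T y, with a tiny ridge for stability
    G = [[sum(Phi[k][i] * Phi[k][j] for k in range(n_samples)) for j in range(n)]
         for i in range(n)]
    for i in range(n):
        G[i][i] += 1e-6
    rhs = [sum(Phi[k][i] * y[k] for k in range(n_samples)) for i in range(n)]
    a = solve(G, rhs)
    # uniform (max) error of the fitted network on the sample grid
    err = max(abs(sum(a[j] * Phi[k][j] for j in range(n)) - y[k])
              for k in range(n_samples))
    return a, err

if __name__ == "__main__":
    _, err = fit_network(lambda x: math.sin(math.pi * x))
    print(f"max grid error with 12 sigmoidal units: {err:.4f}")
```

Only the output weights are trained here; the paper's bounds likewise quantify how the achievable error shrinks as the number of hidden sigmoidal units grows.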


93E10 Estimation and detection in stochastic control theory
92B20 Neural networks for/in biological studies, artificial life and related topics
Full Text: DOI


[1] G. Cybenko, “Approximation by superpositions of a sigmoidal function,” Mathematics of Control, Signals, and Systems, vol. 2, no. 4, pp. 303-314, 1989. · Zbl 0679.94019
[2] J. J. Wang, Z. B. Xu, and W. J. Xu, Approximation Bounds by Neural Networks in \(L^p_w\), vol. 3173 of Lecture Notes in Computer Science, Springer, Berlin, Germany, 2004.
[3] J. J. Wang, B. Chen, and C. Yang, “Approximation of algebraic and trigonometric polynomials by feedforward neural networks,” Neural Computing and Applications, vol. 21, no. 1, pp. 73-80, 2011.
[4] J. J. Wang and Z. B. Xu, “New study of neural networks: the essential order of approximation,” Neural Networks, vol. 23, pp. 618-624, 2010. · Zbl 1267.34146
[5] Y. Ito, “Approximation of continuous functions on R^d by linear combinations of shifted rotations of a sigmoid function with and without scaling,” Neural Networks, vol. 5, no. 1, pp. 105-115, 1992.
[6] T. P. Chen and H. Chen, “Approximation capability to functions of several variables, nonlinear functions, and operators by radial function neural networks,” IEEE Transactions on Neural Networks, vol. 6, pp. 904-910, 1995.
[7] A. R. Barron, “Universal approximation bounds for superpositions of a sigmoidal function,” IEEE Transactions on Information Theory, vol. 39, no. 3, pp. 930-945, 1993. · Zbl 0818.68126
[8] M. Leshno, V. Ya. Lin, A. Pinkus, and S. Schocken, “Multilayer feedforward networks with a nonpolynomial activation function can approximate any function,” Neural Networks, vol. 6, no. 6, pp. 861-867, 1993.
[9] H. N. Mhaskar, “Neural networks for optimal approximation of smooth and analytic functions,” Neural Computation, vol. 8, pp. 164-177, 1996.
[10] V. Maiorov and R. S. Meir, “Approximation bounds for smooth functions in C(R^d) by neural and mixture networks,” IEEE Transactions on Neural Networks, vol. 3, pp. 969-978, 1998.
[11] M. Burger and A. Neubauer, “Error bounds for approximation with neural networks,” Journal of Approximation Theory, vol. 112, no. 2, pp. 235-250, 2001. · Zbl 1004.41007
[12] V. Kurkova and M. Sanguineti, “Comparison of worst case errors in linear and neural network approximation,” IEEE Transactions on Information Theory, vol. 48, no. 1, pp. 264-275, 2002. · Zbl 1059.62589
[13] J. L. Wang, B. H. Sheng, and S. P. Zhou, “On approximation by non-periodic neural and translation networks in \(L^p_w\) spaces,” Acta Mathematica Sinica, vol. 46, pp. 65-74, 2003 (Chinese). · Zbl 1023.41010
[14] J. J. Wang, C. Yang, and J. Jing, “Approximation order for multivariate Durrmeyer operators with Jacobi weights,” Abstract and Applied Analysis, vol. 2011, Article ID 970659, 12 pages, 2011. · Zbl 1216.41019
[15] A. F. Timan, Theory of Approximation of Functions of a Real Variable, Macmillan, New York, NY, USA, 1963. · Zbl 0117.29001