
A class of \(+1\) sigmoidal activation functions for FFANNs. (English) Zbl 1185.68572

Summary: A limited number of activation functions have been utilized in practice for feedforward artificial neural networks. A class of sigmoidal functions is defined and another function, which is the envelope of the derivatives of the members of the defined class, is shown to be sigmoidal. The functions defined are shown to satisfy the requirements of the universal approximation theorem(s).
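The summary does not reproduce the paper's specific class of functions, but the sigmoidal property it relies on is standard: in the sense of Cybenko (1989), a function \(\sigma\) is sigmoidal if \(\sigma(x)\to 0\) as \(x\to-\infty\) and \(\sigma(x)\to 1\) as \(x\to+\infty\). As a minimal sketch (using the ordinary logistic sigmoid, not the paper's new class), these conditions and the bell-shaped derivative can be checked numerically:

```python
import math

def logistic(x):
    """Standard logistic sigmoid: sigma(x) = 1 / (1 + exp(-x))."""
    return 1.0 / (1.0 + math.exp(-x))

def logistic_derivative(x):
    """Derivative sigma'(x) = sigma(x) * (1 - sigma(x))."""
    s = logistic(x)
    return s * (1.0 - s)

# Sigmoidal limits (Cybenko's definition):
#   sigma(x) -> 0 as x -> -inf,  sigma(x) -> 1 as x -> +inf.
print(logistic(-40.0))           # numerically ~0
print(logistic(40.0))            # numerically ~1

# The derivative peaks at the origin and vanishes in both tails,
# which is the bell shape whose envelope the paper studies.
print(logistic_derivative(0.0))  # 0.25, the maximum
```

Any member of a sigmoidal class satisfying these limits (together with mild continuity/boundedness conditions) qualifies as an activation function in the universal approximation theorems cited below.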

MSC:

68T05 Learning and adaptive systems in artificial intelligence

References:

[1] Barron, A. R., Universal approximation bounds for superpositions of a sigmoidal function, IEEE Transactions on Information Theory, 39, 3, 930-945 (1993) · Zbl 0818.68126
[2] Chen, T.; Chen, H.; Liu, R.-W., Approximation capability in \(C(\bar{R}^n)\) by multilayer feedforward networks and related problems, IEEE Transactions on Neural Networks, 6, 1, 25-35 (1995)
[3] Cheney, W.; Light, W., A Course in Approximation Theory (2000), Brooks/Cole: CA · Zbl 1167.41001
[4] Cybenko, G., Approximation by superposition of a sigmoidal function, Mathematics of Control, Signals and Systems, 2, 4, 303-314 (1989) · Zbl 0679.94019
[5] Duch, W.; Jankowski, N., Survey of neural transfer functions, Neural Computing Surveys, 2, 163-212 (1999)
[6] Funahashi, K., On the approximate realization of continuous mappings by neural networks, Neural Networks, 2, 3, 183-192 (1989)
[7] Haykin, S., Neural Networks: A Comprehensive Foundation (1999), Prentice-Hall Inc: Prentice-Hall Inc New Jersey · Zbl 0934.68076
[8] Hornik, K.; Stinchcombe, M.; White, H., Multilayer feedforward networks are universal approximators, Neural Networks, 2, 3, 359-366 (1989) · Zbl 1383.92015
[9] Hornik, K.; Stinchcombe, M.; White, H., Universal approximation of an unknown mapping and its derivatives using multilayer feedforward networks, Neural Networks, 3, 5, 551-561 (1990)
[10] Jones, L. K., Constructive approximations for neural networks by sigmoidal functions, Proceedings of the IEEE, 78, 10, 1586-1589 (1990)
[11] Kolen, J.F.; Pollack, J.B., Back propagation is sensitive to initial conditions, Technical Report TR 90-JK-BPSIC, Computer and Information Science Department, The Ohio State University, Ohio (1990)
[12] LeCun, Y.; Bottou, L.; Orr, G. B.; Muller, K.-R., Efficient BackProp, (Orr, G.; Muller, K.-R., Neural Networks: Tricks of the Trade (1998), Springer: Springer Berlin), 5-50