
A class of gradient unconstrained minimization algorithms with adaptive stepsize. (English) Zbl 0958.65072
The authors present a class of gradient algorithms with adaptive stepsize for the unconstrained minimization problem \[ f(x)\to \underset{x\in \mathbb{R}^n}{\text{minimum}}. \] The proposed class comprises four algorithms: the first two incorporate techniques for adapting a common stepsize for all coordinate directions, while the other two allow an individual adaptive stepsize along each coordinate direction.
Numerical tests are given.
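
As a rough illustration of a per-coordinate adaptive stepsize (not the authors' actual algorithms), the following Python sketch enlarges a coordinate's stepsize while the corresponding gradient component keeps its sign and shrinks it when the sign flips, a heuristic in the spirit of the learning-rate adaptation rules of [24]; the function names, the factors up and down, and the adaptation rule itself are illustrative assumptions, not taken from the paper.

import numpy as np

def adaptive_gradient_descent(f, grad, x0, step0=0.1, up=1.1, down=0.5,
                              tol=1e-8, max_iter=10000):
    # Gradient descent with an individual adaptive stepsize per coordinate
    # (illustrative sketch only; not the algorithms proposed in the paper).
    x = np.asarray(x0, dtype=float).copy()
    steps = np.full_like(x, step0)       # one stepsize per coordinate direction
    g_prev = grad(x)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:      # stop once the gradient is (nearly) zero
            break
        # Enlarge the stepsize where the gradient keeps its sign,
        # shrink it where the sign flips (overshooting suspected).
        same_sign = g * g_prev > 0
        steps = np.where(same_sign, steps * up, steps * down)
        x = x - steps * g
        g_prev = g
    return x, f(x)

# Example usage on a convex quadratic with differently scaled coordinates.
quad = lambda x: 0.5 * (x[0]**2 + 100.0 * x[1]**2)
quad_grad = lambda x: np.array([x[0], 100.0 * x[1]])
x_star, f_star = adaptive_gradient_descent(quad, quad_grad, x0=[3.0, -2.0])

Replacing the vector steps by a single scalar would correspond, roughly, to the common-stepsize variants described above, while the per-coordinate version mirrors the individual-stepsize variants.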

MSC:
65K05 Numerical mathematical programming methods
90C30 Nonlinear programming
Software:
minpack; BRENT; OPTAC
References:
[1] Androulakis, G.S.; Vrahatis, M.N., OPTAC: a portable software package for analyzing and comparing optimization methods by visualization, J. comput. appl. math., 72, 41-62, (1996) · Zbl 0857.65066
[2] Androulakis, G.S.; Vrahatis, M.N.; Grapsa, T.N., Studying the performance of optimization methods by visualization, Systems anal. model. simulation, 25, 21-42, (1996) · Zbl 0933.90061
[3] Armijo, L., Minimization of functions having Lipschitz continuous first partial derivatives, Pacific J. math., 16, 1-3, (1966) · Zbl 0202.46105
[4] O. Axelsson, Iterative Solution Methods, Cambridge University Press, New York, 1996. · Zbl 0845.65011
[5] Blum, E.K., Approximation of Boolean functions by sigmoidal networks. Part I: XOR and other two-variable functions, Neural comput., 1, 532-540, (1989)
[6] R.P. Brent, Algorithms for Minimization Without Derivatives, Prentice-Hall, Inc., Englewood Cliffs, New Jersey, 1973. · Zbl 0245.65032
[7] Brewster, M.E.; Kannan, R., Nonlinear successive over-relaxation, Numer. math., 44, 309-315, (1984) · Zbl 0566.65045
[8] P. Brodatz, Textures — A Photographic Album for Artists and Designers, Dover, New York, 1966.
[9] R.H. Byrd, J. Nocedal, A tool for the analysis of quasi-Newton methods with application to unconstrained minimization, SIAM J. Numer. Anal. (1989) 727-739. · Zbl 0676.65061
[10] Curry, H.B., The method of steepest descent for non-linear minimization problems, Quart. appl. math., 2, 258-261, (1944) · Zbl 0061.26801
[11] Cauchy, A., Méthode générale pour la résolution des systèmes d’équations simultanées, Comp. rend. acad. sci. Paris, 25, 536-538, (1847)
[12] E.K.P. Chong, S.H. Żak, An Introduction to Optimization, Wiley, New York, 1996. · Zbl 0865.90114
[13] J.E. Dennis Jr., R.B. Schnabel, Numerical Methods for Unconstrained Optimization and Nonlinear Equations, Prentice-Hall, Inc., Englewood Cliffs, NJ, 1983.
[14] Fletcher, R.; Reeves, C., Function minimization by conjugate gradients, Comput. J., 7, 149-154, (1964) · Zbl 0132.11701
[15] Gilbert, J.C.; Nocedal, J., Global convergence properties of conjugate gradient methods for optimization, SIAM J. optim., 2, 21-42, (1992) · Zbl 0767.90082
[16] Goldstein, A.A., Cauchy’s method of minimization, Numer. math., 4, 146-150, (1962) · Zbl 0105.10201
[17] Goldstein, A.A., On steepest descent, SIAM J. control, 3, 147-151, (1965) · Zbl 0221.65094
[18] Goldstein, A.A.; Price, J.F., An effective algorithm for minimization, Numer. math., 10, 184-189, (1967) · Zbl 0161.35402
[19] Grapsa, T.N.; Vrahatis, M.N., A dimension-reducing method for unconstrained optimization, J. comput. appl. math., 66, 239-253, (1996) · Zbl 0856.65074
[20] A.V. Fiacco, G.P. McCormick, Nonlinear Programming: Sequential Unconstrained Minimization Techniques, SIAM, Philadelphia, 1990. · Zbl 0713.90043
[21] Haralick, R.; Shanmugan, K.; Dinstein, I., Textural features for image classification, IEEE trans. system, man cybernet., 3, 610-621, (1973)
[22] S. Haykin, Neural Networks: A Comprehensive Foundation, Macmillan College Publishing Company, New York, 1994. · Zbl 0828.68103
[23] Heller, D., A survey of parallel algorithms in numerical linear algebra, SIAM rev., 20, 740-777, (1978) · Zbl 0408.68033
[24] Jacobs, R.A., Increased rates of convergence through learning rate adaptation, Neural networks, 1, 295-307, (1988)
[25] G.D. Magoulas, M.N. Vrahatis, G.S. Androulakis, A new method in neural network supervised training with imprecision, Proceedings of the IEEE Third International Conference on Electronics, Circuits and Systems, 1996, pp. 287-290. · Zbl 0884.68106
[26] Magoulas, G.D.; Vrahatis, M.N.; Androulakis, G.S., Effective backpropagation training with variable stepsize, Neural networks, 10, 69-82, (1997) · Zbl 0891.68090
[27] G.D. Magoulas, M.N. Vrahatis, G.S. Androulakis, Improving the convergence of the backpropagation algorithm using learning rate adaptation methods, Neural Computation 11 (1999) 1769-1796.
[28] Moré, J.J.; Garbow, B.S.; Hillstrom, K.E., Testing unconstrained optimization software, ACM trans. math. software, 7, 17-41, (1981) · Zbl 0454.65049
[29] Nocedal, J., Theory of algorithms for unconstrained optimization, Acta numerica, 1, 199-242, (1992) · Zbl 0766.65051
[30] Ohanian, P.P.; Dubes, R.C., Performance evaluation for four classes of textural features, Pattern recognition, 25, 819-833, (1992)
[31] J.M. Ortega, W.C. Rheinboldt, Iterative Solution of Nonlinear Equations in Several Variables, Academic Press, New York, 1970. · Zbl 0241.65046
[32] Ortega, J.M.; Voigt, R.G., Solution of partial differential equations on vector and parallel computers, SIAM rev., 27, 149-270, (1985) · Zbl 0644.65075
[33] E. Polak, Computational Methods in Optimization, Academic Press, New York, 1971.
[34] E. Polak, Optimization: Algorithms and Consistent Approximations, Springer, New York, 1997. · Zbl 0899.90148
[35] Powell, M.J.D., Direct search algorithms for optimization calculations, Acta numerica, 7, 287-336, (1998) · Zbl 0911.65050
[36] S.S. Rao, Optimization: Theory and Applications, 2nd Edition, 7th reprint, Wiley Eastern Limited, New Delhi, 1992.
[37] D.E. Rumelhart, G.E. Hinton, R.J. Williams, Learning internal representations by error propagation, in: D.E. Rumelhart, J.L. McClelland (Eds.), Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1, MIT Press, 1986, pp. 318-362.
[38] Sperduti, A.; Starita, A., Speed up learning and network optimization with extended back-propagation, Neural networks, 6, 365-383, (1993)
[39] G.W. Stewart, Introduction to Matrix Computations, Academic Press, New York, 1973. · Zbl 0302.65021
[40] Strand, J.; Taxt, T., Local frequency features for texture classification, Pattern recognition, 27, 1397-1406, (1994)
[41] R. Varga, Matrix Iterative Analysis, Prentice-Hall, Inc., Englewood Cliffs, NJ, 1962. · Zbl 0133.08602
[42] Vogl, T.P.; Mangis, J.K.; Rigler, A.K.; Zink, W.T.; Alkon, D.L., Accelerating the convergence of the back-propagation method, Biol. cybernet., 59, 257-263, (1988)
[43] Voigt, R.G., Rates of convergence for a class of iterative procedures, SIAM J. numer. anal., 8, 127-134, (1971) · Zbl 0232.65044
[44] Vrahatis, M.N.; Androulakis, G.S.; Manoussakis, G.E., A new unconstrained optimization method for imprecise function and gradient values, J. math. anal. appl., 197, 586-607, (1996) · Zbl 0887.90166
[45] Wessels, L.F.A.; Barnard, E., Avoiding false local minima by proper initialization of connections, IEEE trans. neural networks, 3, 899-905, (1992)
[46] Wolfe, P., Convergence conditions for ascent methods, SIAM rev., 11, 226-235, (1969) · Zbl 0177.20603
[47] Wolfe, P., Convergence conditions for ascent methods. II: Some corrections, SIAM rev., 13, 185-188, (1971) · Zbl 0216.26901
[48] Young, D., Iterative methods for solving partial difference equations of elliptic type, Trans. amer. math. soc., 76, 92-111, (1954) · Zbl 0055.35704
[49] G. Zoutendijk, Nonlinear programming, computational methods, in: J. Abadie (Ed.), Integer and Nonlinear Programming, North-Holland, Amsterdam, 1970, pp. 37-86. · Zbl 0336.90057