
Accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization. (English) Zbl 1189.90151

Summary: An accelerated scaled memoryless BFGS preconditioned conjugate gradient algorithm for solving unconstrained optimization problems is presented. The basic idea is to combine the scaled memoryless BFGS method and the preconditioning technique in the frame of the conjugate gradient method. The preconditioner, which is also a scaled memoryless BFGS matrix, is reset when the Beale-Powell restart criterion holds. The parameter scaling the gradient is selected as a spectral gradient. The steplength computation exploits the fact that in conjugate gradient algorithms the step lengths may differ from 1 by two orders of magnitude and tend to vary unpredictably; accordingly, an acceleration scheme is suggested that improves the efficiency of the algorithm. Under common assumptions, the method is proved to be globally convergent. It is shown that for uniformly convex functions the convergence of the accelerated algorithm is still linear, but the reduction in the function values is significantly improved. Under mild conditions the algorithm is globally convergent for strongly convex functions. Computational results for a set of 750 unconstrained optimization test problems show that this new accelerated scaled conjugate gradient algorithm substantially outperforms known conjugate gradient methods.
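The ingredients named above — spectral (Barzilai-Borwein) scaling, a memoryless BFGS search direction, a Powell-type restart test, and a curvature-based acceleration of the step length — can be sketched as follows. This is a simplified illustration under stated assumptions, not the paper's exact algorithm: it substitutes a backtracking Armijo line search for the Wolfe line search, and the function name `accelerated_scaled_cg` and all tolerances are hypothetical.

```python
import numpy as np

def accelerated_scaled_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Sketch of an accelerated scaled memoryless-BFGS preconditioned CG iteration.

    Simplified (hypothetical): an Armijo backtracking line search replaces the
    Wolfe line search, and the restart/curvature tolerances are ad hoc.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # first step: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) < tol:
            break
        # Backtracking (Armijo) line search.
        alpha, slope = 1.0, g @ d
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * slope and alpha > 1e-12:
            alpha *= 0.5
        # Acceleration: minimize a quadratic model of f along d, with the
        # directional curvature estimated from one extra gradient evaluation.
        gz = grad(x + alpha * d)
        yd = (gz - g) @ d                     # ~ alpha * d^T (Hessian) d
        t = -alpha * slope / yd if yd > 1e-12 else alpha
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        sy = s @ y
        if sy <= 1e-12 or abs(g_new @ g) >= 0.2 * (g_new @ g_new):
            d = -g_new                        # restart (Powell criterion or lost curvature)
        else:
            theta = (s @ s) / sy              # spectral (Barzilai-Borwein) scaling
            gs, gy, yy = g_new @ s, g_new @ y, y @ y
            # Scaled memoryless BFGS (Shanno-type) search direction.
            d = (-theta * g_new + theta * (gs / sy) * y
                 - ((1.0 + theta * yy / sy) * (gs / sy) - theta * (gy / sy)) * s)
        x, g = x_new, g_new
    return x
```

On a convex quadratic the accelerated step coincides with the exact line minimizer along d, which is why the correction can shrink function values faster than the raw line-search step.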

MSC:

90C30 Nonlinear programming
90C52 Methods of reduced gradient type

References:

[1] Andrei, N., An unconstrained optimization test functions collection, Advanced modeling and optimization. an electronic international journal, 10, 147-161, (2008) · Zbl 1161.90486
[2] Andrei, N., An acceleration of gradient descent algorithm with backtracking for unconstrained optimization, Numerical algorithms, 42, 63-73, (2006) · Zbl 1101.65058
[3] Andrei, N., Scaled conjugate gradient algorithms for unconstrained optimization, Computational optimization and applications, 38, 401-416, (2007) · Zbl 1168.90608
[4] Andrei, N., Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Optimization methods and software, 22, 561-571, (2007) · Zbl 1270.90068
[5] Andrei, N., A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Applied mathematics letters, 20, 645-650, (2007) · Zbl 1116.90114
[6] Andrei, N., A scaled nonlinear conjugate gradient algorithm for unconstrained optimization, Optimization. A journal of mathematical programming and operations research, 57, 549-570, (2008) · Zbl 1338.90457
[7] Andrei, N., A Dai-Yuan conjugate gradient algorithm with sufficient descent and conjugacy conditions for unconstrained optimization, Applied mathematics letters, 21, 165-171, (2008) · Zbl 1165.90683
[8] Andrei, N., Numerical comparison of conjugate gradient algorithms for unconstrained optimization, Studies in informatics and control, 16, 333-352, (2007)
[9] Andrei, N., Acceleration of conjugate gradient algorithms for unconstrained optimization, Applied mathematics and computation, 213, 361-369, (2009) · Zbl 1172.65027
[10] N. Andrei, Accelerated hybrid conjugate gradient algorithm with modified secant condition for unconstrained optimization, Numerical Algorithms, doi:10.1007/s11075-009-9321-0. · Zbl 1192.65074
[11] Birgin, E.; Martínez, J.M., A spectral conjugate gradient method for unconstrained optimization, Applied mathematics and optimization, 43, 117-128, (2001) · Zbl 0990.90134
[12] Bongartz, I.; Conn, A.R.; Gould, N.I.M.; Toint, P.L., CUTE: constrained and unconstrained testing environments, ACM transactions on mathematical software, 21, 123-160, (1995) · Zbl 0886.65058
[13] Dai, Y.H., New properties of a nonlinear conjugate gradient method, Numerische Mathematik, 89, 83-98, (2001) · Zbl 1006.65063
[14] Dai, Y.H.; Liao, L.Z., New conjugate conditions and related nonlinear conjugate gradient methods, Applied mathematics and optimization, 43, 87-101, (2001) · Zbl 0973.65050
[15] Dai, Y.H.; Yuan, Y., Convergence properties of the Beale-Powell restart algorithm, Science in China (series A), 41, 1142-1150, (1998) · Zbl 0919.65042
[16] Dai, Y.H.; Yuan, Y., Global convergence of the method of shortest residuals, Numerische Mathematik, 83, 581-598, (1999) · Zbl 0949.65065
[17] Dai, Y.H.; Yuan, Y., An efficient hybrid conjugate gradient method for unconstrained optimization, Annals of operations research, 103, 33-47, (2001) · Zbl 1007.90065
[18] Daniel, J.W., The conjugate gradient method for linear and nonlinear operator equations, SIAM journal on numerical analysis, 4, 10-26, (1967) · Zbl 0154.40302
[19] Dolan, E.D.; Moré, J.J., Benchmarking optimization software with performance profiles, Mathematical programming, 91, 201-213, (2002) · Zbl 1049.90004
[20] Fletcher, R.; Reeves, C.M., Function minimization by conjugate gradients, Computer journal, 7, 149-154, (1964) · Zbl 0132.11701
[21] Gilbert, J.Ch.; Nocedal, J., Global convergence properties of conjugate gradient methods for optimization, SIAM journal on optimization, 2, 21-42, (1992) · Zbl 0767.90082
[22] Hager, W.W.; Zhang, H., A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM journal on optimization, 16, 170-192, (2005) · Zbl 1093.90085
[23] Hager, W.W.; Zhang, H., Algorithm 851: CG-DESCENT, a conjugate gradient method with guaranteed descent, ACM transactions on mathematical software, 32, 113-137, (2006) · Zbl 1346.90816
[24] Hager, W.W.; Zhang, H., A survey of nonlinear conjugate gradient methods, Pacific journal of optimization, 2, 35-58, (2006) · Zbl 1117.90048
[25] Hestenes, M.R.; Stiefel, E., Methods of conjugate gradients for solving linear systems, Journal of research of the national bureau of standards section B, 48, 409-436, (1952) · Zbl 0048.09901
[26] Liu, D.C.; Nocedal, J., On the limited memory BFGS method for large scale optimization, Mathematical programming, 45, 503-528, (1989) · Zbl 0696.90048
[27] Nash, S.G., Preconditioning of truncated-Newton methods, SIAM journal on scientific and statistical computing, 6, 599-616, (1985) · Zbl 0592.65038
[28] Nocedal, J., Conjugate gradient methods and nonlinear optimization, (), 9-23 · Zbl 0866.65037
[29] Oren, S.S.; Luenberger, D.G., Self-scaling variable metric algorithm. part I, Management science, 20, 845-862, (1976) · Zbl 0316.90064
[30] Oren, S.S.; Spedicato, E., Optimal conditioning of self-scaling variable metric algorithms, Mathematical programming, 10, 70-90, (1976) · Zbl 0342.90045
[31] J.M. Perry, A class of conjugate gradient algorithms with a two step variable metric memory, Discussion Paper 269, Center for Mathematical Studies in Economics and Management Science, Northwestern University, 1977.
[32] Polak, E.; Ribière, G., Note sur la convergence de méthodes de directions conjuguées, Revue française d'informatique et de recherche opérationnelle, 16, 35-43, (1969) · Zbl 0174.48001
[33] Polyak, B.T., The conjugate gradient method in extreme problems, USSR computational mathematics and mathematical physics, 9, 94-112, (1969) · Zbl 0229.49023
[34] Powell, M.J.D., Some convergence properties of the conjugate gradient method, Mathematical programming, 11, 42-49, (1976) · Zbl 0356.65056
[35] Powell, M.J.D., Restart procedures for the conjugate gradient method, Mathematical programming, 12, 241-254, (1977) · Zbl 0396.90072
[36] Pytlak, R., On the convergence of conjugate gradient algorithms, IMA journal of numerical analysis, 14, 443-460, (1994) · Zbl 0830.65052
[37] Pytlak, R., Conjugate gradient algorithms in nonconvex optimization, (2009), Springer Heidelberg · Zbl 1171.49002
[38] Raydan, M., The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem, SIAM journal on optimization, 7, 26-33, (1997) · Zbl 0898.90119
[39] Shanno, D.F., Conditioning of quasi-Newton methods for function minimization, Mathematics of computation, 24, 647-657, (1970) · Zbl 0225.65073
[40] Shanno, D.F., Conjugate gradient methods with inexact searches, Mathematics of operations research, 3, 244-256, (1978) · Zbl 0399.90077
[41] Shanno, D.F., On the convergence of a new conjugate gradient algorithm, SIAM journal on numerical analysis, 15, 1247-1257, (1978) · Zbl 0408.90071
[42] Shanno, D.F.; Phua, K.H., Algorithm 500: minimization of unconstrained multivariate functions, ACM transactions on mathematical software, 2, 87-94, (1976) · Zbl 0319.65042
[43] Shanno, D.F.; Phua, K.H., Matrix conditioning and nonlinear optimization, Mathematical programming, 14, 149-160, (1978) · Zbl 0371.90109
[44] Wolfe, P., Convergence conditions for ascent methods, SIAM review, 11, 226-235, (1969) · Zbl 0177.20603
[45] Wolfe, P., Convergence conditions for ascent methods II: some corrections, SIAM review, 13, 185-188, (1971) · Zbl 0216.26901
[46] Zhang, J.Z.; Deng, N.Y.; Chen, L.H., New quasi-Newton equation and related methods for unconstrained optimization, Journal of optimization theory and applications, 102, 147-167, (1999) · Zbl 0991.90135