Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization. (English) Zbl 1170.65046

A new algorithm for accelerating the conjugate gradient method is developed by computing the parameter \(\beta_k\) from a finite difference approximation of the Hessian/vector product. The search direction is likewise computed using a forward difference approximation of the Hessian/vector product. In contrast to Newton and quasi-Newton methods, for the conjugate gradient method the step lengths may differ substantially from 1, depending on how the problem is scaled. The author suggests a modification of the step length which reduces the number of function evaluations compared with currently available conjugate gradient algorithms.
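The central device is the forward difference approximation \(\nabla^2 f(x)v \approx (\nabla f(x+\varepsilon v)-\nabla f(x))/\varepsilon\), which needs only one extra gradient evaluation per product. A minimal sketch of the idea follows; the quadratic test function and the Daniel-type choice of \(\beta_k\) (conjugacy with respect to the Hessian, as in [14]) are illustrative assumptions, not the exact formula of the paper:

```python
import numpy as np

def hessian_vector(grad, x, v):
    """Forward-difference Hessian/vector product:
    Hv ~ (grad f(x + eps*v) - grad f(x)) / eps."""
    # Standard scale-aware choice of the differencing step (an assumption here)
    eps = np.sqrt(np.finfo(float).eps) * (1.0 + np.linalg.norm(x)) \
          / max(np.linalg.norm(v), 1e-16)
    return (grad(x + eps * v) - grad(x)) / eps

# Illustrative convex quadratic f(x) = 0.5 x^T A x, so grad f(x) = A x
A = np.array([[4.0, 1.0], [1.0, 3.0]])
grad = lambda x: A @ x

x = np.array([1.0, 2.0])      # current iterate
d = np.array([0.5, -1.0])     # current search direction

Hd = hessian_vector(grad, x, d)       # approximates A @ d
# Daniel-type conjugacy parameter computed from the FD product
beta = (grad(x) @ Hd) / (d @ Hd)
```

Because the test function is quadratic, the gradient is linear in \(x\) and the forward difference reproduces \(A d\) to rounding error; for general nonlinear \(f\) the approximation carries an \(O(\varepsilon)\) truncation error.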
It is proved that the method is globally convergent and that its convergence rate is linear for uniformly convex functions; nevertheless, the reduction of the function values is significantly improved. The performance of the suggested method is demonstrated on a set of 750 large-scale unconstrained test problems by comparing it with the conjugate gradient algorithms CONMIN and SCALCG and with truncated Newton methods.
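Comparisons of this kind are typically summarized with the performance profiles of Dolan and Moré [10]: for each solver, one plots the fraction of problems it solves within a factor \(\tau\) of the best solver's cost. A minimal sketch, assuming cost data (e.g. CPU time or function evaluations) are given as a problems-by-solvers matrix with `inf` marking failures:

```python
import numpy as np

def performance_profile(T, taus):
    """T[p, s]: cost of solver s on problem p (np.inf = failure).
    Returns rho[s, i] = fraction of problems solved by solver s
    within a factor taus[i] of the best solver on that problem."""
    best = T.min(axis=1, keepdims=True)   # best cost per problem
    ratios = T / best                     # performance ratios r_{p,s}
    return np.array([[np.mean(ratios[:, s] <= tau) for tau in taus]
                     for s in range(T.shape[1])])

# Toy data: three solvers on four problems (solver 2 fails on problem 2)
T = np.array([[1.0, 2.0, 4.0],
              [3.0, 1.5, 3.0],
              [2.0, 2.0, np.inf],
              [1.0, 1.0, 1.0]])
rho = performance_profile(T, taus=[1.0, 2.0, 4.0])
```

The value \(\rho_s(1)\) is the fraction of problems on which solver \(s\) is (tied for) best, while \(\rho_s(\tau)\) for large \(\tau\) approaches its overall success rate.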


65K05 Numerical mathematical programming methods
90C06 Large-scale problems in mathematical programming
90C30 Nonlinear programming
Full Text: DOI


[1] Shanno, D.F.; Phua, K.H., Algorithm 500, minimization of unconstrained multivariate functions, ACM trans. math. softw., 2, 87-94, (1976) · Zbl 0319.65042
[2] Andrei, N., Scaled conjugate gradient algorithms for unconstrained optimization, Comput. optim. appl., 38, 401-416, (2007) · Zbl 1168.90608
[3] Andrei, N., Scaled memoryless BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Optim. methods softw., 22, 561-571, (2007) · Zbl 1270.90068
[4] Andrei, N., A scaled BFGS preconditioned conjugate gradient algorithm for unconstrained optimization, Appl. math. lett., 20, 645-650, (2007) · Zbl 1116.90114
[5] Li, G.; Tang, C.; Wei, Z., New conjugacy condition and related new conjugate gradient methods for unconstrained optimization, J. comput. appl. math., 202, 523-539, (2007) · Zbl 1116.65069
[6] Nash, S.G., Preconditioning of truncated-Newton methods, SIAM J. sci. statist. comput., 6, 599-616, (1985) · Zbl 0592.65038
[7] Wolfe, P., Convergence conditions for ascent methods, SIAM rev., 11, 226-235, (1969) · Zbl 0177.20603
[8] Wolfe, P., Convergence conditions for ascent methods II: some corrections, SIAM rev., 13, 185-188, (1971) · Zbl 0216.26901
[9] Hager, W.W.; Zhang, H., A survey of nonlinear conjugate gradient methods, Pacific J. optim., 2, 35-58, (2006) · Zbl 1117.90048
[10] Dolan, E.D.; Moré, J.J., Benchmarking optimization software with performance profiles, Math. program., 91, 201-213, (2002) · Zbl 1049.90004
[11] Andrei, N., Numerical comparison of conjugate gradient algorithms for unconstrained optimization, Stud. inf. control, 16, 333-352, (2007)
[12] Nocedal, J., Conjugate gradient methods and nonlinear optimization, (), 9-23 · Zbl 0866.65037
[13] Andrei, N., An unconstrained optimization test functions collection, Advanced modeling and optimization, Electron. internat. J., 10, 147-161, (2008) · Zbl 1161.90486
[14] Daniel, J.W., The conjugate gradient method for linear and nonlinear operator equations, SIAM J. numer. anal., 4, 10-26, (1967) · Zbl 0154.40302
[15] Dai, Y.H.; Han, J.Y.; Liu, G.H.; Sun, D.F.; Yin, X.; Yuan, Y., Convergence properties of nonlinear conjugate gradient methods, SIAM J. optim., 10, 348-358, (1999)
[16] Andrei, N., An acceleration of gradient descent algorithm with backtracking for unconstrained optimization, Numer. algorithms, 42, 63-73, (2006) · Zbl 1101.65058
[17] Liu, D.C.; Nocedal, J., On the limited memory BFGS method for large scale optimization, Math. program., 45, 503-528, (1989) · Zbl 0696.90048
[18] Goldstein, A.A., On steepest descent, SIAM J. control, 3, 147-151, (1965) · Zbl 0221.65094
[19] Bongartz, I.; Conn, A.R.; Gould, N.I.M.; Toint, Ph.L., CUTE: constrained and unconstrained testing environment, ACM trans. math. software, 21, 123-160, (1995) · Zbl 0886.65058
[20] Hestenes, M.R.; Stiefel, E.L., Methods of conjugate gradients for solving linear systems, J. res. natl. bur. stand., 49, 409-436, (1952) · Zbl 0048.09901
[21] Polak, E.; Ribière, G., Note sur la convergence de méthodes de directions conjuguées, Rev. française informat. recherche opérationnelle, 3e année, 16, 35-43, (1969) · Zbl 0174.48001
[22] Polyak, B.T., The conjugate gradient method in extreme problems, USSR comput. math. math. phys., 9, 94-112, (1969) · Zbl 0229.49023
[23] Dai, Y.H.; Yuan, Y., An efficient hybrid conjugate gradient method for unconstrained optimization, Ann. oper. res., 103, 33-47, (2001) · Zbl 1007.90065
[24] Dai, Y.H.; Liao, L.Z., New conjugacy conditions and related nonlinear conjugate gradient methods, Appl. math. optim., 43, 87-101, (2001) · Zbl 0973.65050