Hager, William W.; Zhang, Hongchao
A new conjugate gradient method with guaranteed descent and an efficient line search. (English) Zbl 1093.90085
SIAM J. Optim. 16, No. 1, 170-192 (2005).

Summary: A new nonlinear conjugate gradient method and an associated implementation, based on an inexact line search, are proposed and analyzed. With exact line search, our method reduces to a nonlinear version of the Hestenes-Stiefel conjugate gradient scheme. For any (inexact) line search, our scheme satisfies the descent condition \(\mathbf{g}_k^T \mathbf{d}_k \le -\frac{7}{8}\|\mathbf{g}_k\|^2\). Moreover, a global convergence result is established when the line search fulfills the Wolfe conditions. A new line search scheme is developed that is both efficient and highly accurate. Efficiency is achieved by exploiting properties of linear interpolants in a neighborhood of a local minimizer. High accuracy is achieved by using a convergence criterion, which we call the "approximate Wolfe" conditions, obtained by replacing the sufficient decrease criterion in the Wolfe conditions with an approximation that can be evaluated with greater precision in a neighborhood of a local minimum. Numerical comparisons are given with both L-BFGS and conjugate gradient methods, using the unconstrained optimization problems in the CUTE library.

Reviewer: Klaus Schittkowski (Bayreuth)

Cited in 19 Reviews; cited in 462 Documents.

MSC:
90C52 Methods of reduced gradient type
90C06 Large-scale problems in mathematical programming

Keywords: conjugate gradient method; unconstrained optimization; convergence; line search; Wolfe conditions; nonlinear programming; global convergence; CUTE

Software: L-BFGS; NAPACK; CG_DESCENT
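
For orientation, the direction update behind the descent bound quoted in the summary is commonly stated as below, with \(y_k = g_{k+1} - g_k\); this is a sketch of the scheme as it is usually cited, and the paper should be consulted for the authoritative statement.

```latex
% Hager-Zhang direction update (sketch; y_k = g_{k+1} - g_k)
\begin{aligned}
d_{k+1} &= -g_{k+1} + \beta_k^{\mathrm{HZ}} d_k, \qquad d_0 = -g_0,\\
\beta_k^{\mathrm{HZ}} &= \frac{1}{d_k^T y_k}
  \left( y_k - 2\, d_k \,\frac{\|y_k\|^2}{d_k^T y_k} \right)^{\!T} g_{k+1}.
\end{aligned}
```

With an exact line search, \(d_k^T g_{k+1} = 0\), so the second term inside the parentheses drops out and \(\beta_k^{\mathrm{HZ}}\) reduces to the Hestenes-Stiefel choice \(\beta_k^{\mathrm{HS}} = y_k^T g_{k+1} / d_k^T y_k\), which is the reduction the summary mentions.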
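The "approximate Wolfe" acceptance test described in the summary can be sketched in a few lines. The sketch below is illustrative only: the function name, argument list, and default parameter values are assumptions made for this example, not the interface of the paper's CG_DESCENT code.

```python
def wolfe_or_approx_wolfe(alpha, phi0, dphi0, phi_a, dphi_a,
                          delta=0.1, sigma=0.9, eps_k=0.0):
    """Accept the step `alpha` if it satisfies the ordinary Wolfe
    conditions or the "approximate Wolfe" conditions (illustrative sketch).

    phi(a)  = f(x_k + a*d_k)        (phi0 = phi(0), phi_a = phi(alpha))
    dphi(a) = g(x_k + a*d_k)^T d_k  (dphi0 < 0 for a descent direction)
    delta, sigma: parameters with 0 < delta < 1/2 <= sigma < 1; the
    defaults here are conventional choices, not the paper's settings.
    """
    # Ordinary Wolfe conditions: sufficient decrease plus curvature.
    wolfe = (phi_a <= phi0 + delta * alpha * dphi0
             and dphi_a >= sigma * dphi0)

    # Approximate Wolfe conditions: the sufficient-decrease inequality is
    # replaced by the derivative-only test dphi(alpha) <= (2*delta-1)*dphi0,
    # which can be evaluated to higher relative accuracy near a minimizer,
    # where phi(alpha) - phi0 is dominated by rounding error.  It is applied
    # only while phi(alpha) stays within a small tolerance eps_k of phi0.
    approx = (sigma * dphi0 <= dphi_a <= (2.0 * delta - 1.0) * dphi0
              and phi_a <= phi0 + eps_k)

    return wolfe or approx
```

The derivative-only inequality is exact for a quadratic objective, which is why replacing the sufficient decrease test with it loses nothing asymptotically while gaining numerical precision near the minimizer.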