Wen, Fenghua; Dai, Zhifeng
Modified Yabe-Takano nonlinear conjugate gradient method. (English) Zbl 1266.90127
Pac. J. Optim. 8, No. 2, 347-360 (2012).

Summary: Motivated by the idea of W. W. Hager and H. Zhang [SIAM J. Optim. 16, No. 1, 170-192 (2005; Zbl 1093.90085)], we propose a modified Yabe-Takano nonlinear conjugate gradient method. An attractive property of the modified method is that the directions it generates are always descent directions, independently of the line search used. Under some mild conditions, we prove that the modified method with the Wolfe line search is globally convergent even if the objective function is nonconvex. Numerical results on large-scale unconstrained optimization test problems show that the modified method is competitive with the well-known CG-DESCENT method.

Cited in 4 Documents

MSC:
90C06 Large-scale problems in mathematical programming
90C26 Nonconvex programming, global optimization
65Y20 Complexity and performance of numerical algorithms

Keywords: unconstrained optimization; conjugate gradient method; sufficient descent property; global convergence

Citations: Zbl 1093.90085
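The record does not give the authors' modified Yabe-Takano update formula, but the cited Hager-Zhang idea it builds on is public: a nonlinear conjugate gradient method whose beta parameter is constructed (and truncated from below) so that every search direction satisfies a sufficient descent condition regardless of the line search. A minimal illustrative sketch of that scheme, using the Hager-Zhang beta with a simple bisection weak Wolfe line search (not the authors' method, and not the production CG-DESCENT code):

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=60):
    """Crude bisection search for the weak Wolfe conditions (illustration only)."""
    alpha, lo, hi = 1.0, 0.0, np.inf
    fx, slope = f(x), grad(x) @ d          # slope < 0 for a descent direction
    for _ in range(max_iter):
        xa = x + alpha * d
        if f(xa) > fx + c1 * alpha * slope:    # Armijo condition fails: shrink
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(xa) @ d < c2 * slope:        # curvature condition fails: grow
            lo = alpha
            alpha = 2.0 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def hz_cg_sketch(f, grad, x0, tol=1e-6, max_iter=500):
    """Nonlinear CG with the Hager-Zhang beta (CG-DESCENT style sketch)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g, np.inf) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g                          # gradient difference y_k
        dy = d @ y                             # > 0 under the Wolfe conditions
        # Hager-Zhang beta: yields d_{k+1}^T g_{k+1} <= -(7/8)||g_{k+1}||^2,
        # i.e. sufficient descent for any line search
        beta = ((y - 2.0 * d * (y @ y) / dy) @ g_new) / dy
        # truncation from below, as in Hager-Zhang, to keep global convergence
        eta = -1.0 / (np.linalg.norm(d) * min(0.01, np.linalg.norm(g)))
        beta = max(beta, eta)
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Convex quadratic test: f(x) = 0.5 x^T A x - b^T x, minimizer A^{-1} b
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    b = np.array([1.0, 1.0])
    f = lambda x: 0.5 * x @ A @ x - b @ x
    grad = lambda x: A @ x - b
    x_star = hz_cg_sketch(f, grad, [0.0, 0.0])
    print(x_star, np.linalg.norm(grad(x_star)))
```

The truncation `max(beta, eta)` is the step that makes the descent property independent of the line search; the Yabe-Takano family modified in the paper replaces the Hager-Zhang beta with a Yabe-Takano-type formula while preserving that guarantee.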