Modified Yabe-Takano nonlinear conjugate gradient method. (English) Zbl 1266.90127

Summary: Motivated by the idea of W. W. Hager and H. Zhang [SIAM J. Optim. 16, No. 1, 170–192 (2005; Zbl 1093.90085)], we propose a modified Yabe-Takano nonlinear conjugate gradient method. An attractive property of the modified method is that the directions it generates are always descent directions, independently of the line search used. Under mild conditions, we prove that the modified method with a Wolfe line search is globally convergent even when the objective function is nonconvex. Numerical results on large-scale unconstrained optimization problems show that the modified method is competitive with the well-known CG-DESCENT method.
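The general scheme described above can be sketched as follows. This is a minimal, generic nonlinear conjugate gradient illustration only: the summary does not state the modified Yabe-Takano update, so the code below uses a Polak-Ribière+ coefficient as a placeholder, a simple bisection search for the weak Wolfe conditions, and a steepest-descent restart to guarantee descent directions. All function and parameter names are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection search for a step size satisfying the weak Wolfe conditions."""
    lo, hi, alpha = 0.0, np.inf, 1.0
    fx, slope = f(x), grad(x) @ d  # slope < 0 for a descent direction d
    for _ in range(max_iter):
        if f(x + alpha * d) > fx + c1 * alpha * slope:  # Armijo condition fails
            hi = alpha
        elif grad(x + alpha * d) @ d < c2 * slope:      # curvature condition fails
            lo = alpha
        else:
            return alpha
        alpha = 0.5 * (lo + hi) if np.isfinite(hi) else 2.0 * lo
    return alpha

def nonlinear_cg_sketch(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic nonlinear CG loop (Polak-Ribiere+ beta as a stand-in formula)."""
    x = x0.astype(float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Placeholder conjugacy coefficient; the paper's modified
        # Yabe-Takano formula would replace this line.
        beta = max(0.0, g_new @ (g_new - g) / (g @ g))
        d = -g_new + beta * d
        if g_new @ d >= 0:  # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x
```

For example, minimizing the convex quadratic f(x) = x^T A x / 2 - b^T x with a symmetric positive definite A recovers the solution of A x = b.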


90C06 Large-scale problems in mathematical programming
90C26 Nonconvex programming, global optimization
65Y20 Complexity and performance of numerical algorithms


Zbl 1093.90085