
Global convergence properties of nonlinear conjugate gradient methods with modified secant condition. (English) Zbl 1056.90130
Summary: Conjugate gradient methods are appealing for large-scale nonlinear optimization problems. Recently, expecting fast convergence, Dai and Liao (2001) used the secant condition of quasi-Newton methods. In this paper, we make use of the modified secant condition given by Zhang et al. (1999) and Zhang and Xu (2001) and propose a new conjugate gradient method following Dai and Liao (2001). A new feature is that this method uses both the available gradient and function value information and achieves higher-order accuracy in approximating the second-order curvature of the objective function. The method is shown to be globally convergent under some assumptions. Numerical results are reported.
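For orientation, the following is a minimal sketch of the quantities involved, written in the standard notation of the cited works; the symbols \(s_k\), \(y_k\), \(u_k\), \(\theta_k\) and the parameter \(t\) are assumed from those references and do not appear in this summary, and the authors' precise formula may differ in details such as scaling.
% Secant condition of quasi-Newton methods, with s_k = x_{k+1} - x_k and
% y_k = g_{k+1} - g_k, as used by Dai and Liao (2001):
\[
  B_{k+1} s_k = y_k .
\]
% Modified secant condition of Zhang et al. (1999) and Zhang and Xu (2001),
% which also incorporates the function values f_k and f_{k+1}:
\[
  B_{k+1} s_k = \tilde{y}_k, \qquad
  \tilde{y}_k = y_k + \frac{\theta_k}{s_k^{\top} u_k}\, u_k, \qquad
  \theta_k = 6\bigl(f_k - f_{k+1}\bigr) + 3\bigl(g_k + g_{k+1}\bigr)^{\top} s_k,
\]
% where u_k is any vector with s_k^T u_k \neq 0. Using \tilde{y}_k in place of
% y_k in the Dai--Liao parameter yields a conjugate gradient direction of the form
\[
  d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad
  \beta_k = \frac{g_{k+1}^{\top}\bigl(\tilde{y}_k - t\, s_k\bigr)}{d_k^{\top} \tilde{y}_k},
  \qquad t > 0 .
\]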

MSC:
90C30 Nonlinear programming
90C52 Methods of reduced gradient type
Software:
minpack