
An efficient hybrid conjugate gradient method for unconstrained optimization. (English) Zbl 1007.90065

Summary: We recently proposed a nonlinear conjugate gradient method that produces a descent search direction at every iteration and converges globally provided the line search satisfies the weak Wolfe conditions. In this paper, we study methods related to this new conjugate gradient method. Specifically, if the scalar \(\beta_k\), measured relative to the one in the new method, belongs to a certain interval, then the corresponding methods are proved to be globally convergent; otherwise, we construct a convex quadratic example showing that the methods need not converge. Numerical experiments are carried out for two combinations of the new method and the Hestenes-Stiefel conjugate gradient method. The initial results show that one of the hybrid methods is especially efficient for the given test problems.
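The summary does not state the update formulas, but a common realization of such a hybrid clips \(\beta_k\) between zero and the value from the new (Dai-Yuan type) method, \(\beta_k = \max\{0, \min\{\beta_k^{HS}, \beta_k^{DY}\}\}\). The following Python sketch illustrates one such hybrid conjugate gradient iteration on a convex quadratic, where an exact line search is available in closed form; the specific \(\beta_k^{HS}\), \(\beta_k^{DY}\), and clipping rule are illustrative assumptions, not a verbatim transcription of the paper's method.

```python
import numpy as np

# Hedged sketch of a hybrid conjugate gradient iteration on a convex
# quadratic f(x) = 0.5 x^T A x - b^T x, where the exact line search step
# has a closed form. The rule beta_k = max(0, min(beta_HS, beta_DY)) is a
# standard hybrid of the Hestenes-Stiefel and Dai-Yuan formulas, assumed
# here for illustration.

def hybrid_cg_quadratic(A, b, x0, tol=1e-10, max_iter=500):
    x = x0.astype(float)
    g = A @ x - b                           # gradient of the quadratic
    d = -g                                  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ d) / (d @ (A @ d))    # exact line search step length
        x = x + alpha * d
        g_new = A @ x - b
        y = g_new - g                       # gradient difference y_{k-1}
        beta_hs = (g_new @ y) / (d @ y)     # Hestenes-Stiefel scalar
        beta_dy = (g_new @ g_new) / (d @ y) # Dai-Yuan scalar
        beta = max(0.0, min(beta_hs, beta_dy))
        d = -g_new + beta * d               # hybrid CG direction update
        g = g_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    M = rng.standard_normal((20, 20))
    A = M.T @ M + 20 * np.eye(20)           # symmetric positive definite
    b = rng.standard_normal(20)
    x = hybrid_cg_quadratic(A, b, np.zeros(20))
    print("residual norm:", np.linalg.norm(A @ x - b))
```

On a strictly convex quadratic with an exact line search, \(d^\top y > 0\) at every iteration, so both scalars are well defined; for general objectives the weak Wolfe conditions mentioned in the summary play the analogous role of keeping \(d^\top y\) positive.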

MSC:

90C30 Nonlinear programming
49M37 Numerical methods based on nonlinear programming
65K05 Numerical mathematical programming methods