
A nonlinear conjugate gradient algorithm with an optimal property and an improved Wolfe line search. (English) Zbl 1266.49065

Summary: In this paper, we seek the conjugate gradient direction closest to the direction of the scaled memoryless BFGS method and thereby propose a family of conjugate gradient methods for unconstrained optimization. An improved Wolfe line search is also proposed; it avoids a numerical drawback of the original Wolfe line search and guarantees global convergence of the conjugate gradient method under mild conditions. To accelerate the algorithm, we introduce adaptive restarts along the negative gradient, triggered according to how closely the objective function is approximated by a quadratic over the preceding iterations. Numerical experiments on the CUTEr collection show that the proposed algorithm is promising.
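For orientation, a generic sketch of the nonlinear conjugate gradient iteration and the standard (weak) Wolfe conditions that the proposed line search refines; the particular choice of the parameter \(\beta_k\) derived from the scaled memoryless BFGS direction and the modified second Wolfe condition are specific to the paper and are not reproduced here:
\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k, \qquad d_0 = -g_0,
\]
where \(g_k = \nabla f(x_k)\) and the step length \(\alpha_k\) is required to satisfy
\[
f(x_k + \alpha_k d_k) \le f(x_k) + \rho\,\alpha_k g_k^T d_k, \qquad
\nabla f(x_k + \alpha_k d_k)^T d_k \ge \sigma\, g_k^T d_k, \qquad 0 < \rho < \sigma < 1.
\]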

MSC:

49M37 Numerical methods based on nonlinear programming
90C30 Nonlinear programming

Software:

minpack; CUTEr