Global convergence properties of conjugate gradient methods for optimization. (English) Zbl 0767.90082

This paper studies the convergence properties of nonlinear conjugate gradient methods, without restarts and with practical line searches, for the problem \(\min_{x\in\mathbb{R}^n} f(x)\). The iterates and search directions are generated by \[ x_{k+1}=x_k+\alpha_k d_k,\quad\text{where } d_k=\begin{cases} -g_k, & \text{for } k=1,\\ -g_k+\beta_k d_{k-1}, & \text{for } k\geq 2. \end{cases} \] Various choices of \(\beta_k\) and inexact line searches that yield global convergence are considered. The analysis is closely related to the methods of Fletcher-Reeves and Polak-Ribière. Numerical experiments are presented.
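The iteration described above can be sketched in code. The following is a minimal illustration, not the paper's implementation: it uses a simple Armijo backtracking line search as a stand-in for the inexact (Wolfe-type) line searches analyzed in the paper, and shows the two \(\beta_k\) formulas the analysis centers on, Fletcher-Reeves \(\beta_k^{FR}=\|g_k\|^2/\|g_{k-1}\|^2\) and Polak-Ribière \(\beta_k^{PR}=g_k^T(g_k-g_{k-1})/\|g_{k-1}\|^2\). The function names and the steepest-descent safeguard are illustrative choices, not from the source.

```python
import numpy as np

def nonlinear_cg(f, grad, x0, beta_rule="PR", tol=1e-8, max_iter=1000):
    """Nonlinear conjugate gradient without restarts (illustrative sketch).

    beta_rule: "FR" (Fletcher-Reeves) or "PR" (Polak-Ribiere).
    The Armijo backtracking line search here is a simple stand-in for
    the practical inexact line searches considered in the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                       # d_1 = -g_1
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        if g.dot(d) >= 0:        # safeguard (not in the source): if d_k is
            d = -g               # not a descent direction, fall back to -g_k
        # backtracking (Armijo) search for the step length alpha_k
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d    # x_{k+1} = x_k + alpha_k d_k
        g_new = grad(x_new)
        if beta_rule == "FR":
            beta = g_new.dot(g_new) / g.dot(g)
        else:                    # "PR"
            beta = g_new.dot(g_new - g) / g.dot(g)
        d = -g_new + beta * d    # d_{k+1} = -g_{k+1} + beta_{k+1} d_k
        x, g = x_new, g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
grad = lambda x: A.dot(x) - b
x_star = nonlinear_cg(f, grad, np.zeros(2), beta_rule="PR")
```

On a strictly convex quadratic both \(\beta_k\) formulas coincide with linear CG when the line search is exact; their behavior differs, and is the subject of the paper's analysis, on general nonlinear \(f\).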


90C52 Methods of reduced gradient type
90C30 Nonlinear programming
90-08 Computational methods for problems pertaining to operations research and mathematical programming
65K05 Numerical mathematical programming methods
Full Text: DOI Link