Powell, M. J. D. Nonconvex minimization calculations and the conjugate gradient method. (English) Zbl 0531.65035
Numerical analysis, Proc. 10th bienn. Conf., Dundee/Scotl. 1983, Lect. Notes Math. 1066, 122-141 (1984).

Summary: [For the entire collection see Zbl 0527.00028.] We consider the global convergence of conjugate gradient methods without restarts, assuming exact arithmetic and exact line searches, when the objective function is twice continuously differentiable and has bounded level sets. Most of our attention is given to the Polak-Ribière algorithm, and unfortunately we find examples showing that the calculated gradients can remain bounded away from zero. The examples, which have only two variables, also show that some variable metric algorithms for unconstrained optimization need not converge. However, a global convergence theorem is proved for the Fletcher-Reeves version of the conjugate gradient method.

Cited in 4 Reviews; cited in 86 Documents.

MSC:
65K05 Numerical mathematical programming methods
90C30 Nonlinear programming

Keywords: global convergence; conjugate gradient methods; exact line searches; Polak-Ribière algorithm; variable metric algorithms; Fletcher-Reeves

Citations: Zbl 0527.00028
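The two methods contrasted in the summary differ only in the formula for the parameter β used to build the next search direction. The following sketch (not from the paper; the test problem is an arbitrary two-variable convex quadratic of my choosing) shows both β formulas with exact line searches. On a convex quadratic the two variants coincide and converge; Powell's counterexamples for Polak-Ribière require carefully constructed nonconvex functions and do not arise here.

```python
# Nonlinear conjugate gradient sketch contrasting the Fletcher-Reeves and
# Polak-Ribiere beta formulas.  Illustrative only: run on a 2-variable
# convex quadratic f(x) = 0.5 x^T A x - b^T x, where exact line searches
# are available in closed form.

def matvec(A, x):
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cg_quadratic(A, b, x, beta_rule, iters=10):
    """Minimize 0.5 x^T A x - b^T x with exact line searches."""
    g = [gi - bi for gi, bi in zip(matvec(A, x), b)]   # gradient: A x - b
    d = [-gi for gi in g]                              # start with steepest descent
    for _ in range(iters):
        Ad = matvec(A, d)
        denom = dot(d, Ad)
        if denom == 0.0:
            break
        alpha = -dot(g, d) / denom                     # exact line search step
        x = [xi + alpha * di for xi, di in zip(x, d)]
        g_new = [gi + alpha * adi for gi, adi in zip(g, Ad)]
        if dot(g_new, g_new) < 1e-20:                  # gradient vanished: done
            break
        beta = beta_rule(g, g_new)
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g = g_new
    return x

# beta_FR = ||g_new||^2 / ||g||^2
beta_fr = lambda g, g_new: dot(g_new, g_new) / dot(g, g)
# beta_PR = g_new . (g_new - g) / ||g||^2
beta_pr = lambda g, g_new: dot(g_new, [a - b for a, b in zip(g_new, g)]) / dot(g, g)

A = [[3.0, 1.0], [1.0, 2.0]]   # symmetric positive definite (assumed test case)
b = [1.0, 1.0]                 # exact minimizer solves A x = b, i.e. (0.2, 0.4)
x_fr = cg_quadratic(A, b, [0.0, 0.0], beta_fr)
x_pr = cg_quadratic(A, b, [0.0, 0.0], beta_pr)
```

On this quadratic both runs reach the minimizer (0.2, 0.4) in two iterations, as linear CG theory predicts; the divergence behaviour Powell exhibits for Polak-Ribière only appears in the nonconvex setting the paper studies.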