Global convergence result for conjugate gradient methods. (English) Zbl 0794.90063

Summary: Conjugate gradient optimization algorithms are defined by the search directions \(s^{(1)}= -g^{(1)}\), \(s^{(k+1)}=- g^{(k+1)}+ \beta^{(k)}s^{(k)}\), \(k\geq 1\), where \(g^{(k)}\) is the gradient of the objective at the \(k\)th iterate; different methods arise from different choices of the scalar \(\beta^{(k)}\). In this note, conditions are given on \(\beta^{(k)}\) that ensure global convergence of the resulting algorithms.
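The recurrence above can be sketched concretely. The following is a minimal illustration, not the method of the paper: it applies the conjugate gradient recurrence to a convex quadratic \(f(x)=\tfrac12 x^TAx-b^Tx\), using an exact line search and the Fletcher-Reeves choice \(\beta^{(k)}=\|g^{(k+1)}\|^2/\|g^{(k)}\|^2\) as one representative \(\beta^{(k)}\); the function name and parameters are illustrative only.

```python
import numpy as np

def cg_quadratic(A, b, x0, tol=1e-10, max_iter=100):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    by the conjugate gradient recurrence s^(k+1) = -g^(k+1) + beta^(k) s^(k)."""
    x = np.asarray(x0, dtype=float)
    g = A @ x - b                         # g^(1): gradient at the initial point
    s = -g                                # s^(1) = -g^(1)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ s) / (s @ A @ s)    # exact line search along s
        x = x + alpha * s
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)  # Fletcher-Reeves beta^(k)
        s = -g_new + beta * s             # s^(k+1) = -g^(k+1) + beta^(k) s^(k)
        g = g_new
    return x
```

On a quadratic with exact line searches this reduces to linear CG and terminates in at most \(n\) steps; for general nonlinear objectives the line search is inexact, which is exactly the setting in which the conditions on \(\beta^{(k)}\) studied here matter.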


90C30 Nonlinear programming
Full Text: DOI


[1] Al-Baali, M., Descent Property and Global Convergence of the Fletcher-Reeves Method with Inexact Line Searches, IMA Journal of Numerical Analysis, Vol. 5, No. 1, pp. 121-124, 1985. · Zbl 0578.65063
[2] Powell, M. J. D., Nonconvex Minimization Calculations and the Conjugate Gradient Method, Report No. DAMTP 1983/NA14, Department of Applied Mathematics and Theoretical Physics, University of Cambridge, Cambridge, England, 1983. · Zbl 0531.65035
[3] Touati-Ahmed, D., and Storey, C., Globally Convergent Hybrid Conjugate Gradient Methods, Journal of Optimization Theory and Applications, Vol. 64, No. 2, pp. 379-397, 1990. · Zbl 0687.90081
[4] Gilbert, J. C., and Nocedal, J., Global Convergence Properties of Conjugate Gradient Methods for Optimization, Rapport de Recherche No. 1268, Institut National de Recherche en Informatique et Automatique, Domaine de Voluceau, Rocquencourt, Le Chesnay, France, 1990.