## Global convergence result for conjugate gradient methods. (English) Zbl 0794.90063

Summary: Conjugate gradient optimization algorithms are defined by the search directions $$s^{(1)}= -g^{(1)}$$, $$s^{(k+1)}=- g^{(k+1)}+ \beta^{(k)}s^{(k)}$$, $$k\geq 1$$, with different methods arising from different choices of the scalar $$\beta^{(k)}$$. In this note, conditions on $$\beta^{(k)}$$ are given that ensure global convergence of the resulting algorithms.
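The direction recursion above can be illustrated with a minimal sketch. The following is not the paper's analysis, only an assumed instance of the scheme: it applies the recursion $$s^{(k+1)}=-g^{(k+1)}+\beta^{(k)}s^{(k)}$$ to a convex quadratic, using the Fletcher-Reeves choice $$\beta^{(k)}=\|g^{(k+1)}\|^2/\|g^{(k)}\|^2$$ as one concrete example of the scalar whose properties determine convergence.

```python
import numpy as np

def cg_quadratic(A, b, x0, tol=1e-10, max_iter=50):
    """Minimize f(x) = 0.5 x^T A x - b^T x (A symmetric positive definite)
    via the conjugate gradient direction recursion
        s^(1) = -g^(1),   s^(k+1) = -g^(k+1) + beta^(k) s^(k).
    Here beta^(k) is the Fletcher-Reeves scalar, one illustrative choice;
    the note's conditions concern which such choices yield global convergence.
    """
    x = np.asarray(x0, dtype=float)
    g = A @ x - b          # gradient of the quadratic model
    s = -g                 # s^(1) = -g^(1)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -(g @ s) / (s @ (A @ s))   # exact line search (quadratic case)
        x = x + alpha * s
        g_new = A @ x - b
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves beta^(k)
        s = -g_new + beta * s              # s^(k+1) = -g^(k+1) + beta^(k) s^(k)
        g = g_new
    return x
```

On a quadratic with exact line searches this recursion terminates in at most $$n$$ steps; the global-convergence question addressed by the note concerns general nonlinear objectives, where the choice of $$\beta^{(k)}$$ matters.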

### MSC:

 90C30 Nonlinear programming