## A nonlinear conjugate gradient method with a strong global convergence property. (English) Zbl 0957.65061

The paper presents a new version of the conjugate gradient method $$x_{k+1}=x_k+\alpha_k d_k,\quad d_{k+1}=-f'(x_{k+1})+\beta_k d_k,\quad \beta_k=\frac{\|f'(x_{k+1})\|^2}{d_k^T\bigl(f'(x_{k+1})-f'(x_k)\bigr)}$$ for solving the unconstrained optimization problem $$\min_{x\in \mathbb R^n} f(x).$$ Unlike the classical exact stepsize rule $$\alpha_k =\operatorname{argmin}\{ f(x_k+\alpha d_k): \alpha \geq 0 \},$$ the authors analyse a scheme in which $$\alpha_k$$ may be chosen arbitrarily subject to the Wolfe conditions $$f(x_k)-f(x_k+\alpha_k d_k) \geq -\delta \alpha_k f'(x_k)^T d_k$$ and $$f'(x_k+\alpha_k d_k)^T d_k \geq \sigma f'(x_k)^T d_k, \qquad 0<\delta <\sigma <1.$$ The convergence $$f'(x_k) \to 0$$ is established provided that $$f$$ is bounded below on the level set $$N=\{x \in \mathbb R^n: f(x) \leq f(x_1)\}$$ and the gradient $$f'$$ is Lipschitz continuous on $$N$$. The boundedness of $$N$$ is not assumed.
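The iteration and stepsize conditions above can be sketched as follows. This is an illustrative NumPy implementation, not the authors' code: the bisection-style line search, the parameter values $\delta=10^{-4}$, $\sigma=0.4$, and the quadratic test function are all assumptions chosen for the example.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, delta=1e-4, sigma=0.4, max_iter=50):
    """Find a stepsize alpha satisfying the two conditions from the review:
       f(x) - f(x + a d) >= -delta * a * f'(x)^T d   (sufficient decrease)
       f'(x + a d)^T d   >=  sigma * f'(x)^T d       (curvature)
    A simple bisection/doubling scheme; any rule meeting the conditions works."""
    g0d = grad(x) @ d
    alpha, lo, hi = 1.0, 0.0, np.inf
    for _ in range(max_iter):
        if f(x + alpha * d) > f(x) + delta * alpha * g0d:
            hi = alpha                       # decrease condition failed: shrink
            alpha = 0.5 * (lo + hi)
        elif grad(x + alpha * d) @ d < sigma * g0d:
            lo = alpha                       # curvature failed: grow / bisect
            alpha = 2 * lo if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha
    return alpha

def dai_yuan_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    """Conjugate gradient iteration with
       beta_k = ||f'(x_{k+1})||^2 / (d_k^T (f'(x_{k+1}) - f'(x_k)))."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x_new = x + alpha * d
        g_new = grad(x_new)
        # Wolfe curvature guarantees the denominator is positive.
        beta = (g_new @ g_new) / (d @ (g_new - g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

# Illustrative test problem: a strictly convex quadratic with minimizer (1, -2).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = A @ np.array([1.0, -2.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = dai_yuan_cg(f, grad, np.zeros(2))
```

Note that the stepsize is accepted as soon as both inequalities hold; no one-dimensional minimization is performed, which is exactly the flexibility the convergence analysis permits.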

### MSC:

- 65K05 Numerical mathematical programming methods
- 90C30 Nonlinear programming
- 90C53 Methods of quasi-Newton type

### Software:

CG_DESCENT; CUTEr; L-BFGS