## Acceleration of conjugate gradient algorithms for unconstrained optimization. (English) Zbl 1172.65027

A new approach for accelerating conjugate gradient methods is presented. The proposed method modifies the steplength $$\alpha_k$$, computed by the Wolfe line search conditions, by means of a positive parameter $$\eta_k$$, in order to improve the behavior of classical conjugate gradient algorithms, which are mainly applied to large-scale unconstrained optimization. It is shown that for uniformly convex functions the convergence of the accelerated algorithm remains linear, but the reduction in function values is significantly improved. Numerical comparisons with conjugate gradient algorithms on a set of 750 unconstrained optimization problems show that the accelerated computational scheme outperforms the corresponding conjugate gradient algorithms.
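The review does not reproduce the paper's formula for $$\eta_k$$, so the following is a minimal sketch of the structural idea only: run a classical conjugate gradient iteration, obtain $$\alpha_k$$ from a Wolfe line search, and then rescale the step by a positive factor $$\eta_k$$. Here $$\eta_k$$ is derived from a one-dimensional quadratic interpolation of the derivative along the search direction, used as an illustrative stand-in for the paper's rule; the function names, the Polak-Ribière update, and the fallback logic are assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import line_search

def accelerated_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Polak-Ribiere CG with a multiplicative steplength acceleration.

    The factor eta below is an illustrative quadratic-interpolation
    correction along d_k, standing in for the paper's eta_k rule.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Steplength alpha_k satisfying the (strong) Wolfe conditions.
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:                    # line search failed: restart
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g)[0] or 1e-4
        # Acceleration phase: rescale alpha_k by a positive eta_k.
        z = x + alpha * d
        gz = grad(z)
        a = alpha * g.dot(d)                 # slope term, negative on descent
        b = alpha * (gz - g).dot(d)          # curvature-like term, positive
        eta = -a / b if b > 0 else 1.0       # eta = 1 recovers the plain step
        x_new = x + eta * alpha * d
        g_new = grad(x_new)
        # Polak-Ribiere(+) conjugacy parameter.
        beta = max(0.0, g_new.dot(g_new - g) / g.dot(g))
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x

if __name__ == "__main__":
    # Small convex test problem, chosen for illustration only.
    f = lambda x: x.dot(x) + (x[0] - 1.0) ** 4
    g = lambda x: 2.0 * x + np.array([4.0 * (x[0] - 1.0) ** 3, 0.0])
    print(accelerated_cg(f, g, np.array([3.0, -2.0])))
```

With $$\eta_k = 1$$ the iteration reduces to the classical conjugate gradient step, so an acceleration of this kind is a drop-in modification of the line-search phase rather than a new search direction.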

### MSC:

65K05 Numerical mathematical programming methods
90C30 Nonlinear programming

### Software:

CUTEr; Algorithm 500; SCALCG; CUTE; L-BFGS