## Hybrid conjugate gradient algorithm for unconstrained optimization. (English) Zbl 1168.90017

Summary: A new hybrid conjugate gradient algorithm is proposed and analyzed. The parameter $$\beta_k$$ is computed as a convex combination of the Polak-Ribière-Polyak and the Dai-Yuan conjugate gradient parameters, i.e. $$\beta^{N}_{k} = (1-\theta_k)\beta^{PRP}_{k} + \theta_k \beta^{DY}_{k}$$. The parameter $$\theta_k$$ in the convex combination is computed in such a way that the conjugacy condition is satisfied, independently of the line search. The line search uses the standard Wolfe conditions. The algorithm generates descent directions, and when the iterates jam the directions satisfy the sufficient descent condition. Numerical comparisons with conjugate gradient algorithms on a set of 750 unconstrained optimization problems, some of them from the CUTE library, show that this hybrid computational scheme outperforms the known hybrid conjugate gradient algorithms.
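The convex-combination update above can be sketched in code. This is a minimal illustration, not the paper's implementation: it assumes the standard PRP and DY formulas, and obtains $$\theta_k$$ by imposing the conjugacy condition $$d_{k+1}^T y_k = 0$$ on the new direction $$d_{k+1} = -g_{k+1} + \beta^{N}_{k} d_k$$, which reduces to a linear equation in $$\theta_k$$; the exact $$\theta_k$$ formula in the paper may differ, and the clipping of $$\theta_k$$ to $$[0,1]$$ here is an added safeguard to keep the combination convex.

```python
# Hypothetical sketch of the hybrid beta_k computation; the paper's
# exact theta_k formula may differ from this conjugacy-based derivation.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def hybrid_beta(g_new, g_old, d):
    """Return (beta_N, theta) for the update d_new = -g_new + beta_N * d.

    g_new, g_old: gradients at the new and current iterates.
    d: current search direction.
    """
    y = [gn - go for gn, go in zip(g_new, g_old)]          # y_k = g_{k+1} - g_k
    beta_prp = dot(g_new, y) / dot(g_old, g_old)           # Polak-Ribiere-Polyak
    beta_dy = dot(g_new, g_new) / dot(d, y)                # Dai-Yuan
    # Imposing d_{k+1}^T y_k = 0 gives the Hestenes-Stiefel value as target:
    beta_hs = dot(g_new, y) / dot(d, y)
    if beta_dy == beta_prp:
        theta = 0.0
    else:
        theta = (beta_hs - beta_prp) / (beta_dy - beta_prp)
    theta = min(1.0, max(0.0, theta))                      # keep a convex combination
    beta = (1.0 - theta) * beta_prp + theta * beta_dy
    return beta, theta
```

When the unclipped $$\theta_k$$ falls in $$[0,1]$$, the resulting $$\beta^{N}_{k}$$ coincides with the Hestenes-Stiefel value, since that is exactly what the conjugacy condition demands; outside that interval the sketch falls back to the nearer of the two pure methods.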

### MSC:

- 90C52 Methods of reduced gradient type
- 90C30 Nonlinear programming

### Software:

SCALCG; Algorithm 500; CUTE; CUTEr