## Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization. (English) Zbl 1170.65046

A new algorithm for accelerating the conjugate gradient method is developed by computing the parameter $$\beta_k$$ using a finite difference approximation of the Hessian/vector product. The search direction is likewise computed using a forward difference approximation of the Hessian/vector product. In contrast to Newton and quasi-Newton methods, the step lengths in the conjugate gradient method may differ substantially from 1 depending on how the problem is scaled. The authors suggest a modification of the step length which reduces the number of function evaluations compared to currently available conjugate gradient algorithms.
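The forward difference approximation mentioned above replaces an exact Hessian/vector product $$\nabla^2 f(x)\,v$$ with a difference of gradients, avoiding any explicit second-derivative computation. A minimal sketch (the function names `hvp_forward_diff` and the quadratic test function are illustrative, not from the paper):

```python
import numpy as np

def hvp_forward_diff(grad, x, v, eps=1e-6):
    """Forward difference approximation of the Hessian/vector product:
    nabla^2 f(x) v  ~=  (grad(x + eps*v) - grad(x)) / eps,
    requiring one extra gradient evaluation and no explicit Hessian."""
    return (grad(x + eps * v) - grad(x)) / eps

# Illustrative check on a quadratic f(x) = 0.5 x^T A x, whose gradient is
# A x and whose Hessian is the constant matrix A, so the exact product is A v.
A = np.array([[3.0, 1.0],
              [1.0, 2.0]])
grad = lambda x: A @ x
x = np.array([1.0, -1.0])
v = np.array([0.5, 2.0])

approx = hvp_forward_diff(grad, x, v)
exact = A @ v
```

For a quadratic the gradient is linear, so the forward difference is exact up to floating-point rounding; for general nonlinear functions the truncation error is of order `eps`, which motivates the paper's use of such products inside $$\beta_k$$ rather than as a standalone Hessian approximation.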
It is proved that the method is globally convergent and that its convergence rate is linear for uniformly convex functions, while the reduction of function values is significantly improved. The performance of the suggested method is demonstrated on 750 large-scale unconstrained test problems by comparing it with conjugate gradient algorithms such as CONMIN and SCALCG and with truncated Newton methods.

### MSC:

- 65K05 Numerical mathematical programming methods
- 90C06 Large-scale problems in mathematical programming
- 90C30 Nonlinear programming

### Software:

CUTEr; CUTE; SCALCG; Algorithm 500; L-BFGS; tn; CONMIN