
Several guaranteed descent conjugate gradient methods for unconstrained optimization. (English) Zbl 1406.65043

Summary: This paper investigates a general family of guaranteed descent conjugate gradient methods that satisfy the descent condition \(g^T_k d_k\leq -(1-1/(4\theta_k))\| g_k\|^2\) (\(\theta_k>1/4\)) and that are strongly convergent whenever the weak Wolfe line search conditions are fulfilled. Moreover, we present several specific guaranteed descent conjugate gradient methods and report their numerical results on large-scale unconstrained optimization problems.
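The descent condition above holds independently of the line search for suitable search directions. As a hedged illustration (not taken from the paper under review), the well-known Hager–Zhang direction is commonly viewed as the member of such a family with \(\theta_k=2\), for which the bound becomes \(g^T_k d_k\leq -(7/8)\|g_k\|^2\) for arbitrary nonzero vectors. A minimal numerical sketch checking this inequality on random data:

```python
import numpy as np

def hz_direction(g, g_prev, d_prev):
    """Hager-Zhang CG direction; assumed here to correspond to theta_k = 2
    in the general guaranteed-descent family (illustrative, not the paper's
    own method)."""
    y = g - g_prev                      # gradient difference y_k
    dy = d_prev @ y                     # d_{k-1}^T y_k
    beta = (y - 2.0 * d_prev * (y @ y) / dy) @ g / dy
    return -g + beta * d_prev           # d_k = -g_k + beta_k d_{k-1}

rng = np.random.default_rng(0)
theta = 2.0
all_descent = True
for _ in range(1000):
    g = rng.standard_normal(5)
    g_prev = rng.standard_normal(5)
    d_prev = rng.standard_normal(5)
    d = hz_direction(g, g_prev, d_prev)
    # check g_k^T d_k <= -(1 - 1/(4*theta)) ||g_k||^2, up to rounding
    all_descent &= (g @ d) <= -(1.0 - 1.0 / (4.0 * theta)) * (g @ g) + 1e-10
print(all_descent)
```

The inequality holds for every sample because the bound is a property of the direction formula itself, not of any particular line search.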

MSC:

65K05 Numerical mathematical programming methods
90C26 Nonconvex programming, global optimization
90C52 Methods of reduced gradient type