The authors study the conjugate gradient method for solving large-scale nonlinear optimization problems. The first two sections provide the necessary background on conjugate gradient methods in general and on possible choices of the conjugacy condition. The third section contains the paper's main contribution: a new conjugacy condition derived from a new quasi-Newton equation. This equation exploits not only gradient information but also function value information.
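For context, the classical secant (quasi-Newton) equation and one representative function-value-based modification studied in the literature can be written as follows; the exact coefficients used by the authors may differ from this illustrative form.

```latex
% Classical secant equation: only gradient differences enter.
% B_{k+1} approximates the Hessian, s_k = x_{k+1} - x_k, y_k = g_{k+1} - g_k.
\[
  B_{k+1} s_k = y_k .
\]
% A representative modified secant equation that also incorporates function
% values f_k, f_{k+1} (illustrative; the authors' equation may differ):
\[
  B_{k+1} s_k = z_k, \qquad
  z_k = y_k + \frac{\theta_k}{s_k^{\top} s_k}\, s_k, \qquad
  \theta_k = 2\bigl(f_k - f_{k+1}\bigr) + \bigl(g_k + g_{k+1}\bigr)^{\top} s_k .
\]
```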
Several theorems are then proved, establishing properties of the proposed conjugacy condition and the convergence behavior of the resulting algorithm. The paper concludes with a section reporting numerical experiments and a list of relevant references.
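To fix ideas, the general class of methods under review can be sketched as a nonlinear conjugate gradient iteration; the sketch below uses the classical Hestenes-Stiefel update with a backtracking Armijo line search, not the authors' new conjugacy condition.

```python
# Minimal nonlinear conjugate gradient sketch (illustrative background only;
# uses the classical Hestenes-Stiefel beta, NOT the paper's new condition).
import numpy as np

def nonlinear_cg(f, grad, x0, tol=1e-8, max_iter=1000):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking line search satisfying a simple Armijo condition.
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * (g @ d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g  # gradient difference entering the conjugacy condition
        # Hestenes-Stiefel beta; enforces d_{k+1}^T y_k = 0 on quadratics
        # with exact line searches.
        beta = (g_new @ y) / (d @ y) if abs(d @ y) > 1e-16 else 0.0
        d = -g_new + beta * d
        if g_new @ d >= 0:  # safeguard: restart with steepest descent
            d = -g_new
        x, g = x_new, g_new
    return x

# Example: minimize a convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 0.5], [0.5, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = nonlinear_cg(f, grad, np.zeros(2))
```

On a quadratic objective such as this one, the iteration recovers the solution of the linear system `A x = b`; the paper's interest is in the nonlinear, large-scale setting, where the choice of conjugacy condition matters.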