This paper establishes several new properties of the nonlinear conjugate gradient method of Y. H. Dai and Y. Yuan [SIAM J. Optim. 10, No. 1, 177-182 (1999; Zbl 0957.65061)]. Firstly, the method is proved to possess a certain self-adjusting property that is independent of the line search and of the convexity of the objective function. Secondly, under mild assumptions on the objective function, the method is shown to be globally convergent with a variety of line searches. Thirdly, the author finds that, instead of the negative gradient direction, the search direction defined by the Dai-Yuan conjugate gradient method [loc. cit.] can be used to restart any optimization method while guaranteeing its global convergence. Some numerical results are also presented.
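For readers unfamiliar with the method under review, the following is a minimal sketch of a nonlinear conjugate gradient iteration using the Dai-Yuan coefficient $\beta_k = \|g_k\|^2 / (d_{k-1}^T (g_k - g_{k-1}))$. The backtracking Armijo line search and the restart safeguard are illustrative choices, not part of the reviewed paper (whose analysis covers a variety of line searches); all function and parameter names are hypothetical.

```python
import numpy as np

def dai_yuan_cg(f, grad, x0, max_iter=500, tol=1e-8):
    """Sketch of nonlinear CG with the Dai-Yuan beta.

    beta_k = ||g_k||^2 / (d_{k-1}^T (g_k - g_{k-1})),
    paired here with a simple backtracking Armijo line search
    (an illustrative choice; the paper's analysis is not tied to it).
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # safeguard: fall back to -g if d is not a descent direction
        if g.dot(d) >= 0:
            d = -g
        # backtracking Armijo line search (illustrative choice)
        alpha, c, rho = 1.0, 1e-4, 0.5
        while f(x + alpha * d) > f(x) + c * alpha * g.dot(d):
            alpha *= rho
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g  # gradient difference g_k - g_{k-1}
        denom = d.dot(y)
        # Dai-Yuan coefficient; restart with beta = 0 if denominator degenerates
        beta = g_new.dot(g_new) / denom if abs(denom) > 1e-16 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

As a quick sanity check, applying this sketch to the strictly convex quadratic $f(x) = \sum_i (x_i - 1)^2$ from any starting point drives the iterates to the minimizer $x^* = (1, \dots, 1)$.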