## A tool for the analysis of quasi-Newton methods with application to unconstrained minimization. (English) Zbl 0676.65061

The object of this paper is to present a very general result about the BFGS update formula and to introduce some tools that are very useful in the convergence analysis of quasi-Newton methods of the form $$x_{k+1}=x_k+\alpha_k d_k$$ for the unconstrained optimization problem $$\min f(x)$$, $$x\in \mathbb{R}^n$$, where $$\alpha_k$$ is the steplength and $$d_k$$ is a search direction of the form $$d_k=-B_k^{-1}g_k$$. Here $$g_k$$ is the gradient of $$f$$ at $$x_k$$, and the matrix $$B_k$$ is updated at every step by means of the quasi-Newton (BFGS) update formula $$B_{k+1}=B_k-\frac{B_k s_k s_k^T B_k}{s_k^T B_k s_k}+\frac{y_k y_k^T}{y_k^T s_k},$$ where $$y_k=g_{k+1}-g_k$$ and $$s_k=x_{k+1}-x_k$$.
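As a minimal illustration (not taken from the paper itself), the BFGS update formula above can be sketched in NumPy; the function names and the quadratic test function are assumptions made purely for this sketch:

```python
import numpy as np

def bfgs_update(B, s, y):
    """One BFGS update of the Hessian approximation B.

    B : current symmetric positive definite approximation (n x n)
    s : step s_k = x_{k+1} - x_k
    y : gradient difference y_k = g_{k+1} - g_k
    Requires the curvature condition y^T s > 0, which keeps the
    updated matrix symmetric positive definite.
    """
    Bs = B @ s
    return B - np.outer(Bs, Bs) / (s @ Bs) + np.outer(y, y) / (y @ s)
```

By construction the updated matrix satisfies the secant equation $$B_{k+1}s_k=y_k$$: the first two terms cancel on $$s_k$$, and the last term reduces to $$y_k$$.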
The authors first establish, under very mild conditions, a basic property of the BFGS update formula. The main result is a proof that the BFGS method, with any of a wide variety of line searches, is globally and superlinearly convergent on uniformly convex objective functions. Furthermore, it is shown that a backtracking line search satisfies the conditions of the given convergence results.
Reviewer: H.Benker

### MSC:

 65K05 Numerical mathematical programming methods
 90C25 Convex programming