An only 2-step \(Q\)-superlinear convergence example for some algorithms that use reduced Hessian approximations. (English) Zbl 0565.90060

It is shown by example that reduced Hessian methods for constrained optimization, which are known to converge 2-step \(Q\)-superlinearly, may fail to converge \(Q\)-superlinearly.
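The distinction at issue can be illustrated numerically. The following sketch (an illustrative sequence chosen for exposition, not the paper's actual counterexample) builds error norms \(e_k\) whose two-step ratios \(e_{k+2}/e_k\) tend to zero while the one-step ratios \(e_{k+1}/e_k\) do not:

```python
# Illustrative sequence (assumed for exposition, not the paper's example):
# error norms e_k that are 2-step Q-superlinear but not Q-superlinear.

# a_k = 2**(-k**2) converges Q-superlinearly: a_{k+1}/a_k = 2**-(2k+1) -> 0.
a = [2.0 ** (-(k ** 2)) for k in range(6)]

# Duplicating each term stalls every other step: e = a0, a0, a1, a1, ...
e = [v for v in a for _ in range(2)]

one_step = [e[k + 1] / e[k] for k in range(len(e) - 1)]
two_step = [e[k + 2] / e[k] for k in range(len(e) - 2)]

# One-step ratios equal 1.0 on every duplicated step, so e_k is not
# Q-superlinear; two-step ratios shrink toward 0, so it is 2-step
# Q-superlinear.
print(max(one_step[-4:]))  # stays at 1.0
print(max(two_step[-4:]))  # already below 0.01
```

The paper's contribution is to exhibit this same gap in the iterates actually produced by reduced Hessian algorithms, which is far subtler than constructing an abstract sequence.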
Reviewer: Yaxiang Yuan


90C30 Nonlinear programming
49M37 Numerical methods based on nonlinear programming
65K05 Numerical mathematical programming methods
Full Text: DOI


[1] R.H. Byrd, ”An example of irregular convergence in some constrained optimization methods that use the projected Hessian”, Mathematical Programming, this issue. · Zbl 0576.90079
[2] J. Goodman, ”Newton’s method for constrained optimization”, Courant Institute of Mathematical Sciences, New York University, New York (1982). · Zbl 0589.90065
[3] J. Nocedal and M. Overton, ”Projected Hessian updating algorithms for nonlinear constrained optimization”, Report 59, Computer Science Department, New York University, New York (1983). · Zbl 0593.65043
[4] M. Overton, personal communication, 1984.
[5] M.J.D. Powell, ”The convergence of variable metric methods for nonlinearly constrained optimization calculations”, in: O.L. Mangasarian, R. Meyer and S. Robinson, eds., Nonlinear Programming 3 (Academic Press, New York and London, 1978). · Zbl 0464.65042
[6] J. Stoer, ”Foundations of recursive quadratic programming methods for solving nonlinear programs”, Institut für Angewandte Mathematik und Statistik, Universität Würzburg, West Germany, presented at the NATO ASI on Computational Mathematical Programming, Bad Windsheim, West Germany (1984).