
Convergence of descent method without line search. (English) Zbl 1084.65064

Descent methods of inexact line search type, \(x_{k+1}:= x_k+ \alpha_k d_k\), are considered, where \(d_k\) is the descent direction and \(\alpha_k\) the step size coefficient. No one-dimensional subproblem needs to be solved; instead, explicit formulas for \(\alpha_k\) are presented. Two methods are discussed. The first is applicable if the gradient \(g\) of the objective \(f\) is Lipschitz continuous and the Lipschitz constant \(L\) is easy to estimate. The coefficient then has the form \(\alpha_k= -g(x_k)^T d_k/(L_k\| d_k\|^2)\), where the \(L_k\) are approximations of \(L\) that play a role in the convergence proofs.
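A minimal sketch of one iteration of the first method, assuming a routine that evaluates the gradient and a given estimate \(L_k\) of the Lipschitz constant; the steepest-descent direction \(d_k=-g(x_k)\) is used purely for illustration, as the review does not fix a particular choice of \(d_k\):

```python
import numpy as np

def descent_step(x, grad, L_k, d=None):
    """One iteration x_{k+1} = x_k + alpha_k * d_k without line search,
    using the explicit step size alpha_k = -g(x_k)^T d_k / (L_k ||d_k||^2)."""
    g = grad(x)
    if d is None:
        d = -g  # illustrative choice of descent direction (steepest descent)
    alpha = -np.dot(g, d) / (L_k * np.dot(d, d))
    return x + alpha * d
```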
The second method is applicable if the second derivative of \(f\) is bounded by some constant, say \(M\), and \(M\) is easy to estimate. In this case \(\alpha_k= -g(x_k)^T d_k/(M_k\| d_k\|^2)\), where the \(M_k\) are approximations of \(M\) that play a role in the convergence proofs.
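As a usage example (not taken from the paper), consider the convex quadratic \(f(x)=\tfrac12 x^T A x\) with \(A\) symmetric positive definite; there the Lipschitz constant of the gradient and the bound on the second derivative both equal \(\|A\|\), so the two step-size formulas coincide when \(L_k=M_k=\|A\|\). Using the sketch above:

```python
# Convex quadratic f(x) = 0.5 * x^T A x with symmetric positive definite A.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
grad = lambda x: A @ x              # gradient of f
L = np.linalg.norm(A, 2)            # spectral norm = Lipschitz constant of grad f

x = np.array([1.0, 1.0])
for _ in range(50):
    x = descent_step(x, grad, L)
print(x)  # close to the minimizer at the origin
```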
Global convergence as well as a linear rate of convergence is shown for both methods.

MSC:

65K05 Numerical mathematical programming methods
90C30 Nonlinear programming
