
Accelerated gradient descent methods with line search. (English) Zbl 1198.65104

The authors consider the unconstrained minimization problem on \(\mathbb{R}^n\) with a twice differentiable objective function \(f: \mathbb{R}^n\to\mathbb{R}\). A class of gradient descent methods is introduced in which the iteration step size is multiplied by an appropriate acceleration parameter; the step size itself is computed by a line search procedure. The methods of this class are called accelerated gradient descent algorithms with line search.
In the second part of the paper, a particular accelerated gradient descent method derived from Newton's method with line search is proposed. The acceleration parameter of this method is obtained by replacing the Hessian with an appropriately generated diagonal matrix. Linear convergence of the proposed algorithm is proved for uniformly convex objective functions satisfying some additional conditions. The reported numerical results show that the proposed method outperforms alternative methods known from the literature.
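To make the general scheme concrete, the following minimal Python sketch implements one possible iteration of this type: the step size is found by a backtracking (Armijo) line search, and the acceleration parameter gamma is a scalar stand-in for the Hessian. The secant-style update of gamma used here is an illustrative assumption, not the exact formula proposed in the paper.

```python
import numpy as np

def accelerated_gd(f, grad, x0, tol=1e-6, max_iter=1000, beta=0.5, sigma=1e-4):
    """Accelerated gradient descent with backtracking (Armijo) line search.

    gamma acts as a scalar substitute for the Hessian; the secant-style
    update below is an illustrative choice, not the paper's formula.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    gamma = 1.0                                   # initial acceleration parameter
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        d = -g / gamma                            # accelerated descent direction
        t, fx = 1.0, f(x)
        # backtracking line search: shrink t until the Armijo condition holds
        while f(x + t * d) > fx + sigma * t * g.dot(d) and t > 1e-16:
            t *= beta
        x_new = x + t * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        # scalar "diagonal Hessian" estimate from a secant condition,
        # kept positive so that d remains a descent direction
        if s.dot(s) > 0:
            gamma = max(y.dot(s) / s.dot(s), 1e-12)
        x, g = x_new, g_new
    return x

# Usage example: minimize a simple convex quadratic f(x) = 0.5*x'Ax - b'x
A = np.diag([1.0, 10.0, 100.0])
b = np.ones(3)
x_star = accelerated_gd(lambda x: 0.5 * x.dot(A @ x) - b.dot(x),
                        lambda x: A @ x - b,
                        x0=np.zeros(3))
```

On the quadratic example above the iteration reduces to a scaled steepest-descent method, which illustrates the role of the acceleration parameter but not the convergence analysis of the paper, which concerns uniformly convex functions.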

MSC:

65K05 Numerical mathematical programming methods
