Multivariate spectral gradient method for unconstrained optimization. (English) Zbl 1155.65046
The authors present the multivariate spectral gradient (MSG) method for solving unconstrained optimization problems. By incorporating a quasi-Newton property, the MSG method allows an individual adaptive stepsize along each coordinate direction, which guarantees that the method is finitely convergent for positive definite quadratics. In particular, it converges in no more than two steps for positive definite quadratics with diagonal Hessian, and converges quadratically for objective functions with positive definite diagonal Hessian. Moreover, global convergence of the MSG algorithm is established by means of a nonmonotone line search. A numerical comparison of the MSG algorithm with the global Barzilai-Borwein (GBB) algorithm is also given. The search direction of the MSG method is close to that presented in the paper by {\it M. N. Vrahatis, G. S. Androulakis, J. N. Lambrinos} and {\it G. D. Magoulas} [J. Comput. Appl. Math. 114, 367--386 (2000; Zbl 0958.65072)], but the rationale for the steplength selection is different: here the stepsize is selected from estimates of the eigenvalues of the Hessian, rather than from a local estimate of the Lipschitz constant as in the above-mentioned paper. Finally, numerical results are reported which show that the method is promising and deserves further study.
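The coordinate-wise stepsize idea described above can be sketched as follows. This is a hypothetical minimal implementation, not the authors' exact algorithm: the nonmonotone line search that secures global convergence in the paper is omitted, and the diagonal Hessian estimate is obtained from a simple componentwise secant condition $D_{k+1}s_k = y_k$, safeguarded to stay positive and bounded.

```python
import numpy as np

def msg_minimize(grad, x0, max_iter=500, tol=1e-8,
                 lam_min=1e-10, lam_max=1e10):
    """Simplified multivariate spectral gradient sketch: each
    coordinate i gets its own stepsize 1/d_i, where d_i is a
    diagonal estimate of the Hessian from y_i / s_i."""
    x = x0.astype(float)
    g = grad(x)
    d = np.ones_like(x)              # initial diagonal Hessian estimate
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - g / d            # componentwise spectral step
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        with np.errstate(divide="ignore", invalid="ignore"):
            d = y / s                # componentwise secant: D s = y
        # safeguard: keep the eigenvalue estimates positive and bounded
        d = np.where(np.isfinite(d) & (d > 0), d, 1.0)
        d = np.clip(d, lam_min, lam_max)
        x, g = x_new, g_new
    return x
```

For a positive definite quadratic with diagonal Hessian, the secant quotient `y / s` recovers the Hessian diagonal exactly after the first step, so the second step lands on the minimizer, which is consistent with the two-step convergence noted in the review.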

65K05 Mathematical programming (numerical methods)
90C30 Nonlinear programming
90C53 Methods of quasi-Newton type
Full Text: DOI
[1] Barzilai, J.; Borwein, J. M.: Two-point step size gradient methods. IMA J. Numer. Anal. 8, 141-148 (1988) · Zbl 0638.65055
[2] Raydan, M.: On the Barzilai and Borwein choice of steplength for the gradient method. IMA J. Numer. Anal. 13, 321-326 (1993) · Zbl 0778.65045
[3] Raydan, M.: The Barzilai and Borwein gradient method for the large scale unconstrained minimization problem. SIAM J. Optim. 7, 26-33 (1997) · Zbl 0898.90119
[4] Grippo, L.; Lampariello, F.; Lucidi, S.: A nonmonotone line search technique for Newton's method. SIAM J. Numer. Anal. 23, 707-716 (1986) · Zbl 0616.65067
[5] Fletcher, R.: On the Barzilai--Borwein method. Numerical Analysis Report NA/207, October 2001 · Zbl 1118.90318
[6] Dai, Y. H.; Liao, L. Z.: R-linear convergence of the Barzilai and Borwein gradient method. IMA J. Numer. Anal. 22, 1-10 (2002) · Zbl 1002.65069
[7] Molina, B.; Raydan, M.: Preconditioned Barzilai--Borwein method for the numerical solution of partial differential equations. Numer. Algor. 13, 45-60 (1996) · Zbl 0861.65025
[8] Dai, Y. H.; Yuan, J. Y.; Yuan, Y. X.: Modified two-point stepsize gradient methods for unconstrained optimization. Comput. Optim. Appl. 22, 103-109 (2002) · Zbl 1008.90056
[9] Luengo, F.; Raydan, M.; Glunt, W.; Hayden, T. L.: Preconditioned spectral gradient method. Numer. Algor. 30, 241-258 (2002) · Zbl 1027.90110
[10] Dai, Y. H.; Zhang, H. C.: Adaptive two-point stepsize gradient algorithm. Numer. Algor. 27, 377-385 (2001) · Zbl 0992.65063
[11] Grippo, L.; Sciandrone, M.: Nonmonotone globalization techniques for the Barzilai--Borwein gradient method. Comput. Optim. Appl. 23, 143-169 (2002) · Zbl 1028.90061
[12] Vrahatis, M. N.; Androulakis, G. S.; Lambrinos, J. N.; Magoulas, G. D.: A class of gradient unconstrained minimization algorithms with adaptive stepsize. J. Comput. Appl. Math. 114, 367-386 (2000) · Zbl 0958.65072
[13] Zhang, H. C.; Hager, W. W.: A nonmonotone line search technique and its application to unconstrained optimization. SIAM J. Optim. 14, 1043-1056 (2004) · Zbl 1073.90024
[14] Dai, Y. H.: On the nonmonotone line search. J. Optim. Theory Appl. 112, 315-330 (2002) · Zbl 1049.90087
[15] Shi, Z. J.; Shen, J.: Convergence of nonmonotone line search method. J. Comput. Appl. Math. 193, 397-412 (2006) · Zbl 1136.90477
[16] Shi, Z. J.: Convergence of line search methods for unconstrained optimization. Appl. Math. Comput. 157, 393-405 (2004) · Zbl 1072.65087
[17] Sun, W.; Han, J.; Sun, J.: Global convergence of nonmonotone descent methods for unconstrained optimization problems. J. Comput. Appl. Math. 146, 89-98 (2002) · Zbl 1007.65044
[18] Sun, W. Y.: Nonmonotone trust region method for solving optimization problems. Appl. Math. Comput. 156, 159-174 (2004) · Zbl 1059.65055