zbMATH — the first resource for mathematics
A new two-step gradient-type method for large-scale unconstrained optimization. (English) Zbl 1198.90395
Summary: We propose improvements to a new gradient-type method for solving large-scale unconstrained optimization problems, in which data from the two previous steps are used to revise the current approximate Hessian. The new method resembles the Barzilai-Borwein (BB) method. Its innovative feature is a diagonal-matrix approximation of the Hessian based on the modified weak secant equation, in place of the multiple of the identity matrix used in the BB method. This approach yields a higher-order accurate Hessian approximation than other existing BB-type methods. By incorporating a simple monotone strategy, global convergence of the new method is achieved. Practical insight into the effectiveness of the proposed method is given by numerical comparison with the BB method and its variant.
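For orientation, the classical BB iteration that the summary contrasts against can be sketched as follows. This is a minimal illustration, not the authors' method: the test problem, starting point, initial step size, and step-size safeguard are all assumptions made for the sketch.

```python
import numpy as np

def bb_gradient(grad, x0, tol=1e-8, max_iter=1000):
    """Barzilai-Borwein two-point step-size gradient method (sketch).

    Uses the BB1 step alpha_k = (s^T s) / (s^T y), where s and y are the
    differences of successive iterates and gradients, respectively.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    alpha = 1e-4  # initial step size (arbitrary choice for this sketch)
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g        # step and gradient differences
        sy = s @ y
        alpha = (s @ s) / sy if sy > 0 else 1e-4  # safeguard when s^T y <= 0
        x, g = x_new, g_new
    return x

# Convex quadratic f(x) = 0.5 * x^T A x, whose unique minimizer is the origin.
A = np.diag([1.0, 10.0])
x_star = bb_gradient(lambda x: A @ x, np.array([1.0, 1.0]))
```

The BB step size reuses only first-order information from two consecutive iterates; the method reviewed here replaces the implicit scalar Hessian approximation `1/alpha * I` with a full diagonal matrix built from a modified weak secant equation.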
90C52 Methods of reduced gradient type
65K10 Optimization techniques (numerical methods)
[1] Barzilai, J.; Borwein, J. M.: Two-point step size gradient methods, IMA J. Numer. Anal. 8, 141-148 (1988) · Zbl 0638.65055 · doi:10.1093/imanum/8.1.141
[2] Akaike, H.: On a successive transformation of probability distribution and its application to the analysis of the optimum gradient method, Ann. Inst. Statist. Math. 11, 1-17 (1959) · Zbl 0100.14002 · doi:10.1007/BF01831719
[3] Dai, Y. H.; Liao, L. Z.: R-linear convergence of the Barzilai and Borwein gradient method, IMA J. Numer. Anal. 22, 1-10 (2002) · Zbl 1002.65069 · doi:10.1093/imanum/22.1.1
[4] Bin, Z.; Gao, L.; Dai, Y. H.: Monotone projected gradient methods for large-scale box-constrained quadratic programming, Sci. China Ser. A 49, 688-702 (2006) · Zbl 1112.90056 · doi:10.1007/s11425-006-0688-2
[5] Dai, Y. H.; Fletcher, R.: Projected Barzilai-Borwein methods for large-scale box-constrained quadratic programming, Numer. Math. 100, 21-47 (2005) · Zbl 1068.65073 · doi:10.1007/s00211-004-0569-y
[6] Raydan, M.: On the Barzilai and Borwein choice of steplength for the gradient method, IMA J. Numer. Anal. 13, 321-326 (1993) · Zbl 0778.65045 · doi:10.1093/imanum/13.3.321
[7] Hassan, M. A.; Leong, W. J.; Farid, M.: A new gradient method via quasi-Cauchy relation which guarantees descent, J. Comput. Appl. Math. 230, 300-305 (2009) · Zbl 1179.65067 · doi:10.1016/j.cam.2008.11.013
[8] Leong, W. J.; Hassan, M. A.; Farid, M.: A monotone gradient method via weak secant equation for unconstrained optimization, Taiwanese J. Math. 14, No. 2, 413-423 (2010) · Zbl 1203.90148
[9] Ford, J. A.: Implicit updates in multi-step quasi-Newton methods, Comput. Math. Appl. 42, 1083-1091 (2001) · Zbl 0989.65063 · doi:10.1016/S0898-1221(01)00223-1
[10] Ford, J. A.; Moghrabi, I. A.: Multi-step quasi-Newton methods for optimization, J. Comput. Appl. Math. 50, 305-323 (1994) · Zbl 0807.65062 · doi:10.1016/0377-0427(94)90309-3
[11] Ford, J. A.; Moghrabi, I. A.: Alternating multi-step quasi-Newton methods for unconstrained optimization, J. Comput. Appl. Math. 82, 105-116 (1997) · Zbl 0886.65064 · doi:10.1016/S0377-0427(97)00075-7
[12] Ford, J. A.; Tharmlikit, S.: New implicit updates in multi-step quasi-Newton methods for unconstrained optimization, J. Comput. Appl. Math. 152, 133-146 (2003) · Zbl 1025.65035 · doi:10.1016/S0377-0427(02)00701-X
[13] Andrei, N.: An unconstrained optimization test functions collection, Adv. Model. Optim. 10, 147-161 (2008) · Zbl 1161.90486 · http://www.ici.ro/camo/journal/v10n1.htm
[14] Moré, J. J.; Garbow, B. S.; Hillstrom, K. E.: Testing unconstrained optimization software, ACM Trans. Math. Softw. 7, 17-41 (1981) · Zbl 0454.65049 · doi:10.1145/355934.355936
[15] Dolan, E. D.; Moré, J. J.: Benchmarking optimization software with performance profiles, Math. Program. 91, 201-213 (2002) · Zbl 1049.90004 · doi:10.1007/s101070100263
[16] Dai, Y. H.; Yuan, J. Y.; Yuan, Y.: Modified two-point stepsize gradient methods for unconstrained optimization, Comput. Optim. Appl. 22, 103-109 (2002) · Zbl 1008.90056 · doi:10.1023/A:1014838419611
[17] Dai, Y. H.; Yuan, Y.: Alternate minimization gradient method, IMA J. Numer. Anal. 23, 377-393 (2003) · Zbl 1055.65073 · doi:10.1093/imanum/23.3.377