zbMATH — the first resource for mathematics

On the limited memory BFGS method for large scale optimization. (English) Zbl 0696.90048
The authors investigate the numerical performance of several optimization algorithms for solving smooth, unconstrained, and in particular large-scale problems. They compare their own code, based on limited-memory BFGS updates, with the method of A. Buckley and A. LeNir [ACM Trans. Math. Software 11, 103-119 (1985; Zbl 0562.65043)], which requires additional conjugate gradient steps, with two conjugate gradient methods, and with the partitioned quasi-Newton method of A. Griewank and Ph. L. Toint [in: Nonlinear optimization, NATO Conf. Ser., Ser. II, 301-312 (1982; Zbl 0563.90085); Math. Program. 28, 25-49 (1984; Zbl 0561.65045) and Numer. Math. 39, 429-448 (1982; Zbl 0505.65018)]. The results are based on 16 test problems whose dimension varies between 50 and 1000. The numerical tests show that the authors' method is faster than that of Buckley and LeNir in terms of both the number of function evaluations and execution time. The method also outperforms the conjugate gradient methods and is competitive with the partitioned quasi-Newton method on dense problems, but inferior on partitioned problems. Moreover, scaling effects are evaluated and a convergence analysis for convex problems is presented.
Reviewer: K. Schittkowski
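The limited-memory BFGS updates discussed above are conventionally applied through the two-loop recursion of Nocedal [21], which computes a search direction from the m most recent correction pairs without ever forming a Hessian approximation. The following is a minimal sketch of that recursion, not the authors' code: the function names, the memory size m = 5, the Armijo line-search parameters, and the small diagonal quadratic test problem are all illustrative assumptions.

```python
import numpy as np

def lbfgs_direction(g, s_hist, y_hist):
    """Two-loop recursion: return the direction -H*g built from the
    stored correction pairs (s_k, y_k), without forming H explicitly."""
    q = g.copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_hist, y_hist)]
    alphas = []
    # First loop: newest pair back to oldest.
    for s, y, rho in zip(reversed(s_hist), reversed(y_hist), reversed(rhos)):
        a = rho * np.dot(s, q)
        alphas.append(a)
        q -= a * y
    # Scale by gamma = s'y / y'y, a common choice of initial Hessian.
    if s_hist:
        q *= np.dot(s_hist[-1], y_hist[-1]) / np.dot(y_hist[-1], y_hist[-1])
    # Second loop: oldest pair forward to newest.
    for s, y, rho, a in zip(s_hist, y_hist, rhos, reversed(alphas)):
        b = rho * np.dot(y, q)
        q += (a - b) * s
    return -q

# Illustrative driver: minimize the convex quadratic
# f(x) = 0.5 x'Ax - b'x, whose unique minimizer solves Ax = b.
A = np.diag([1.0, 10.0, 100.0])
b = np.array([1.0, 1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b

x, m = np.zeros(3), 5              # start at the origin, keep 5 pairs
s_hist, y_hist = [], []
for _ in range(50):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    d = lbfgs_direction(g, s_hist, y_hist)
    # Backtracking (Armijo) line search on sufficient decrease.
    t = 1.0
    while f(x + t * d) > f(x) + 1e-4 * t * np.dot(g, d):
        t *= 0.5
    x_new = x + t * d
    s_hist.append(x_new - x)
    y_hist.append(grad(x_new) - g)
    if len(s_hist) > m:            # drop the oldest pair
        s_hist.pop(0)
        y_hist.pop(0)
    x = x_new

print(x)  # close to the exact solution A^{-1} b = [1, 0.1, 0.01]
```

Only the pairs (s_k, y_k) are stored, so the cost per iteration is O(mn) in time and memory; this is the property that makes the method attractive for the large-scale problems treated in the paper.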

90C30 Nonlinear programming
90C06 Large-scale problems (mathematical programming)
65K05 Mathematical programming (numerical methods)
90C52 Methods of reduced gradient type
Software: ve08; tn; minpack; L-BFGS
[1] E.M.L. Beale, "Algorithms for very large nonlinear optimization problems," in: M.J.D. Powell, ed., Nonlinear Optimization 1981 (Academic Press, London, 1981) pp. 281–292.
[2] A. Buckley, "A combined conjugate gradient quasi-Newton minimization algorithm," Mathematical Programming 15 (1978) 200–210. · Zbl 0386.90051 · doi:10.1007/BF01609018
[3] A. Buckley, "Update to TOMS Algorithm 630," Rapports Techniques No. 91, Institut National de Recherche en Informatique et en Automatique, Domaine Voluceau, Rocquencourt, B.P. 105 (Le Chesnay, 1987).
[4] A. Buckley and A. LeNir, "QN-like variable storage conjugate gradients," Mathematical Programming 27 (1983) 155–175. · Zbl 0519.65038 · doi:10.1007/BF02591943
[5] A. Buckley and A. LeNir, "BBVSCG – a variable storage algorithm for function minimization," ACM Transactions on Mathematical Software 11/2 (1985) 103–119. · Zbl 0562.65043 · doi:10.1145/214392.214395
[6] R.H. Byrd and J. Nocedal, "A tool for the analysis of quasi-Newton methods with application to unconstrained minimization," SIAM Journal on Numerical Analysis 26 (1989) 727–739. · Zbl 0676.65061 · doi:10.1137/0726042
[7] J.E. Dennis Jr. and R.B. Schnabel, Numerical Methods for Unconstrained Optimization and Nonlinear Equations (Prentice-Hall, 1983).
[8] J.E. Dennis Jr. and R.B. Schnabel, "A view of unconstrained optimization," in: G.L. Nemhauser, A.H.G. Rinnooy Kan and M.J. Todd, eds., Handbooks in Operations Research and Management Science, Vol. 1, Optimization (North-Holland, Amsterdam, 1989) pp. 1–72.
[9] R. Fletcher, Practical Methods of Optimization, Vol. 1, Unconstrained Optimization (Wiley, New York, 1980).
[10] J.C. Gilbert and C. Lemaréchal, "Some numerical experiments with variable storage quasi-Newton algorithms," IIASA Working Paper WP-88, A-2361 (Laxenburg, 1988).
[11] P.E. Gill and W. Murray, "Conjugate-gradient methods for large-scale nonlinear optimization," Technical Report SOL 79-15, Department of Operations Research, Stanford University (Stanford, CA, 1979).
[12] P.E. Gill, W. Murray and M.H. Wright, Practical Optimization (Academic Press, London, 1981).
[13] A. Griewank, "The global convergence of partitioned BFGS on semi-smooth problems with convex decompositions," ANL/MCS-TM-105, Mathematics and Computer Science Division, Argonne National Laboratory (Argonne, IL, 1987).
[14] A. Griewank and Ph.L. Toint, "Partitioned variable metric updates for large structured optimization problems," Numerische Mathematik 39 (1982a) 119–137. · Zbl 0482.65035 · doi:10.1007/BF01399316
[15] A. Griewank and Ph.L. Toint, "Local convergence analysis of partitioned quasi-Newton updates," Numerische Mathematik 39 (1982b) 429–448. · Zbl 0505.65018 · doi:10.1007/BF01407874
[16] A. Griewank and Ph.L. Toint, "Numerical experiments with partially separable optimization problems," in: D.F. Griffiths, ed., Numerical Analysis: Proceedings Dundee 1983, Lecture Notes in Mathematics, Vol. 1066 (Springer, Berlin, 1984) pp. 203–220.
[17] D.C. Liu and J. Nocedal, "Test results of two limited memory methods for large scale optimization," Technical Report NAM 04, Department of Electrical Engineering and Computer Science, Northwestern University (Evanston, IL, 1988).
[18] J.J. Moré, B.S. Garbow and K.E. Hillstrom, "Testing unconstrained optimization software," ACM Transactions on Mathematical Software 7 (1981) 17–41. · Zbl 0454.65049 · doi:10.1145/355934.355936
[19] S.G. Nash, "Preconditioning of truncated-Newton methods," SIAM Journal on Scientific and Statistical Computing 6 (1985) 599–616. · Zbl 0592.65038 · doi:10.1137/0906042
[20] L. Nazareth, "A relationship between the BFGS and conjugate gradient algorithms and its implications for new algorithms," SIAM Journal on Numerical Analysis 16 (1979) 794–800. · Zbl 0424.65030 · doi:10.1137/0716059
[21] J. Nocedal, "Updating quasi-Newton matrices with limited storage," Mathematics of Computation 35 (1980) 773–782. · doi:10.1090/S0025-5718-1980-0572855-7
[22] D.P. O'Leary, "A discrete Newton algorithm for minimizing a function of many variables," Mathematical Programming 23 (1982) 20–33. · Zbl 0477.90055 · doi:10.1007/BF01583777
[23] J.D. Pearson, "Variable metric methods of minimization," Computer Journal 12 (1969) 171–178. · Zbl 0207.17301 · doi:10.1093/comjnl/12.2.171
[24] J.M. Perry, "A class of conjugate gradient algorithms with a two-step variable-metric memory," Discussion Paper 269, Center for Mathematical Studies in Economics and Management Science, Northwestern University (Evanston, IL, 1977).
[25] M.J.D. Powell, "Some global convergence properties of a variable metric algorithm for minimization without exact line search," in: R.W. Cottle and C.E. Lemke, eds., Nonlinear Programming, SIAM-AMS Proceedings IX (SIAM, Philadelphia, PA, 1976).
[26] M.J.D. Powell, "Restart procedures for the conjugate gradient method," Mathematical Programming 12 (1977) 241–254. · Zbl 0396.90072 · doi:10.1007/BF01593790
[27] D.F. Shanno, "On the convergence of a new conjugate gradient algorithm," SIAM Journal on Numerical Analysis 15 (1978a) 1247–1257. · Zbl 0408.90071 · doi:10.1137/0715085
[28] D.F. Shanno, "Conjugate gradient methods with inexact searches," Mathematics of Operations Research 3 (1978b) 244–256. · Zbl 0399.90077 · doi:10.1287/moor.3.3.244
[29] D.F. Shanno and K.H. Phua, "Matrix conditioning and nonlinear optimization," Mathematical Programming 14 (1978) 149–160. · Zbl 0371.90109 · doi:10.1007/BF01588962
[30] D.F. Shanno and K.H. Phua, "Remark on algorithm 500: minimization of unconstrained multivariate functions," ACM Transactions on Mathematical Software 6 (1980) 618–622. · doi:10.1145/355921.355933
[31] T. Steihaug, "The conjugate gradient method and trust regions in large scale optimization," SIAM Journal on Numerical Analysis 20 (1983) 626–637. · Zbl 0518.65042 · doi:10.1137/0720042
[32] Ph.L. Toint, "Some numerical results using a sparse matrix updating formula in unconstrained optimization," Mathematics of Computation 32 (1978) 839–851. · doi:10.1090/S0025-5718-1978-0483452-7
[33] Ph.L. Toint, "Towards an efficient sparsity exploiting Newton method for minimization," in: I.S. Duff, ed., Sparse Matrices and their Uses (Academic Press, New York, 1981) pp. 57–87.
[34] Ph.L. Toint, "Test problems for partially separable optimization and results for the routine PSPMIN," Report Nr 83/4, Department of Mathematics, Facultés Universitaires de Namur (Namur, 1983a).
[35] Ph.L. Toint, "VE08AD, a routine for partially separable optimization with bounded variables," Harwell Subroutine Library, A.E.R.E. (UK, 1983b).
[36] Ph.L. Toint, "A view of nonlinear optimization in a large number of variables," Report Nr 86/16, Department of Mathematics, Facultés Universitaires de Namur (Namur, 1986).