
zbMATH — the first resource for mathematics

A double parameter self-scaling memoryless BFGS method for unconstrained optimization. (English) Zbl 1445.90101
Summary: A double parameter self-scaling memoryless BFGS method for unconstrained optimization is presented. In this method, the first two terms of the self-scaling memoryless BFGS matrix are scaled with one positive parameter, while the third term is scaled with another. The first parameter is determined so as to cluster the eigenvalues of the memoryless BFGS matrix. The second parameter is computed as a preconditioner to the Hessian of the minimizing function, combined with minimization of the conjugacy condition, in order to shift the large eigenvalues of the self-scaling memoryless BFGS matrix to the left. The stepsize is determined by the Wolfe line search conditions. Global convergence of the method is proved under the assumption that the minimizing function is uniformly convex. Preliminary computational experiments on a set of 80 unconstrained optimization test functions show that this algorithm is more efficient and more robust than the self-scaling BFGS updates of Oren and Luenberger and of Oren and Spedicato. With respect to the CPU time metric, CG-DESCENT remains a top performer. Comparisons with L-BFGS show that our algorithm is more efficient.
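To make the construction concrete, the following is a minimal sketch of a search direction computed from a double-parameter scaled memoryless BFGS matrix. The paper's specific formulas for the two scaling parameters (eigenvalue clustering and the preconditioning/conjugacy condition) are not reproduced here; `tau1` and `tau2` are generic positive placeholders, and the three-term structure shown in the docstring is an assumption based on the standard memoryless BFGS update of Shanno [18]. The direction is computed matrix-free in O(n), without forming the matrix explicitly.

```python
import numpy as np

def ssml_bfgs_direction(g, s, y, tau1, tau2):
    """Search direction d = -H g for a double-parameter self-scaling
    memoryless BFGS matrix of the (assumed) form

        H = tau1 * (I - (s y^T + y s^T) / (y^T s)) + tau2 * (s s^T) / (y^T s),

    where g is the current gradient, s = x_{k+1} - x_k, and
    y = g_{k+1} - g_k.  The product H g is evaluated with inner
    products only, so no n-by-n matrix is ever formed."""
    ys = y @ s          # curvature y^T s; positive under the Wolfe conditions
    gs = g @ s
    gy = g @ y
    Hg = tau1 * (g - (gy * s + gs * y) / ys) + tau2 * (gs / ys) * s
    return -Hg

# Tiny demo on made-up data with y^T s > 0 (as the Wolfe conditions ensure):
g = np.array([1.0, 2.0])
s = np.array([0.5, -0.1])
y = np.array([0.4, 0.2])
d = ssml_bfgs_direction(g, s, y, tau1=1.0, tau2=1.0)
print(g @ d)  # negative here, i.e. d is a descent direction for this data
```

With `tau1 = 1` and `tau2 = 1 + y^T y / (y^T s)`, this reduces to the classical unscaled memoryless BFGS direction; the paper's contribution lies in the particular choices of the two parameters.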

MSC:
90C30 Nonlinear programming
65K05 Numerical mathematical programming methods
Software:
L-BFGS; SCALCG; CUTE; CUTEr
References:
[1] Al-Baali, M., Global and superlinear convergence of a class of self-scaling methods with inexact line search, Comput Optim Appl, 9, 191-203 (1998) · Zbl 0904.90127
[2] Al-Baali, M., Numerical experience with a class of self-scaling quasi-Newton algorithms, J Optim Theory Appl, 96, 533-553 (1998) · Zbl 0907.90240
[3] Andrei, N., Accelerated conjugate gradient algorithm with finite difference Hessian/vector product approximation for unconstrained optimization, J Comput Appl Math, 230, 570-582 (2009) · Zbl 1170.65046
[4] Andrei, N., Eigenvalues versus singular values study in conjugate gradient algorithms for large-scale unconstrained optimization, Optim Methods Softw, 32, 534-551 (2017) · Zbl 1368.49057
[5] Andrei, N., An adaptive scaled BFGS method for unconstrained optimization, Numer Algorithms, 77, 2, 413-432 (2018) · Zbl 1383.65059
[6] Andrei N (2018b) A set of unconstrained optimization test problems. Technical Report No. 4/May 2, 2018. Research Institute for Informatics, Bucharest, pp 1-10
[7] Bongartz, I.; Conn, AR; Gould, NIM; Toint, PL, CUTE: constrained and unconstrained testing environments, ACM TOMS, 21, 123-160 (1995) · Zbl 0886.65058
[8] Dolan, ED; Moré, JJ, Benchmarking optimization software with performance profiles, Math Program, 91, 201-213 (2002) · Zbl 1049.90004
[9] Hager, WW; Zhang, H., A new conjugate gradient method with guaranteed descent and an efficient line search, SIAM J Optim, 16, 170-192 (2005) · Zbl 1093.90085
[10] Huang, HY, Unified approach to quadratically convergent algorithms for function minimization, J Optim Theory Appl, 6, 405-423 (1970) · Zbl 0184.20202
[11] Liu, DC; Nocedal, J., On the limited-memory BFGS method for large scale optimization, Math Program, 45, 503-528 (1989) · Zbl 0696.90048
[12] Nocedal, J., Updating quasi-Newton matrices with limited storage, Math Comput, 35, 773-782 (1980) · Zbl 0464.65037
[13] Nocedal, J., Theory of algorithms for unconstrained optimization, Acta Numer, 1, 199-242 (1992) · Zbl 0766.65051
[14] Nocedal, J.; Yuan, YX, Analysis of self-scaling quasi-Newton method, Math Program, 61, 19-37 (1993) · Zbl 0794.90067
[15] Oren, SS; Luenberger, DG, Self-scaling variable metric (SSVM) algorithms, part I: criteria and sufficient conditions for scaling a class of algorithms, Manag Sci, 20, 845-862 (1974) · Zbl 0316.90064
[16] Oren, SS; Spedicato, E., Optimal conditioning of self-scaling variable metric algorithms, Math Program, 10, 70-90 (1976) · Zbl 0342.90045
[17] Perry JM (1977) A class of conjugate gradient algorithms with a two-step variable-metric memory. Discussion Paper 269, Center for Mathematical Studies in Economics and Management Sciences, Northwestern University, Evanston
[18] Shanno, DF, Conjugate gradient methods with inexact searches, Math Oper Res, 3, 244-256 (1978) · Zbl 0399.90077
[19] Shanno, DF; Phua, KH, Matrix conditioning and nonlinear optimization, Math Program, 14, 149-160 (1978) · Zbl 0371.90109
[20] Sun, W.; Yuan, YX, Optimization theory and methods. Nonlinear programming (2006), New York: Springer Science + Business Media, New York
[21] Wolfe, P., Convergence conditions for ascent methods, SIAM Rev, 11, 226-235 (1969) · Zbl 0177.20603
[22] Wolfe, P., Convergence conditions for ascent methods. II: Some corrections, SIAM Rev, 13, 185-188 (1971) · Zbl 0216.26901