zbMATH — the first resource for mathematics

Subspace methods for large scale nonlinear equations and nonlinear least squares. (English) Zbl 1171.65040
Summary: We study large scale systems of nonlinear equations and nonlinear least squares problems. We present subspace methods for solving these two special optimization problems. The subspace methods are characterized by forcing the next iterate to lie in a low dimensional subspace. The main technique is to construct subproblems in low dimensions, so that the computational cost of each iteration is reduced compared with standard approaches. The subspace approach offers a possible way to handle large scale optimization problems, which are now attracting more and more attention. In fact, quite a few well-known techniques can be viewed as subspace methods, such as the conjugate gradient method, the limited memory quasi-Newton method, the projected gradient method, and the null space method.
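To illustrate the idea of restricting each step to a low dimensional subspace, the following is a minimal sketch (not the authors' algorithm) of a subspace Gauss-Newton iteration for min ½‖r(x)‖²: each step is confined to the span of the current gradient and the previous step, so only a tiny reduced least squares subproblem is solved per iteration. The function name and details are illustrative assumptions.

```python
import numpy as np

def subspace_gn(r, J, x0, iters=50, tol=1e-10):
    """Illustrative subspace Gauss-Newton sketch for min 0.5*||r(x)||^2.

    Each step is restricted to the (at most 2-dimensional) subspace
    spanned by the current gradient and the previous step, so the
    reduced subproblem has only one or two unknowns per iteration.
    """
    x = np.asarray(x0, dtype=float)
    s_prev = None
    for _ in range(iters):
        Jx, rx = J(x), r(x)
        g = Jx.T @ rx                       # gradient of 0.5*||r||^2
        if np.linalg.norm(g) < tol:
            break
        # Columns spanning the low dimensional subspace.
        cols = [g] if s_prev is None else [g, s_prev]
        S = np.column_stack(cols)
        # Reduced Gauss-Newton subproblem: min_z ||rx + Jx @ S @ z||^2.
        z, *_ = np.linalg.lstsq(Jx @ S, -rx, rcond=None)
        s = S @ z                           # lift back to full space
        x = x + s
        s_prev = s
    return x
```

Since the reduced subproblem involves only the matrix `Jx @ S` with two columns, its cost per iteration is essentially two Jacobian-vector products plus a 2-variable least squares solve, which is the kind of saving the summary describes. A practical method would add a trust region or line search for globalization.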

65H10 Numerical computation of solutions to systems of equations
65K05 Numerical mathematical programming methods
90C20 Quadratic programming
90C51 Interior-point methods
90C53 Methods of quasi-Newton type
Filtrane; KELLEY; L-BFGS
Full Text: DOI
[1] Bouaricha A, Moré JJ (1997) Impact of partial separability on large-scale optimization. Comput Optim Appl 7(1):27–40 · Zbl 0883.90107 · doi:10.1023/A:1008628114432
[2] Conn AR, Gould NIM, Toint PhL (2000) Trust-region methods. MPS-SIAM series on optimization. SIAM, Philadelphia
[3] Dennis JE, Schnabel RB (1993) Numerical methods for unconstrained optimization and nonlinear equations. SIAM, Philadelphia
[4] Fletcher R (1987) Practical methods of optimization, 2nd edn. Wiley, New York · Zbl 0905.65002
[5] Gill PE, Leonard MW (2001) Reduced-Hessian quasi-Newton methods for unconstrained optimization. SIAM J Optim 12:209–237 · Zbl 1003.65068 · doi:10.1137/S1052623400307950
[6] Golub GH, Van Loan ChF (1996) Matrix computations, 3rd edn. Johns Hopkins University Press, Baltimore
[7] Gould NIM, Toint PhL (2007) FILTRANE, a Fortran 95 filter-trust-region package for solving nonlinear least-squares and nonlinear feasibility problems. ACM Trans Math Softw 33(1):3–25 · Zbl 05458460 · doi:10.1145/1206040.1206043
[8] Gould NIM, Orban D, Toint PhL (2005) Numerical methods for large-scale nonlinear optimization. Acta Numer 14:299–361 · Zbl 1119.65337
[9] Kelley CT (1995) Iterative methods for linear and nonlinear equations. SIAM, Philadelphia
[10] Liu DC, Nocedal J (1989) On the limited memory BFGS method for large scale optimization. Math Program 45:503–528 · Zbl 0696.90048 · doi:10.1007/BF01589116
[11] Mizutani E, Demmel JW (2003) On structure-exploiting trust-region regularized nonlinear least squares algorithms for neural-network learning. Neural Netw 16(5–6):745–753 · Zbl 02022348 · doi:10.1016/S0893-6080(03)00085-6
[12] Nocedal J, Wright SJ (1999) Numerical optimization. Springer, New York · Zbl 0930.65067
[13] Ortega JM, Rheinboldt WC (1970) Iterative solution of nonlinear equations in several variables. Academic Press, New York
[14] Powell MJD (1970) A new algorithm for unconstrained optimization. In: Rosen JB, Mangasarian OL, Ritter K (eds) Nonlinear programming. Academic Press, New York, pp 31–66
[15] Saad Y (2003) Iterative methods for sparse linear systems, 2nd edn. SIAM, Philadelphia · Zbl 1031.65046
[16] Steihaug T (1983) The conjugate gradient method and trust regions in large scale optimization. SIAM J Numer Anal 20:626–637 · Zbl 0518.65042 · doi:10.1137/0720042
[17] Stoer J, Yuan Y (1995) A subspace study on conjugate gradient algorithms. Z Angew Math Mech 75:69–77 · Zbl 0823.65061 · doi:10.1002/zamm.19950751207
[18] Sun WY, Yuan Y (2006) Optimization theory and methods: nonlinear programming. Springer optimization and its applications, vol 1. Springer, New York
[19] Toint PhL (1981) Towards an efficient sparsity exploiting Newton method for minimization. In: Duff I (ed) Sparse matrices and their uses. Academic Press, New York, pp 57–88
[20] Toint PhL (1987) On large scale nonlinear least squares calculations. SIAM J Sci Stat Comput 8(3):416–435 · Zbl 0616.65014 · doi:10.1137/0908042
[21] Vlček J, Lukšan L (2002) New variable metric methods for unconstrained minimization covering the large-scale case, Technical report No. V876, Institute of Computer Science, Academy of Sciences of the Czech Republic, October 2002
[22] Wang ZH, Yuan YX (2006) A subspace implementation of quasi-Newton trust region methods for unconstrained optimization. Numer Math 104:241–269 · Zbl 1101.65066 · doi:10.1007/s00211-006-0021-6
[23] Wang ZH, Wen ZW, Yuan YX (2004) A subspace trust region method for large scale unconstrained optimization. In: Yuan YX (ed) Numerical linear algebra and optimization. Science Press, Marrickville, pp 264–274
[24] Yuan YX (1998) Trust region algorithms for nonlinear equations. Information 1:7–20 · Zbl 1006.65048
[25] Yuan YX (2000a) A review of trust region algorithms for optimization. In: Ball JM, Hunt JCR (eds) ICIAM 99: proceedings of the fourth international congress on industrial and applied mathematics. Oxford University Press, Oxford, pp 271–282 · Zbl 0992.65061
[26] Yuan YX (2000b) On the truncated conjugate gradient method. Math Program 86:561–571 · Zbl 0955.65039 · doi:10.1007/s101070050012
[27] Yuan YX (2007) Subspace techniques for nonlinear optimization. In: Jeltsch R, Li DQ, Sloan IH (eds) Some topics in industrial and applied mathematics. Series in contemporary applied mathematics, vol 8. Higher Education Press, Beijing, pp 206–218
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.