zbMATH — the first resource for mathematics

Pseudo expected improvement criterion for parallel EGO algorithm. (English) Zbl 1377.90069
Summary: The efficient global optimization (EGO) algorithm is well known for its high efficiency in solving computationally expensive optimization problems. However, the expected improvement (EI) criterion used to select candidate points in the EGO process produces only one design point per optimization cycle, which wastes time when parallel computing is available. In this work, a new criterion called pseudo expected improvement (PEI) is proposed for developing parallel EGO algorithms. In each cycle, the first updating point is selected by the initial EI function. After that, the PEI function is built to approximate the real updated EI function by multiplying the initial EI function by an influence function of the updating point. The influence function is designed to simulate the impact that the updating point will have on the EI function, and depends only on the position of the updating point (not on its function value). Therefore, the next updating point can be identified by maximizing the PEI function without evaluating the first updating point. As the sequential process continues, a desired number of updating points can be selected by the PEI criterion within one optimization cycle. The efficiency of the proposed PEI criterion is validated on six benchmarks with dimensions from 2 to 6. The results show that the proposed PEI algorithm performs significantly better than the standard EGO algorithm, and achieves significant improvements on five of the six test problems compared with a state-of-the-art parallel EGO algorithm. Furthermore, additional experiments show that the convergence of the proposed algorithm is affected significantly when the global maximum of the PEI function is not found. It is therefore recommended to use as many evaluations as one can afford to find the global maximum of the PEI function.
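The batch-selection idea described above can be sketched in a few lines. This is a minimal illustrative sketch, not the authors' implementation: the Gaussian-correlation influence function, its length-scale `theta`, the toy posterior `mu`/`sigma`, and the grid search over candidates are all assumptions made here for readability; a real EGO loop would use a fitted Kriging model and a global optimizer for each inner maximization.

```python
import math

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_improvement(mu, sigma, y_best):
    # Standard EI for minimization, given the Kriging posterior mean/std at a point.
    if sigma <= 0.0:
        return 0.0
    z = (y_best - mu) / sigma
    return (y_best - mu) * norm_cdf(z) + sigma * norm_pdf(z)

def influence(x, x_sel, theta=10.0):
    # One minus a Gaussian correlation (an assumed form): ~0 at the already
    # selected point, ~1 far from it, and it needs only the point's position,
    # never its (unevaluated) function value.
    return 1.0 - math.exp(-theta * (x - x_sel) ** 2)

def select_batch(candidates, mu, sigma, y_best, q):
    # Pick q updating points in one cycle: the first maximizes plain EI,
    # each later one maximizes the pseudo EI = EI times the influence
    # functions of all points selected so far in this cycle.
    selected = []
    for _ in range(q):
        best_x, best_val = None, -1.0
        for x in candidates:
            val = expected_improvement(mu(x), sigma(x), y_best)
            for xs in selected:
                val *= influence(x, xs)  # pseudo EI screens out chosen regions
            if val > best_val:
                best_x, best_val = x, val
        selected.append(best_x)
    return selected

if __name__ == "__main__":
    # Toy 1-D posterior: mean minimized at x = 0.3, constant uncertainty.
    candidates = [i / 50 for i in range(51)]
    batch = select_batch(candidates, mu=lambda x: (x - 0.3) ** 2,
                         sigma=lambda x: 0.1, y_best=0.05, q=3)
    print(batch)  # three distinct points; later ones are pushed away from x = 0.3
```

Because the influence function vanishes at each selected point, the pseudo EI there drops to zero and the next maximization is forced elsewhere, which is what lets the q points be chosen without evaluating any of them.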

90C26 Nonconvex programming, global optimization
90C59 Approximation methods and heuristics in mathematical programming
Full Text: DOI
[1] Rios, LM; Sahinidis, NV, Derivative-free optimization: a review of algorithms and comparison of software implementations, J. Glob. Optim., 56, 1247-1293, (2013) · Zbl 1272.90116
[2] Boukouvala, F; Misener, R; Floudas, CA, Global optimization advances in mixed-integer nonlinear programming, MINLP, and constrained derivative-free optimization, CDFO, Eur. J. Oper. Res., 252, 701-727, (2016) · Zbl 1346.90677
[3] Wang, GG; Shan, S, Review of metamodeling techniques in support of engineering design optimization, J. Mech. Des., 129, 370-380, (2007)
[4] Viana, FAC; Simpson, TW; Balabanov, V; Toropov, V, Metamodeling in multidisciplinary design optimization: how far have we really come?, AIAA J., 52, 670-690, (2014)
[5] Jones, DR; Schonlau, M; Welch, WJ, Efficient global optimization of expensive black-box functions, J. Glob. Optim., 13, 455-492, (1998) · Zbl 0917.90270
[6] Jones, DR, A taxonomy of global optimization methods based on response surfaces, J. Glob. Optim., 21, 345-383, (2001) · Zbl 1172.90492
[7] Sasena, MJ; Papalambros, P; Goovaerts, P, Exploration of metamodeling sampling criteria for constrained global optimization, Eng. Optim., 34, 263-278, (2002)
[8] Parr, JM; Keane, AJ; Forrester, AIJ; Holden, CME, Infill sampling criteria for surrogate-based optimization with constraint handling, Eng. Optim., 44, 1147-1166, (2012) · Zbl 1250.90089
[9] Huang, D; Allen, TT; Notz, WI; Zeng, N, Global optimization of stochastic black-box systems via sequential Kriging meta-models, J. Glob. Optim., 34, 441-466, (2006) · Zbl 1098.90097
[10] Forrester, AI; Keane, AJ; Bressloff, NW, Design and analysis of noisy computer experiments, AIAA J, 44, 2331-2339, (2006)
[11] Knowles, J, ParEGO: a hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems, IEEE Trans. Evolut. Comput., 10, 50-66, (2006)
[12] Couckuyt, I; Deschrijver, D; Dhaene, T, Fast calculation of multiobjective probability of improvement and expected improvement criteria for Pareto optimization, J. Glob. Optim., 60, 575-594, (2014) · Zbl 1303.90093
[13] Ginsbourger, D; Riche, R; Carraro, L; Tenne, Y (ed.); Goh, C-K (ed.), Kriging is well-suited to parallelize optimization, No. 2, 131-162, (2010), Berlin
[14] Sóbester, A; Leary, SJ; Keane, AJ, A parallel updating scheme for approximating and optimizing high fidelity computer simulations, Struct. Multidiscipl. Optim., 27, 371-383, (2004)
[15] Feng, ZW; Zhang, QB; Zhang, QF; Tang, QG; Yang, T; Ma, Y, A multiobjective optimization based framework to balance the global exploration and local exploitation in expensive optimization, J. Glob. Optim., 61, 677-694, (2015) · Zbl 1323.90045
[16] Forrester, A., Sóbester, A., Keane, A.: Engineering Design via Surrogate Modelling: A Practical Guide. Wiley, New York (2008)
[17] Bischl, B., Wessing, S., Bauer, N., Friedrichs, K., Weihs, C.: MOI-MBO: multiobjective infill for parallel model-based optimization. In: Pardalos, P.M., Resende, M.G.C., Vogiatzis, C., Walteros, J.L. (eds.) Learning and Intelligent Optimization. Lecture Notes in Computer Science, pp. 173-186. Springer, Berlin (2014) · Zbl 1098.90097
[18] Hamza, K; Shalaby, M, A framework for parallelized efficient global optimization with application to vehicle crashworthiness optimization, Eng. Optim., 46, 1200-1221, (2014)
[19] Hutter, F., Hoos, H., Leyton-Brown, K.: Parallel algorithm configuration. In: Hamadi, Y., Schoenauer, M. (eds.) Learning and Intelligent Optimization. Lecture Notes in Computer Science, pp. 55-70. Springer, Berlin (2012)
[20] Viana, FA; Haftka, RT; Watson, LT, Efficient global optimization algorithm assisted by multiple surrogate techniques, J. Glob. Optim., 56, 669-689, (2013) · Zbl 1275.90072
[21] Sacks, J; Welch, WJ; Mitchell, TJ; Wynn, HP, Design and analysis of computer experiments, Stat. Sci., 4, 409-423, (1989) · Zbl 0955.62619
[22] Törn, A., Žilinskas, A.: Global Optimization. Springer, Berlin (1987) · Zbl 0752.90075
[23] Dixon, LCW; Szegő, GP; Dixon, LCW (ed.); Szegő, GP (ed.), The optimization problem: an introduction, (1978), New York
[24] Sasena, M.J.: Flexibility and Efficiency Enhancements for Constrained Global Design Optimization with Kriging Approximations. University of Michigan, Ann Arbor (2002)
[25] Lophaven, S.N., Nielsen, H.B., Søndergaard, J.: DACE—a Matlab Kriging toolbox. Technical Report IMM-TR-2002-12, Technical University of Denmark, Denmark (2002). http://www2.imm.dtu.dk/~hbn/dace/
[26] Viana, F.A.C.: SURROGATES Toolbox User's Guide. Gainesville, FL, USA, version 3.0 edn. (2011). http://sites.google.com/site/felipeacviana/surrogatestoolbox · Zbl 0917.90270
[27] Price, K.V., Storn, R.M., Lampinen, J.A.: Differential Evolution: A Practical Approach to Global Optimization. Springer, Berlin (2005). http://www1.icsi.berkeley.edu/~storn/code.html · Zbl 1186.90004
[28] Barr, RS; Hickman, BL, Reporting computational experiments with parallel algorithms: issues, measures, and experts’ opinions, ORSA J. Comput., 5, 2-18, (1993) · Zbl 0775.65029
[29] Regis, RG; Shoemaker, CA, Parallel radial basis function methods for the global optimization of expensive functions, Eur. J. Oper. Res., 182, 514-535, (2007) · Zbl 1178.90279
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.