
zbMATH — the first resource for mathematics

Efficient multiobjective optimization employing Gaussian processes, spectral sampling and a genetic algorithm. (English) Zbl 1406.90095
J. Glob. Optim. 71, No. 2, 407–438 (2018); correction ibid. 71, No. 2, 439–440 (2018).
This paper proposes an extension of the well-known Thompson sampling method for multi-armed bandit problems to the approximation of Pareto sets in continuous multiobjective optimization using a small number of function evaluations. In the proposed algorithm, each objective function is modelled by an independent Gaussian process, which makes it possible to incorporate a priori knowledge about the objective function, such as continuity or smoothness, through the choice of the covariance function. The hyperparameters are fitted to the available data by maximum likelihood estimation. Given the fitted Gaussian process models, a rule based on Thompson sampling is used to choose the next point at which to evaluate the vector of objectives. The specific implementation of the algorithm presented in Section 6 is illustrated in Section 7 with an application to a simple bi-objective problem.
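The sampling rule described above can be sketched in Python. This is a minimal illustration, not the authors' TS-EMO implementation: the posterior sample of each Gaussian process is drawn via a random-Fourier-feature (spectral) approximation of an RBF kernel, the hyperparameters (`lengthscale`, `sig_f`, `sig_n`) are fixed rather than fitted by maximum likelihood, and the next evaluation point is chosen from a random candidate cloud instead of by a genetic algorithm. All function names and numerical settings here are illustrative.

```python
import numpy as np

def rff_features(X, W, b):
    # Random Fourier feature map approximating an RBF kernel
    return np.sqrt(2.0 / W.shape[0]) * np.cos(X @ W.T + b)

def sample_posterior_function(X, y, lengthscale, sig_f, sig_n, n_feat=200, rng=None):
    # Draw one function from the (approximate) GP posterior as a
    # Bayesian linear model in n_feat random Fourier features.
    rng = np.random.default_rng() if rng is None else rng
    W = rng.normal(0.0, 1.0 / lengthscale, size=(n_feat, X.shape[1]))
    b = rng.uniform(0.0, 2.0 * np.pi, size=n_feat)
    Phi = rff_features(X, W, b)
    A = Phi.T @ Phi / sig_n**2 + np.eye(n_feat) / sig_f**2   # posterior precision
    L = np.linalg.cholesky(A)
    mean = np.linalg.solve(L.T, np.linalg.solve(L, Phi.T @ y / sig_n**2))
    # theta ~ N(mean, A^{-1}); sample via the Cholesky factor of A
    theta = mean + np.linalg.solve(L.T, rng.standard_normal(n_feat))
    return lambda Xq: rff_features(Xq, W, b) @ theta

def pareto_mask(F):
    # True for each row of F that is not dominated (minimization)
    mask = np.ones(F.shape[0], dtype=bool)
    for i in range(F.shape[0]):
        dominated = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        mask[i] = not dominated.any()
    return mask

def thompson_step(X, Y, bounds, rng):
    # One Thompson-sampling iteration: draw a spectral sample of each
    # objective's posterior, then pick a non-dominated candidate.
    samples = [sample_posterior_function(X, Y[:, j], 0.3, 1.0, 1e-2, rng=rng)
               for j in range(Y.shape[1])]
    Xc = rng.uniform(bounds[0], bounds[1], size=(500, X.shape[1]))
    Fc = np.column_stack([g(Xc) for g in samples])
    idx = np.flatnonzero(pareto_mask(Fc))
    return Xc[idx[rng.integers(len(idx))]]

# Toy bi-objective problem (Schaffer-type, one decision variable)
f = lambda x: np.column_stack([x[:, 0]**2, (x[:, 0] - 2.0)**2])
rng = np.random.default_rng(0)
X = rng.uniform(-3.0, 3.0, size=(8, 1))      # initial design
Y = f(X)
for _ in range(20):                          # sequential evaluations
    x_next = thompson_step(X, Y, (-3.0, 3.0), rng)
    X = np.vstack([X, x_next])
    Y = np.vstack([Y, f(x_next[None, :])])
front = Y[pareto_mask(Y)]                    # approximate Pareto front
```

Because each iteration optimizes a single random draw from the posterior rather than the posterior mean, the rule naturally balances exploration and exploitation, which is the property the paper carries over from the bandit setting.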
The paper is hard to read because a number of figures have moved to the wrong positions and carry the wrong captions. A correction has been published in the same journal, see [E. Bradford et al., J. Glob. Optim. 71, No. 2, 439–440 (2018; Zbl 06905225)].

MSC:
90C26 Nonconvex programming, global optimization
90C29 Multi-objective and goal programming
60G15 Gaussian processes
90C59 Approximation methods and heuristics in mathematical programming
References:
[1] Farmani, R; Savic, D; Walters, G, Evolutionary multi-objective optimization in water distribution network design, Eng. Optim., 37, 167-183, (2005)
[2] Censor, Y, Pareto optimality in multiobjective problems, Appl. Math. Optim., 4, 41-59, (1977) · Zbl 0346.90055
[3] Zitzler, E; Thiele, L, Multiobjective evolutionary algorithms: a comparative case study and the strength Pareto approach, IEEE Trans. Evol. Comput., 3, 257-271, (1999)
[4] Ehrgott, M, Multiobjective optimization, AI Mag., 29, 47, (2009) · Zbl 1191.90054
[5] Hwang, C-R, Simulated annealing: theory and applications, Acta Appl. Math., 12, 108-111, (1988)
[6] Davis, L.: Handbook of Genetic Algorithms. Van Nostrand Reinhold, New York (1991)
[7] Kennedy, J; Sammut, C (ed.); Webb, GI (ed.), Particle swarm optimization, 760-766, (2011), Berlin
[8] Sacks, J; Welch, WJ; Mitchell, TJ; Wynn, HP, Design and analysis of computer experiments, Stat. Sci., 4, 409-423, (1989) · Zbl 0955.62619
[9] Simpson, TW; Booker, AJ; Ghosh, D; Giunta, AA; Koch, PN; Yang, R-J, Approximation methods in multidisciplinary analysis and optimization: a panel discussion, Struct. Multidiscipl. Optim., 27, 302-313, (2004)
[10] McKay, MD; Beckman, RJ; Conover, WJ, A comparison of three methods for selecting values of input variables in the analysis of output from a computer code, Technometrics, 42, 55-61, (2000) · Zbl 0415.62011
[11] Shahriari, B; Swersky, K; Wang, Z; Adams, RP; Freitas, N, Taking the human out of the loop: a review of Bayesian optimization, Proc. IEEE, 104, 148-175, (2016)
[12] Jones, DR; Schonlau, M; Welch, WJ, Efficient global optimization of expensive black-box functions, J. Glob. Optim., 13, 455-492, (1998) · Zbl 0917.90270
[13] Močkus, J.: On Bayesian methods for seeking the extremum. In: Optimization Techniques IFIP Technical Conference, pp. 400-404. Springer (1975)
[14] Łaniewski-Wołłk, Ł.: Relative Expected Improvement in Kriging Based Optimization. arXiv preprint arXiv:0908.3321 (2009)
[15] Jones, DR, A taxonomy of global optimization methods based on response surfaces, J. Glob. Optim., 21, 345-383, (2001) · Zbl 1172.90492
[16] Forrester, AI; Keane, AJ, Recent advances in surrogate-based optimization, Prog. Aerosp. Sci., 45, 50-79, (2009)
[17] Voutchkov, I., Keane, A.: Multi-objective optimization using surrogates. In: Tenne, Y., Goh, C.K. (eds.) Computational Intelligence in Optimization, pp. 155-175. Springer, Berlin (2010) · Zbl 1200.90151
[18] Deb, K; Pratap, A; Agarwal, S; Meyarivan, T, A fast and elitist multiobjective genetic algorithm: NSGA-II, IEEE Trans. Evol. Comput., 6, 182-197, (2002)
[19] Knowles, J, ParEGO: a hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems, IEEE Trans. Evol. Comput., 10, 50-66, (2006)
[20] Ponweiser, W., Wagner, T., Biermann, D., Vincze, M.: Multiobjective optimization on a limited budget of evaluations using model-assisted \(\mathcal{S}\)-metric selection. In: International Conference on Parallel Problem Solving from Nature, pp. 784-794. Springer (2008)
[21] Kushner, HJ, A new method of locating the maximum point of an arbitrary multipeak curve in the presence of noise, J. Basic Eng., 86, 97-106, (1964)
[22] Keane, AJ, Statistical improvement criteria for use in multiobjective design optimization, AIAA J., 44, 879-891, (2006)
[23] Emmerich, M., Klinkenberg, J.-W.: The computation of the expected improvement in dominated hypervolume of Pareto front approximations. Rapport technique, Leiden University (2008)
[24] Emmerich, M.: Single-and multi-objective evolutionary design optimization assisted by gaussian random field metamodels. Ph.D. thesis, University of Dortmund (2005)
[25] Emmerich, M., Deutz, A., Klinkenberg, J.-W.: Hypervolume-based expected improvement: monotonicity properties and exact computation. In: IEEE Congress on Evolutionary Computation, pp. 2147-2154. IEEE (2011) · Zbl 0627.62010
[26] Hupkens, I., Emmerich, M., Deutz, A.: Faster Computation of Expected Hypervolume Improvement. arXiv preprint arXiv:1408.7114 (2014)
[27] Emmerich, M., Yang, K., Deutz, A., Wang, H., Fonseca, C.M.: A multicriteria generalization of bayesian global optimization. In: Pardalos, P., Zhigljavsky, A., Žilinskas, J. (eds.) Advances in Stochastic and Deterministic Global Optimization, pp. 229-242. Springer, Berlin (2016) · Zbl 1355.90071
[28] Yang, K., Emmerich, M., Deutz, A., Fonseca, C.M.: Computing 3-D expected hypervolume improvement and related integrals in asymptotically optimal time. In: International Conference on Evolutionary Multi-Criterion Optimization, pp. 685-700. Springer (2017)
[29] Thompson, WR, On the likelihood that one unknown probability exceeds another in view of the evidence of two samples, Biometrika, 25, 285-294, (1933) · JFM 59.1159.03
[30] Zhigljavsky, A., Zilinskas, A.: Stochastic Global Optimization, vol. 9. Springer, Berlin (2007) · Zbl 1136.90003
[31] Žilinskas, A, A statistical model-based algorithm for ‘black-box’ multi-objective optimisation, Int. J. Syst. Sci., 45, 82-93, (2014) · Zbl 1307.93476
[32] Helmdach, D; Yaseneva, P; Heer, PK; Schweidtmann, AM; Lapkin, A, A multi-objective optimisation including results of life cycle assessment in developing bio-renewables-based processes, ChemSusChem, 10, 3632-3643, (2017)
[33] Rasmussen, C.E.: Gaussian processes in machine learning. In: Bousquet, O., VonLuxburg, U., Ratsch, G. (eds.) Advanced Lectures on Machine Learning. Lecture Notes in Artificial Intelligence, vol. 3176, pp. 63-71 (2004) · Zbl 1120.68436
[34] Rasmussen, C.E., Williams, C.K.I.: Gaussian Processes for Machine Learning. MIT Press, Cambridge (2005) · Zbl 1177.68165
[35] Ebden, M.: Gaussian processes: a quick introduction. arXiv preprint arXiv:1505.02965 (2015) · Zbl 1108.62327
[36] Rasmussen, C.E.: Gaussian Processes for Machine Learning. MIT Press, Cambridge (2006) · Zbl 1177.68165
[37] Yang, X; Maciejowski, JM, Fault tolerant control using Gaussian processes and model predictive control, Int. J. Appl. Math. Comput. Sci., 25, 133-148, (2015) · Zbl 1322.93044
[38] Matérn, B.: Spatial Variation, vol. 36. Springer, Berlin (2013) · Zbl 0608.62122
[39] Sundararajan, S; Keerthi, SS, Predictive approaches for choosing hyperparameters in Gaussian processes, Neural Comput., 13, 1103-1118, (2001) · Zbl 1108.62327
[40] Hernández-Lobato, J.M., Hoffman, M.W., Ghahramani, Z.: Predictive entropy search for efficient global optimization of black-box functions. In: Ghahramani, Z., Welling, M., Cortes, C., Lawrence, N.D., Weinberger, K.Q. (eds.) Advances in Neural Information Processing Systems 27, vol. 1, pp. 918-926. MIT Press, Cambridge (2014)
[41] Bochner, S.: Lectures on Fourier Integrals, vol. 42. Princeton University Press, Princeton (1959) · Zbl 0085.31802
[42] Rahimi, A; Recht, B; Platt, JC (ed.); Koller, D (ed.); Singer, Y (ed.); Roweis, ST (ed.), Random features for large-scale kernel machines, 1177-1184, (2007), Cambridge
[43] Lázaro-Gredilla, M; Quiñonero-Candela, J; Rasmussen, CE; Figueiras-Vidal, AR, Sparse spectrum Gaussian process regression, J. Mach. Learn. Res., 11, 1865-1881, (2010) · Zbl 1242.62098
[44] Chapelle, O; Li, L; Shawe-Taylor, J (ed.); Zemel, RS (ed.); Bartlett, PL (ed.); Pereira, F (ed.); Weinberger, KQ (ed.), An empirical evaluation of Thompson sampling, 2249-2257, (2011), Cambridge
[45] Scott, SL, A modern Bayesian look at the multi-armed bandit, Appl. Stoch. Models Bus. Ind., 26, 639-658, (2010)
[46] Shahriari, B., Wang, Z., Hoffman, M.W., Bouchard-Côté, A., de Freitas, N.: An entropy search portfolio for Bayesian optimization. arXiv preprint arXiv:1406.4625 (2014)
[47] Kaufmann, E., Korda, N., Munos, R.: Thompson sampling: an asymptotically optimal finite-time analysis. In: International Conference on Algorithmic Learning Theory, pp. 199-213. Springer (2012) · Zbl 1386.91055
[48] Agrawal, S., Goyal, N.: Thompson sampling for contextual bandits with linear payoffs. In: 30th International Conference on Machine Learning (ICML-13), pp. 127-135 (2013)
[49] Yahyaa, S., Manderick, B.: Thompson sampling for multi-objective multi-armed bandits problem. In: 2015 European Symposium on Artificial Neural Networks, pp. 47-52. Presses universitaires de Louvain (2015)
[50] Stein, M, Large sample properties of simulations using Latin hypercube sampling, Technometrics, 29, 143-151, (1987) · Zbl 0627.62010
[51] Emmerich, M; Fonseca, C; Takahashi, RHC (ed.); Deb, K (ed.); Wanner, EF (ed.); Greco, S (ed.), Computing hypervolume contributions in low dimensions: asymptotically optimal algorithm and complexity results, 121-135, (2011), Berlin
[52] Bader, J., Deb, K., Zitzler, E.: Faster hypervolume-based search using Monte Carlo sampling. In: Multiple Criteria Decision Making for Sustainable Energy and Transportation Systems: Proceedings of the 19th International Conference on Multiple Criteria Decision Making. pp. 313-326. Springer, Berlin (2010) · Zbl 1184.90147
[53] Finkel, D.E.: DIRECT Optimization Algorithm User Guide. http://www4.ncsu.edu/~ctk/Finkel_Direct/DirectUserGuide_pdf.pdf. Accessed 26 Nov 2017 (2003)
[54] Lin, S.: NGPM a NSGA-II Program in Matlab. Codelooker. http://www.codelooker.com/dfilec/6987NGPMv11.4/NGPMmanualv1.4.pdf. Accessed 26 Nov 2017 (2011)
[55] Rasmussen, C.E., Nickisch, H.: The GPML Toolbox Version 3.5. http://mlg.eng.cam.ac.uk/carl/gpml/doc/manual.pdf. Accessed 26 Nov 2017 (2014) · Zbl 1242.68242
[56] Yi, C.: Pareto Set. Matlab File Exchange. https://se.mathworks.com/matlabcentral/fileexchange/15181-pareto-set. Accessed 26 Nov 2017 (2014)
[57] Yi, C.: Hypervolume Indicator. Matlab File Exchange. https://se.mathworks.com/matlabcentral/fileexchange/19651-hypervolume-indicator. Accessed 26 Nov 2017 (2008)
[58] Bradford, E., Schweidtmann, A.M.: TS-EMO. GitHub. https://github.com/Eric-Bradford/TS-EMO. Accessed 29 Nov 2017 (2017)
[59] Schaffer, J.D.: Some Experiments in Machine Learning Using Vector Evaluated Genetic Algorithms. Vanderbilt University, Nashville (1985)
[60] Cristescu, C.: ParEGO_Eigen. GitHub. https://github.com/CristinaCristescu/ParEGO_Eigen/graphs/contributors. Accessed 26 Nov 2017 (2015)
[61] Gorissen, D; Couckuyt, I; Demeester, P; Dhaene, T; Crombecq, K, A surrogate modeling and adaptive sampling toolbox for computer based design, J. Mach. Learn. Res, 11, 2051-2055, (2010)
[62] Cristescu, C., Knowles, J.: Surrogate-Based Multiobjective Optimization: ParEGO Update and Test · Zbl 0917.90270
[63] Zitzler, E; Thiele, L; Laumanns, M; Fonseca, CM; Fonseca, VG, Performance assessment of multiobjective optimizers: an analysis and review, IEEE Trans. Evol. Comput., 7, 117-132, (2003)
[64] Knowles, J; Thiele, L; Zitzler, E, A tutorial on the performance assessment of stochastic multiobjective optimizers, Tik Rep., 214, 327-332, (2006)
[65] Riquelme, N., Von Lücken, C., Baran, B.: Performance metrics in multi-objective optimization. In: Computing Conference (CLEI), 2015 Latin American, pp. 1-11. IEEE (2015)
[66] Ishibuchi, H., Masuda, H., Tanigaki, Y., Nojima, Y.: Modified Distance Calculation in Generational Distance and Inverted Generational Distance. Springer, Cham (2015)
[67] Zhou, A., Jin, Y., Zhang, Q., Sendhoff, B., Tsang, E.: Combining model-based and genetics-based offspring generation for multi-objective optimization using a convergence criterion. In: IEEE Congress on Evolutionary Computation, 2006 (CEC 2006), pp. 892-899. IEEE
[68] Zitzler, E.: Evolutionary Algorithms for Multiobjective Optimization: Methods and Applications. ETH Zurich, Zurich (1999)
[69] Jiang, S; Ong, Y-S; Zhang, J; Feng, L, Consistencies and contradictions of performance metrics in multiobjective optimization, IEEE Trans. Cybern., 44, 2391-2404, (2014)
[70] Fonseca, C.M., Fleming, P.J.: On the performance assessment and comparison of stochastic multiobjective optimizers. In: International Conference on Parallel Problem Solving from Nature, pp. 584-593. Springer (1996)
[71] Knowles, J.: A summary-attainment-surface plotting method for visualizing the performance of stochastic multiobjective optimizers. In: 5th International Conference on Intelligent Systems Design and Applications, 2005, pp. 552-557. IEEE (2005) · Zbl 0346.90055
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.