zbMATH — the first resource for mathematics

A Bayesian approach to constrained single- and multi-objective optimization. (English) Zbl 1390.90441
Summary: This article addresses the problem of derivative-free (single- or multi-objective) optimization subject to multiple inequality constraints. Both the objective and constraint functions are assumed to be smooth, non-linear and expensive to evaluate. As a consequence, the number of evaluations that can be used to carry out the optimization is very limited, as in complex industrial design optimization problems. The method we propose to overcome this difficulty has its roots in both the Bayesian and the multi-objective optimization literatures. More specifically, an extended domination rule is used to handle objectives and constraints in a unified way, and a corresponding expected hypervolume improvement sampling criterion is proposed. This new criterion is naturally adapted to the search for a feasible point when none is available, and reduces to existing Bayesian sampling criteria – the classical Expected Improvement (EI) criterion and some of its constrained/multi-objective extensions – as soon as at least one feasible point is available. The calculation and optimization of the criterion are performed using sequential Monte Carlo techniques. In particular, an algorithm similar to the subset simulation method, which is well known in the field of structural reliability, is used to estimate the criterion. The method, which we call BMOO (for Bayesian Multi-Objective Optimization), is compared to state-of-the-art algorithms for single- and multi-objective constrained optimization.
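To make the two ingredients of the summary concrete, the sketch below illustrates (i) one common way to extend Pareto domination to constrained problems (a feasible point dominates any infeasible one; infeasible points are compared by constraint violation) and (ii) the classical closed-form EI criterion that the proposed criterion reduces to in the single-objective unconstrained case. This is an illustrative sketch under assumed conventions (minimization, constraints feasible when c ≤ 0), not the authors' BMOO implementation or their exact extended rule.

```python
import math

def extended_dominates(y1, c1, y2, c2):
    """True if point 1 dominates point 2 under an illustrative extended rule:
    a feasible point dominates any infeasible one; two infeasible points are
    compared by constraint violation; two feasible points by Pareto dominance
    on the objectives. Convention assumed: c <= 0 means feasible."""
    v1 = [max(c, 0.0) for c in c1]  # constraint violations of point 1
    v2 = [max(c, 0.0) for c in c2]
    feas1 = all(v == 0.0 for v in v1)
    feas2 = all(v == 0.0 for v in v2)
    if feas1 != feas2:
        return feas1  # feasibility alone decides
    # compare objectives if both feasible, violations if both infeasible
    a, b = (y1, y2) if feas1 else (v1, v2)
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def expected_improvement(mu, sigma, fmin):
    """Closed-form EI for minimization, E[max(fmin - F, 0)] with
    F ~ N(mu, sigma^2), where fmin is the best observed value."""
    if sigma <= 0.0:
        return max(fmin - mu, 0.0)
    z = (fmin - mu) / sigma
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # standard normal PDF
    return (fmin - mu) * Phi + sigma * phi
```

In BMOO the analogue of `expected_improvement` is an expected hypervolume improvement computed over the extended domination order, estimated by sequential Monte Carlo rather than in closed form.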

MSC:
90C26 Nonconvex programming, global optimization
90C29 Multi-objective and goal programming
References:
[1] Andrieu, C; Roberts, GO, The pseudo-marginal approach for efficient Monte Carlo computations, Ann. Stat., 37, 697-725, (2009) · Zbl 1185.60083
[2] Andrieu, C; Thoms, J, A tutorial on adaptive MCMC, Stat. Comput., 18, 343-373, (2008)
[3] Archetti, F; Betrò, B, A probabilistic algorithm for global optimization, CALCOLO, 16, 335-343, (1979) · Zbl 0428.90064
[4] Au, S-K; Beck, JL, Estimation of small failure probabilities in high dimensions by subset simulation, Probab. Eng. Mech, 16, 263-277, (2001)
[5] Bader, J; Zitzler, E, HypE: an algorithm for fast hypervolume-based many-objective optimization, Evolut. Comput., 19, 45-76, (2011)
[6] Bautista, D.C.: A Sequential Design for Approximating the Pareto Front Using the Expected Pareto Improvement Function. PhD thesis, The Ohio State University (2009)
[7] Bect, J; Ginsbourger, D; Li, L; Picheny, V; Vazquez, E, Sequential design of computer experiments for the estimation of a probability of failure, Stat. Comput, 22, 773-793, (2012) · Zbl 1252.62081
[8] Bect, J., Vazquez, E. et al.: STK: a Small (Matlab/Octave) Toolbox for Kriging. Release 2.4 (to appear), (2016). URL http://kriging.sourceforge.net
[9] Benassi, R.: Nouvel Algorithme d’optimisation Bayésien Utilisant une Approche Monte-Carlo séquentielle. PhD thesis, Supélec (2013)
[10] Benassi, R., Bect, J., Vazquez, E.: Bayesian optimization using sequential Monte Carlo. In: Learning and Intelligent Optimization. 6th International Conference, LION 6, Paris, France, 16-20 January 2012, Revised Selected Papers, volume 7219 of Lecture Notes in Computer Science, pp. 339-342. Springer (2012)
[11] Beume, N, S-metric calculation by considering dominated hypervolume as Klee's measure problem, Evolut. Comput., 17, 477-492, (2009)
[12] Binois, M., Picheny, V.: GPareto: Gaussian Processes for Pareto Front Estimation and Optimization (2015). URL http://CRAN.R-project.org/package=GPareto. R package version 1.0.1
[13] Box, GEP; Cox, DR, An analysis of transformations, J. Roy. Stat. Soc. Series B (Methodological), 26, 211-252, (1964) · Zbl 0156.40104
[14] Cérou, F; Del Moral, P; Furon, T; Guyader, A, Sequential Monte Carlo for rare event estimation, Stat. Comput., 22, 795-808, (2012) · Zbl 1252.62083
[15] Chafekar, D., Xuan, J., Rasheed, K.: Constrained multi-objective optimization using steady state genetic algorithms. In: Genetic and Evolutionary Computation-GECCO 2003, pp. 813-824. Springer (2003) · Zbl 1028.68702
[16] Chevalier, C; Bect, J; Ginsbourger, D; Vazquez, E; Picheny, V; Richet, Y, Fast parallel Kriging-based stepwise uncertainty reduction with application to the identification of an excursion set, Technometrics, 56, 455-465, (2014)
[17] Conn, AR; Gould, NIM; Toint, P, A globally convergent augmented Lagrangian algorithm for optimization with general constraints and simple bounds, SIAM J. Numer. Anal., 28, 545-572, (1991) · Zbl 0724.65067
[18] Couckuyt, I; Deschrijver, D; Dhaene, T, Fast calculation of multiobjective probability of improvement and expected improvement criteria for Pareto optimization, J. Glob. Optim., 60, 575-594, (2014) · Zbl 1303.90093
[19] Damianou, A., Lawrence, N.: Deep Gaussian processes. In: Proceedings of the Sixteenth International Conference on Artificial Intelligence and Statistics, pp. 207-215 (2013)
[20] Deb, K; Pratap, A; Agarwal, S; Meyarivan, T, A fast and elitist multiobjective genetic algorithm: NSGA-II, Evolut. Comput., IEEE Trans., 6, 182-197, (2002)
[21] Del Moral, P; Doucet, A; Jasra, A, Sequential Monte Carlo samplers, J. R. Stat. Soc. Ser. B (Stat. Methodol.), 68, 411-436, (2006) · Zbl 1105.62034
[22] Douc, R., Cappé, O.: Comparison of resampling schemes for particle filtering. In: Image and Signal Processing and Analysis, 2005. ISPA 2005. Proceedings of the 4th International Symposium on, pp. 64-69. IEEE (2005)
[23] Emmerich, M.: Single- and Multi-Objective Evolutionary Design Optimization Assisted by Gaussian Random Field Metamodels. PhD thesis, Technical University Dortmund (2005)
[24] Emmerich, M., Klinkenberg, J.W.: The Computation of the Expected Improvement in Dominated Hypervolume of Pareto Front Approximations, Technical report. Leiden University (2008)
[25] Emmerich, M; Giannakoglou, KC; Naujoks, B, Single- and multi-objective evolutionary optimization assisted by Gaussian random field metamodels, IEEE Trans. Evolut. Comput., 10, 421-439, (2006)
[26] Fonseca, CM; Fleming, PJ, Multiobjective optimization and multiple constraint handling with evolutionary algorithms. I. A unified formulation, IEEE Trans. Syst., Man Cybern. Part A Syst. Hum., 28, 26-37, (1998)
[27] Forrester, A.I.J., Sobester, A., Keane, A.J.: Engineering Design via Surrogate Modelling: a Practical Guide. Wiley, Chichester (2008)
[28] Gelbart, M.A.: Constrained Bayesian Optimization and Applications. PhD thesis, Harvard University, Graduate School of Arts and Sciences (2015)
[29] Gelbart, M.A., Snoek, J., Adams, R.P.: Bayesian optimization with unknown constraints. arXiv preprint arXiv:1403.5607 (2014)
[30] Ginsbourger, D., Le Riche, R.: Towards Gaussian process-based optimization with finite time horizon. In: Invited talk at the 6th Autumn Symposium of the “Statistical Modelling” Research Training Group, 21 November (2009)
[31] Gramacy, R.B., Lee, H.: Optimization under unknown constraints. In: Bayesian Statistics 9. Proceedings of the Ninth Valencia International Meeting, pp. 229-256. Oxford University Press (2011)
[32] Gramacy, R.B., Gray, G.A., Le Digabel, S., Lee, H.K.H., Ranjan, P., Wells, G., Wild, S.M.: Modeling an augmented Lagrangian for blackbox constrained optimization. Technometrics, arXiv preprint arXiv:1403.4890 (2015)
[33] Hernández-Lobato, D., Hernández-Lobato, J.M., Shah, A., Adams, R.P.: Predictive entropy search for multi-objective Bayesian optimization. arXiv preprint arXiv:1511.05467 (2015a)
[34] Hernández-Lobato, J.M., Gelbart, M.A., Hoffman, M.W., Adams, R.P., Ghahramani, Z.: Predictive entropy search for Bayesian optimization with unknown constraints. In: Proceedings of the 32nd International Conference on Machine Learning, Lille, France, 2015. JMLR: W&CP volume 37 (2015b)
[35] Hernández-Lobato, J.M., Gelbart, M.A., Adams, R.P., Hoffman, M.W., Ghahramani, Z.: A general framework for constrained Bayesian optimization using information-based search. arXiv preprint arXiv:1511.09422 (2015) · Zbl 1391.90641
[36] Horn, D., Wagner, T., Biermann, D., Weihs, C., Bischl, B.: Model-based multi-objective optimization: taxonomy, multi-point proposal, toolbox and benchmark. In: Evolutionary Multi-Criterion Optimization, pp. 64-78. Springer (2015)
[37] Hupkens, I., Emmerich, M., Deutz, A.: Faster computation of expected hypervolume improvement. arXiv preprint arXiv:1408.7114 (2014)
[38] Jeong, S., Obayashi, S.: Efficient global optimization (EGO) for multi-objective problem and data mining. In: Evolutionary Computation, 2005. The 2005 IEEE Congress on, vol. 3, pp. 2138-2145 (2005)
[39] Jeong, S; Minemura, Y; Obayashi, S, Optimization of combustion chamber for diesel engine using Kriging model, J. Fluid Sci. Technol., 1, 138-146, (2006)
[40] Jin, Y, Surrogate-assisted evolutionary computation: recent advances and future challenges, Swarm Evolut. Comput., 1, 61-70, (2011)
[41] Johnson, S.G.: The NLopt nonlinear-optimization package (version 2.3). URL http://ab-initio.mit.edu/nlopt (2012)
[42] Jones, DR; Schonlau, M; Welch, WJ, Efficient global optimization of expensive black-box functions, J. Glob. Optim., 13, 455-492, (1998) · Zbl 0917.90270
[43] Keane, AJ, Statistical improvement criteria for use in multiobjective design optimization, AIAA J., 44, 879-891, (2006)
[44] Knowles, J, ParEGO: a hybrid algorithm with on-line landscape approximation for expensive multiobjective optimization problems, Evolut. Comput., IEEE Trans., 10, 50-66, (2006)
[45] Knowles, J., Hughes, E.J.: Multiobjective optimization on a budget of 250 evaluations. In: Evolutionary Multi-Criterion Optimization, pp. 176-190. Springer (2005) · Zbl 1109.68615
[46] Kushner, HJ, A new method of locating the maximum point of an arbitrary multipeak curve in the presence of noise, J. Fluids Eng., 86, 97-106, (1964)
[47] Li, L.: Sequential Design of Experiments to Estimate a Probability of Failure. PhD thesis, Supélec (2012)
[48] Li, L., Bect, J., Vazquez, E.: Bayesian subset simulation: a kriging-based subset simulation algorithm for the estimation of small probabilities of failure. In: Proceedings of PSAM 11 & ESREL 2012, 25-29 June 2012, Helsinki. IAPSAM (2012)
[49] Liu, J.S.: Monte Carlo Strategies in Scientific Computing. Springer, New York (2001) · Zbl 0991.65001
[50] Loeppky, J.L., Sacks, J., Welch, W.J.: Choosing the sample size of a computer experiment: A practical guide. Technometrics 51(4) (2009)
[51] Mockus, J.: On Bayesian methods of optimization. In: Towards Global Optimization, pp. 166-181. North-Holland (1975)
[52] Mockus, J.: Bayesian Approach to Global Optimization: Theory and Applications, vol. 37. Kluwer, Dordrecht (1989) · Zbl 0693.49001
[53] Mockus, J., Tiesis, V., Žilinskas, A.: The application of Bayesian methods for seeking the extremum. In: Dixon, L.C.W., Szegö, G.P. (eds.) Towards Global Optimization, vol. 2, pp. 117-129. North-Holland, New York (1978)
[54] Oyama, A; Shimoyama, K; Fujii, K, New constraint-handling method for multi-objective and multi-constraint evolutionary optimization, Trans. Jpn. Soc. Aeronaut. Space Sci., 50, 56-62, (2007)
[55] Parr, JM; Keane, AJ; Forrester, AIJ; Holden, CME, Infill sampling criteria for surrogate-based optimization with constraint handling, Eng. Optim., 44, 1147-1166, (2012) · Zbl 1250.90089
[56] Picheny, V.: A stepwise uncertainty reduction approach to constrained global optimization. In: Proceedings of the 17th International Conference on Artificial Intelligence and Statistics (AISTATS), 2014, Reykjavik, Iceland, vol. 33, pp. 787-795. JMLR: W&CP (2014a)
[57] Picheny, V.: Multiobjective optimization using Gaussian process emulators via stepwise uncertainty reduction. Stat. Comput. (2014b). doi:10.1007/s11222-014-9477-x · Zbl 1331.90102
[58] Ponweiser, W., Wagner, T., Biermann, D., Vincze, M.: Multiobjective optimization on a limited budget of evaluations using model-assisted \({\cal S}\)-metric selection. In: Parallel Problem Solving from Nature (PPSN X), vol. 5199 of Lecture Notes in Computer Science, pp. 784-794. Springer (2008)
[59] Powell, M.J.D.: A direct search optimization method that models the objective and constraint functions by linear interpolation. In: Advances in Optimization and Numerical Analysis, pp. 51-67. Springer (1994) · Zbl 0826.90108
[60] Ray, T; Tai, K; Seow, KC, Multiobjective design optimization by an evolutionary algorithm, Eng. Optim., 33, 399-424, (2001)
[61] Regis, RG, Constrained optimization by radial basis function interpolation for high-dimensional expensive black-box problems with infeasible initial points, Eng. Optim., 46, 218-243, (2014)
[62] Robert, C., Casella, G.: Monte Carlo Statistical Methods, 2nd edn. Springer, New York (2004) · Zbl 1096.62003
[63] Roberts, GO; Rosenthal, JS, Examples of adaptive MCMC, J. Comput. Graph. Stat., 18, 349-367, (2009)
[64] Santner, T.J., Williams, B.J., Notz, W.: The Design and Analysis of Computer Experiments. Springer, New York (2003) · Zbl 1041.62068
[65] Sasena, M.J.: Flexibility and Efficiency Enhancements for Constrained Global Design Optimization with Kriging Approximations. PhD thesis, University of Michigan (2002)
[66] Sasena, MJ; Papalambros, P; Goovaerts, P, Exploration of metamodeling sampling criteria for constrained global optimization, Eng. Optim., 34, 263-278, (2002)
[67] Schonlau, M., Welch, W.J., Jones, D.R.: Global versus local search in constrained optimization of computer models. In: New Developments and Applications in Experimental Design: Selected Proceedings of a 1997 Joint AMS-IMS-SIAM Summer Conference, vol. 34 of IMS Lecture Notes-Monographs Series, pp. 11-25. Institute of Mathematical Statistics (1998)
[68] Shimoyama, K; Sato, K; Jeong, S; Obayashi, S, Updating Kriging surrogate models based on the hypervolume indicator in multi-objective optimization, J. Mech. Des., 135, 094503, (2013)
[69] Snelson, E; Rasmussen, CE; Ghahramani, Z, Warped Gaussian processes, Adv. Neural Inf. Process. Syst., 16, 337-344, (2004)
[70] Stein, M.L.: Interpolation of Spatial Data: Some Theory for Kriging. Springer, New York (1999) · Zbl 0924.62100
[71] Svenson, J.D., Santner, T.J.: Multiobjective optimization of expensive black-box functions via expected maximin improvement. Technical report, The Ohio State University, Columbus, OH (2010)
[72] Toal, DJJ; Keane, AJ, Non-stationary Kriging for design optimization, Eng. Optim., 44, 741-765, (2012)
[73] Vazquez, E., Bect, J.: A new integral loss function for Bayesian optimization. arXiv preprint arXiv:1408.4622, (2014)
[74] Villemonteix, J; Vazquez, E; Walter, E, An informational approach to the global optimization of expensive-to-evaluate functions, J. Glob. Optim., 44, 509-534, (2009) · Zbl 1180.90253
[75] Wagner, T., Emmerich, M., Deutz, A., Ponweiser, W.: On expected-improvement criteria for model-based multi-objective optimization. In: Parallel Problem Solving from Nature, PPSN XI. 11th International Conference, Krakov, Poland, 11-15 September 2010, Proceedings, Part I, vol. 6238 of Lecture Notes in Computer Science, pp. 718-727. Springer (2010)
[76] Williams, B.J., Santner, T.J., Notz, W.I., Lehman, J.S.: Sequential design of computer experiments for constrained optimization. In: Kneib, T., Tutz, G. (eds.) Statistical Modelling and Regression Structures, pp. 449-472. Physica-Verlag, Heidelberg (2010)
[77] Williams, C.K.I., Rasmussen, C.: Gaussian Processes for Machine Learning. The MIT Press, Cambridge (2006) · Zbl 1177.68165
[78] Zhang, Q; Liu, W; Tsang, E; Virginas, B, Expensive multiobjective optimization by MOEA/D with Gaussian process model, Evolut. Comput., IEEE Trans., 14, 456-474, (2010)
[79] Zitzler, E., Laumanns, M., Thiele, L.: SPEA2: Improving the Strength Pareto Evolutionary Algorithm for Multiobjective Optimization. In: Giannakoglou, K.C., et al. (eds.) Evolutionary Methods for Design, Optimisation and Control with Application to Industrial Problems (EUROGEN 2001), pp. 95-100. International Center for Numerical Methods in Engineering (CIMNE) (2002)
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.