Data-driven spatial branch-and-bound algorithms for box-constrained simulation-based optimization. (English) Zbl 1484.90091

Summary: The ability to use complex computer simulations in quantitative analysis and decision-making is highly desired in science and engineering, and this desire grows as computational capabilities and first-principles knowledge advance. Due to the complexity of simulation models, directly embedding them in equation-based optimization solvers may be impractical, and data-driven optimization techniques are often needed. In this work, we present a novel data-driven spatial branch-and-bound algorithm for simulation-based optimization problems with box constraints, aiming for consistently globally convergent solutions. The main contribution of this paper is the introduction of the concept of data-driven convex underestimators of data and of surrogate functions, which are employed within a spatial branch-and-bound algorithm. The algorithm is showcased by an illustrative example and is then extensively studied via computational experiments on a large set of benchmark problems.
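The core idea of the summary — bounding a black-box function from below by a convex function built from samples, inside a spatial branch-and-bound loop — can be illustrated with a minimal one-dimensional sketch. This is not the authors' algorithm: it assumes a known curvature bound M with f''(x) ≤ M on the box (the bounded-Hessian setting of reference [7]), under which the secant through two samples, shifted down by (M/2)(x − lo)(hi − x), is a valid convex underestimator of f on each sub-box. The names (`dd_sbb`, `M`) are hypothetical.

```python
import heapq

def dd_sbb(f, lb, ub, M, tol=1e-6, max_iter=100000):
    """One-dimensional spatial branch-and-bound sketch (hypothetical).

    Assumes f''(x) <= M on [lb, ub]; then the secant through two samples,
    shifted down by (M/2)(x - lo)(hi - x), is a convex quadratic that
    underestimates f on the sub-box [lo, hi].
    """
    def lower_bound(lo, hi, flo, fhi):
        # q(x) = secant(x) - (M/2)(x - lo)(hi - x) = a x^2 + b x + c
        s = (fhi - flo) / (hi - lo)                 # secant slope from the two samples
        a = M / 2.0
        b = s - a * (lo + hi)
        c = flo - s * lo + a * lo * hi
        x_star = min(max(-b / (2.0 * a), lo), hi)   # minimizer of q, clipped to the box
        return a * x_star ** 2 + b * x_star + c

    flo, fhi = f(lb), f(ub)
    best_x, best_f = (lb, flo) if flo <= fhi else (ub, fhi)
    heap = [(lower_bound(lb, ub, flo, fhi), lb, ub, flo, fhi)]
    for _ in range(max_iter):
        if not heap:
            break                                   # every sub-box has been fathomed
        lbnd, lo, hi, flo, fhi = heapq.heappop(heap)
        if lbnd > best_f - tol:
            continue                                # fathom: cannot beat the incumbent
        mid = 0.5 * (lo + hi)                       # branch the box at its midpoint
        fmid = f(mid)
        if fmid < best_f:
            best_x, best_f = mid, fmid              # new incumbent from the fresh sample
        for clo, chi, fa, fb in ((lo, mid, flo, fmid), (mid, hi, fmid, fhi)):
            child_lbnd = lower_bound(clo, chi, fa, fb)
            if child_lbnd < best_f - tol:
                heapq.heappush(heap, (child_lbnd, clo, chi, fa, fb))
    return best_x, best_f
```

On the nonconvex test function f(x) = (x² − 1)² over [−2, 2] (where f'' = 12x² − 4 ≤ 44, so M = 44 is valid), the sketch locates one of the two global minimizers x = ±1 with objective value near zero; as the boxes shrink, the quadratic shift vanishes and the lower bounds converge to the true values, which is the mechanism behind the consistent global convergence the summary describes.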


90C26 Nonconvex programming, global optimization
90C57 Polyhedral combinatorics, branch-and-bound, branch-and-cut


[1] Amaran, S.; Sahinidis, NV; Sharda, B.; Bury, SJ, Simulation optimization: a review of algorithms and applications, 4OR, 12, 4, 301-333 (2014) · Zbl 1317.90002
[2] Gosavi, A., Background, Simulation-Based Optimization: Parametric Optimization Techniques and Reinforcement Learning, 1-12 (2015), Springer US, Boston, MA · Zbl 1321.90004
[3] Caballero, JA; Grossmann, IE, An algorithm for the use of surrogate models in modular flowsheet optimization, AIChE J., 54, 10, 2633-2650 (2008)
[4] Henao, CA; Maravelias, CT, Surrogate-based superstructure optimization framework, AIChE J., 57, 5, 1216-1232 (2011)
[5] Bhosekar, A.; Ierapetritou, M., Advances in surrogate based modeling, feasibility analysis, and optimization: a review, Comput. Chem. Eng., 108, 250-267 (2018)
[6] McBride, K.; Sundmacher, K., Overview of surrogate modeling in chemical process engineering, Chem. Ing. Tec., 91, 3, 228-239 (2019)
[7] Bajaj, I.; Hasan, MMF, Deterministic global derivative-free optimization of black-box problems with bounded Hessian, Optim. Lett. (2019) · Zbl 1439.90077
[8] Kieslich, CA; Boukouvala, F.; Floudas, CA, Optimization of black-box problems using Smolyak grids and polynomial approximations, J. Global Optim. (2018) · Zbl 1405.90107
[9] Cozad, A.; Sahinidis, NV; Miller, DC, Learning surrogate models for simulation-based optimization, AIChE J., 60, 6, 2211-2227 (2014)
[10] Schweidtmann, AM; Mitsos, A., Deterministic global optimization with artificial neural networks embedded, J. Optim. Theory Appl., 180, 3, 925-948 (2019) · Zbl 1407.90263
[11] Garud, SS; Mariappan, N.; Karimi, IA, Surrogate-based black-box optimisation via domain exploration and smart placement, Comput. Chem. Eng., 130, 106567 (2019)
[12] Nelder, JA; Mead, R., A simplex method for function minimization, Comput. J., 7, 4, 308-313 (1965) · Zbl 0229.65053
[13] Torczon, V., On the convergence of pattern search algorithms, SIAM J. Optim., 7, 1, 1-25 (1997) · Zbl 0884.65053
[14] Lewis, RM; Torczon, V., Pattern search algorithms for bound constrained minimization, SIAM J. Optim., 9, 4, 1082-1099 (1999) · Zbl 1031.90047
[15] Audet, C.; Dennis, JE, Mesh adaptive direct search algorithms for constrained optimization, SIAM J. Optim., 17, 1, 188-217 (2006) · Zbl 1112.90078
[16] Jones, DR; Floudas, CA; Pardalos, PM, DIRECT global optimization algorithm, Encyclopedia of Optimization, 725-735 (2009), Springer US, Boston, MA
[17] Jones, DR; Perttunen, CD; Stuckman, BE, Lipschitzian optimization without the Lipschitz constant, J. Optim. Theory Appl., 79, 1, 157-181 (1993) · Zbl 0796.49032
[18] Huyer, W.; Neumaier, A., Global optimization by multilevel coordinate search, J. Global Optim., 14, 4, 331-355 (1999) · Zbl 0956.90045
[19] Kirkpatrick, S.; Gelatt, CD; Vecchi, MP, Optimization by simulated annealing, Science, 220, 4598, 671-680 (1983) · Zbl 1225.90162
[20] Reeves, CR, Feature article-genetic algorithms for the operations researcher, INFORMS J. Comput., 9, 3, 231-250 (1997) · Zbl 0893.90145
[21] Whitley, D., A genetic algorithm tutorial, Stat. Comput., 4, 2, 65-85 (1994)
[22] Kennedy, J., Eberhart, R.: Particle swarm optimization. In: Proceedings of ICNN’95-International Conference on Neural Networks, vol. 1944, pp. 1942-1948, 27 November-1 December 1995
[23] Rios, LM; Sahinidis, NV, Derivative-free optimization: a review of algorithms and comparison of software implementations, J. Global Optim., 56, 3, 1247-1293 (2013) · Zbl 1272.90116
[24] Conn, A.R., Scheinberg, K., Vicente, L.N.: Introduction to Derivative-Free Optimization. Society for Industrial and Applied Mathematics (SIAM), Philadelphia, PA (2009) · Zbl 1163.49001
[25] Boukouvala, F.; Misener, R.; Floudas, CA, Global optimization advances in mixed-integer nonlinear programming, MINLP, and constrained derivative-free optimization CDFO, Eur. J. Oper. Res., 252, 3, 701-727 (2016) · Zbl 1346.90677
[26] Larson, J.; Menickelly, M.; Wild, SM, Derivative-free optimization methods, Acta Numer, 28, 287-404 (2019) · Zbl 1461.65169
[27] Audet, C., Hare, W.: The beginnings of DFO algorithms. In: Derivative-Free and Blackbox Optimization. pp. 33-54. Springer, Cham (2017)
[28] Cox, D.D., John, S.: A statistical method for global optimization. In: [Proceedings] 1992 IEEE International Conference on Systems, Man, and Cybernetics, vol. 1242, pp. 1241-1246, 18-21 October 1992
[29] Jones, DR; Schonlau, M.; Welch, WJ, Efficient global optimization of expensive black-box functions, J. Global Optim., 13, 4, 455-492 (1998) · Zbl 0917.90270
[30] Huyer, W.; Neumaier, A., SNOBFIT-stable noisy optimization by branch and fit, ACM Trans. Math. Softw., 35, 2, 1-25 (2008)
[31] Thebelt, A., Kronqvist, J., Mistry, M., Lee, R.M., Sudermann-Merx, N., Misener, R.: ENTMOOT: a framework for optimization over ensemble tree models. arXiv e-prints (2020).
[32] Boukouvala, F.; Floudas, CA, ARGONAUT: algorithms for global optimization of constrained grey-box computational problems, Optim. Lett., 11, 5, 895-913 (2017) · Zbl 1373.90113
[33] Müller, J.; Park, J.; Sahu, R.; Varadharajan, C.; Arora, B.; Faybishenko, B.; Agarwal, D., Surrogate optimization of deep neural networks for groundwater predictions, J. Global Optim. (2020) · Zbl 1471.86009
[34] Forrester, AIJ; Keane, AJ, Recent advances in surrogate-based optimization, Prog. Aerosp. Sci., 45, 1, 50-79 (2009)
[35] Barton, R.R., Meckesheimer, M.: Chapter 18 metamodel-based simulation optimization. In: Henderson, S.G., Nelson, B.L. (eds.) Handbooks in Operations Research and Management Science, vol. 13. pp. 535-574. Elsevier (2006)
[36] Eason, J.; Cremaschi, S., Adaptive sequential sampling for surrogate model generation with artificial neural networks, Comput. Chem. Eng., 68, 220-232 (2014)
[37] Tawarmalani, M.; Sahinidis, NV, A polyhedral branch-and-cut approach to global optimization, Math. Program., 103, 2, 225-249 (2005) · Zbl 1099.90047
[38] McKay, MD; Beckman, RJ; Conover, WJ, Comparison of three methods for selecting values of input variables in the analysis of output from a computer code, Technometrics, 21, 2, 239-245 (1979) · Zbl 0415.62011
[39] Hüllen, G.; Zhai, J.; Kim, SH; Sinha, A.; Realff, MJ; Boukouvala, F., Managing uncertainty in data-driven simulation-based optimization, Comput. Chem. Eng. (2019)
[40] Misener, R.; Floudas, CA, ANTIGONE: algorithms for continuous/integer global optimization of nonlinear equations, J. Global Optim., 59, 2, 503-526 (2014) · Zbl 1301.90063
[41] Androulakis, IP; Maranas, CD; Floudas, CA, αBB: A global optimization method for general constrained nonconvex problems, J. Global Optim., 7, 4, 337-363 (1995) · Zbl 0846.90087
[42] Tawarmalani, M.; Sahinidis, NV, Global optimization of mixed-integer nonlinear programs: a theoretical and computational study, Math. Program., 99, 3, 563-591 (2004) · Zbl 1062.90041
[43] Azar, M.G., Dyer, E.L., Körding, K.P.: Convex relaxation regression: black-box optimization of smooth functions by learning their convex envelopes. In: Proceedings of the Thirty-Second Conference on Uncertainty in Artificial Intelligence, Jersey City, New Jersey, USA
[44] Xu, WL; Nelson, BL, Empirical stochastic branch-and-bound for optimization via simulation, IIE Trans., 45, 7, 685-698 (2013)
[45] Sergeyev, YD; Kvasov, DE, A deterministic global optimization using smooth diagonal auxiliary functions, Commun. Nonlinear Sci. Numer. Simul., 21, 1, 99-111 (2015) · Zbl 1329.90112
[46] McCormick, GP, Computability of global solutions to factorable nonconvex programs: part I—convex underestimating problems, Math. Program., 10, 1, 147-175 (1976) · Zbl 0349.90100
[47] Duran, MA; Grossmann, IE, An outer-approximation algorithm for a class of mixed-integer nonlinear programs, Math. Program., 36, 3, 307-339 (1986) · Zbl 0619.90052
[48] Vapnik, V., The Nature of Statistical Learning Theory (1999), New York: Springer, New York · Zbl 0928.68093
[49] Smola, AJ; Schölkopf, B., A tutorial on support vector regression, Stat. Comput., 14, 3, 199-222 (2004)
[50] Oliphant, T.E.: A Guide to NumPy. Trelgol Publishing, USA (2006)
[51] van der Walt, S.; Colbert, SC; Varoquaux, G., The NumPy array: a structure for efficient numerical computation, Comput. Sci. Eng., 13, 2, 22-30 (2011)
[52] Hart, WE; Laird, C.; Watson, J-P; Woodruff, DL, Pyomo-Optimization Modeling in Python (2012), Springer · Zbl 1233.90002
[53] Baudin, M., Christopoulou, M., Collette, Y., Martinez, J.-M.: pyDOE: The experimental design package for Python
[54] Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Prettenhofer, P., Weiss, R., Dubourg, V., Vanderplas, J., Passos, A., Cournapeau, D., Brucher, M., Perrot, M., Duchesnay, É.: Scikit-learn: machine learning in Python. J. Mach. Learn. Res. 12, 2825-2830 (2011) · Zbl 1280.68189
[55] Zhai, J.; Boukouvala, F., Nonlinear variable selection algorithms for surrogate modeling, AIChE J. (2019)
[56] Guzman, Y.A.: Theoretical Advances in Robust Optimization, Feature Selection, and Biomarker Discovery. Academic dissertations (Ph.D.), Princeton University (2016)
[57] Bound-constrained programs. http://www.minlp.com/nlp-and-minlp-test-problems (2019)
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. It attempts to reflect the references listed in the original paper as accurately as possible without claiming the completeness or perfect precision of the matching.