zbMATH — the first resource for mathematics

An experimental methodology for response surface optimization methods. (English) Zbl 1259.90102
Summary: Response surface methods, and global optimization techniques in general, are typically evaluated using a small number of standard synthetic test problems, in the hope that these are a good surrogate for real-world problems. We introduce a new, more rigorous methodology for evaluating global optimization techniques that is based on generating thousands of test functions and then evaluating algorithm performance on each one. The test functions are generated by sampling from a Gaussian process, which allows us to create a set of test functions that are interesting and diverse. They will have different numbers of modes, different maxima, etc., and yet they will be similar to each other in overall structure and level of difficulty. This approach allows for a much richer empirical evaluation of methods that is capable of revealing insights that would not be gained using a small set of test functions. To facilitate the development of large empirical studies for evaluating response surface methods, we introduce a dimension-independent measure of average test problem difficulty, and we introduce acquisition criteria that are invariant to vertical shifting and scaling of the objective function. We also use our experimental methodology to conduct a large empirical study of response surface methods. We investigate the influence of three properties (parameter estimation, exploration level, and gradient information) on the performance of response surface methods.
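The test-function generator described in the summary can be sketched in a few lines: draw a random function from a Gaussian process prior by forming the kernel matrix over a grid of inputs and multiplying its Cholesky factor by standard normal noise. This is a minimal illustrative sketch assuming a one-dimensional domain and a squared-exponential kernel; the paper's actual kernel, dimension, and hyperparameters may differ.

```python
import numpy as np

def sample_gp_test_function(n_points=200, length_scale=0.1, seed=0):
    """Sample one random 1-D test function from a GP prior.

    Illustrative only: the squared-exponential kernel and the parameter
    values here are assumptions, not the authors' exact settings.
    """
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 1.0, n_points)
    # Squared-exponential covariance: K[i, j] = exp(-(x_i - x_j)^2 / (2 l^2)).
    d = x[:, None] - x[None, :]
    K = np.exp(-0.5 * (d / length_scale) ** 2)
    # Add a small jitter for numerical stability, then draw f ~ N(0, K)
    # via the Cholesky factor L, since L z ~ N(0, LL^T) for z ~ N(0, I).
    L = np.linalg.cholesky(K + 1e-6 * np.eye(n_points))
    f = L @ rng.standard_normal(n_points)
    return x, f

x, f = sample_gp_test_function()
```

Each call with a different seed yields a different objective function, yet all draws share the same covariance structure, which is what makes the resulting test set diverse in detail but uniform in overall difficulty.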

90C26 Nonconvex programming, global optimization
Matlab; EGO; SPACE
Full Text: DOI
[1] Adler R.J., Taylor J.E.: Random Fields and Geometry. Springer, Berlin (2007) · Zbl 1149.60003
[2] Audet, C., Dennis, J.E. Jr., Moore, D., Booker, A., Frank, P.: A surrogate-model-based method for constrained optimization. In: Proceedings of the 8th AIAA/NASA/USAF/ISSMO Symposium on Multidisciplinary Analysis and Optimization. Paper No. AIAA-2000-4891 (2000)
[3] Bakker T.M.D.: Design Optimization With Kriging Models. Delft University Press, Delft (2000)
[4] Bardenet, R., Kégl, B.: Surrogating the surrogate: accelerating Gaussian-process-based global optimization with a mixture cross-entropy algorithm. In: Proceedings of the 27th International Conference on Machine Learning (2010)
[5] Boyle, P.: Gaussian processes for regression and optimisation. PhD thesis, Victoria University of Wellington (2006)
[6] Chernova, S., Veloso, M.: An evolutionary approach to gait learning for four-legged robots. In: Intelligent Robots and Systems (2004)
[7] Cox, D.D., John, S.: SDO: a statistical method for global optimization. In: Multidisciplinary Design Optimization (Hampton, VA, 1995), pp. 315–329. SIAM, Philadelphia, PA (1997)
[8] Csató L., Opper M.: Sparse representation for Gaussian process models. In: Leen, T.K., Dietterich, T.G., Tresp, V. (eds) Neural Information Processing Systems, vol. 13, The MIT Press, Cambridge, MA (2001) · Zbl 1001.68758
[9] Deb K., Thiele L., Laumanns M., Zitzler E.: Scalable test problems for evolutionary multiobjective optimization. In: Jain, L., Wu, X., Abraham, A., Goldberg, R. (eds) Evolutionary Multiobjective Optimization, pp. 105–145. Springer, Berlin (2005) · Zbl 1078.90567
[10] Dixon, L., Szegö, G. (eds.): Towards Global Optimisation, Chapter The global optimisation problem: an introduction, vol. 2, pp. 1–15. North-Holland, Amsterdam (1978)
[11] Elder J.F.: Global $\mathbb{R}^d$ optimization when probes are expensive: the GROPE algorithm. In: Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, Chicago, Illinois (1992)
[12] Floudas C.A., Gounaris C.E.: A review of recent advances in global optimization. J. Glob. Optim. 45, 3–38 (2009) · Zbl 1180.90245
[13] Gutmann H.-M.: A radial basis function method for global optimization. J. Glob. Optim. 19, 201–227 (2001) · Zbl 0972.90055
[14] Huang D., Allen T.T., Notz W.I., Zeng N.: Global optimization of stochastic black-box systems via sequential kriging meta-models. J. Glob. Optim. 34, 441–466 (2006) · Zbl 1098.90097
[15] Jones D., Perttunen C., Stuckman B.: Lipschitzian optimization without the Lipschitz constant. J. Optim. Theory Appl. 79(1), 157–181 (1993) · Zbl 0796.49032
[16] Jones D.R.: A taxonomy of global optimization methods based on response surfaces. J. Glob. Optim. 21, 345–383 (2001) · Zbl 1172.90492
[17] Jones D.R., Schonlau M., Welch W.J.: Efficient global optimization of expensive black-box functions. J. Glob. Optim. 13, 455–492 (1998) · Zbl 0917.90270
[18] Khare, V., Yao, X., Deb, K.: Performance scaling of multi-objective evolutionary algorithms. In: Evolutionary Multi-Criterion Optimization, volume 2632/2003 of Lecture Notes in Computer Science. Springer, Berlin (2003) · Zbl 1036.90541
[19] Kushner H.: A new method of locating the maximum of an arbitrary multipeak curve in the presence of noise. J. Basic Eng. 86, 97–106 (1964)
[20] Lawrence N., Seeger M., Herbrich R.: Fast sparse Gaussian process methods: the informative vector machine. In: Becker, S., Thrun, S., Obermayer, K. (eds) Advances in Neural Information Processing Systems, vol. 15, pp. 609–616. MIT Press, Cambridge, MA (2003)
[21] Leary S.J., Bhaskar A., Keane A.J.: A derivative based surrogate model for approximating and optimizing the output of an expensive computer simulation. J. Glob. Optim. 30, 39–58 (2004) · Zbl 1136.90302
[22] Li R., Sudjianto A.: Analysis of computer experiments using penalized likelihood in Gaussian kriging models. Technometrics 47, 111–120 (2005)
[23] Lizotte, D.J.: Practical Bayesian optimization. PhD thesis, University of Alberta, Alberta (2008)
[24] Lizotte, D.J., Wang, T., Bowling, M., Schuurmans, D.: Automatic gait optimization with Gaussian process regression. In: Proceedings of the International Joint Conference on Artificial Intelligence (2007)
[25] MATLAB: Version 7.10.0 (R2010a). The MathWorks Inc., Natick, MA (2010)
[26] Mockus J.: Bayesian Approach to Global Optimization. Kluwer Academic Publishers, Dordrecht (1989) · Zbl 0693.49001
[27] Perttunen, C. D.: A nonparametric global optimization method using the rank transformation. In: Proceedings of the IEEE Conference on Systems, Man, and Cybernetics, vol. 1, pp. 888–893 (2007)
[28] Rasmussen C.E.: Gaussian processes in machine learning. In: Bousquet, O., von Luxburg, U., Rätsch, G. (eds) Advanced Lectures in Machine Learning: ML Summer Schools 2003, Springer, Berlin (2004) · Zbl 1120.68436
[29] Rasmussen C.E., Williams C.K.I.: Gaussian Processes for Machine Learning, chapter 9.4 Derivative Observations. MIT Press, Cambridge (2006a)
[30] Rasmussen C.E., Williams C.K.I.: Gaussian Processes for Machine Learning, chapter 5 Model Selection and Adaptation of Hyperparameters. MIT Press, Cambridge (2006b)
[31] Rasmussen C.E., Williams C.K.I.: Gaussian Processes for Machine Learning, chapter 4.2 Examples of Covariance Functions. MIT Press, Cambridge (2006c)
[32] Rasmussen C.E., Williams C.K.I.: Gaussian Processes for Machine Learning, chapter 2 Regression. MIT Press, Cambridge (2006d) · Zbl 1177.68165
[33] Regis R.G., Shoemaker C.A.: Improved strategies for radial basis function methods for global optimization. J. Glob. Optim. 37(1), 113–135 (2007) · Zbl 1149.90120
[34] Sacks J., Welch W.J., Mitchell T.J., Wynn H.P.: Design and analysis of computer experiments. Stat. Sci. 4, 409–435 (1989) · Zbl 0955.62619
[35] Santner T.J., Williams B.J., Notz W.I.: The Design and Analysis of Computer Experiments. Springer, Berlin (2003) · Zbl 1041.62068
[36] Sasena, M.J.: Flexibility and efficiency enhancements for constrained global design optimization with Kriging approximations. PhD thesis, University of Michigan, Ann Arbor (2002)
[37] Schonlau, M.: Computer experiments and global optimization. PhD thesis, University of Waterloo, Waterloo (1997)
[38] Shawe-Taylor J., Cristianini N.: Kernel Methods for Pattern Analysis, chapter Properties of Kernels. Cambridge University Press, Cambridge (2004)
[39] Srinivas, N., Krause, A., Kakade, S., Seeger, M.: Gaussian process optimization in the bandit setting: no regret and experimental design. In: Proceedings of the 27th International Conference on Machine Learning (2010) · Zbl 1365.94131
[40] Stuckman, B.: A global search method for optimizing nonlinear systems. In: Proceedings of the IEEE Conference on Systems, Man, and Cybernetics, vol. 1, pp. 965–977 (1989) · Zbl 0673.65034