
Nonlinear information fusion algorithms for data-efficient multi-fidelity modelling. (English) Zbl 1407.62252

Summary: Multi-fidelity modelling enables accurate inference of quantities of interest by synergistically combining realizations of low-cost/low-fidelity models with a small set of high-fidelity observations. This is particularly effective when the low- and high-fidelity models exhibit strong correlations, and can lead to significant computational gains over approaches that solely rely on high-fidelity models. However, in many cases of practical interest, low-fidelity models can only be well correlated to their high-fidelity counterparts for a specific range of input parameters, and potentially return wrong trends and erroneous predictions if probed outside of their validity regime. Here we put forth a probabilistic framework based on Gaussian process regression and nonlinear autoregressive schemes that is capable of learning complex nonlinear and space-dependent cross-correlations between models of variable fidelity, and can effectively safeguard against low-fidelity models that provide wrong trends. This introduces a new class of multi-fidelity information fusion algorithms that provide a fundamental extension to the existing linear autoregressive methodologies, while still maintaining the same algorithmic complexity and overall computational cost. The performance of the proposed methods is tested in several benchmark problems involving both synthetic and real multi-fidelity datasets from computational fluid dynamics simulations.
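The core idea of the nonlinear autoregressive scheme described above can be sketched in a few lines: a first Gaussian process is fit to the plentiful low-fidelity data, and a second Gaussian process is then fit to the scarce high-fidelity data on an input space augmented with the first level's prediction, so that it can learn an arbitrary nonlinear, space-dependent map from low- to high-fidelity output. The sketch below is illustrative only, not the authors' implementation: the paper works with GPy and propagates the full predictive uncertainty between levels, whereas this minimal version passes only the posterior mean; the synthetic low/high-fidelity pair is chosen here for demonstration.

```python
# Minimal two-level nonlinear autoregressive GP sketch (illustrative, not the
# paper's implementation; uses scikit-learn and passes only the posterior mean
# of the low-fidelity GP to the high-fidelity level).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Synthetic model pair with a nonlinear cross-correlation (chosen here):
f_low = lambda x: np.sin(8.0 * np.pi * x)
f_high = lambda x: (x - np.sqrt(2.0)) * f_low(x) ** 2

X_lo = np.linspace(0.0, 1.0, 50)[:, None]   # plentiful cheap observations
X_hi = np.linspace(0.0, 1.0, 14)[:, None]   # few expensive observations

# Level 1: GP on the low-fidelity data alone.
gp_lo = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(),
    alpha=1e-6, normalize_y=True, n_restarts_optimizer=5, random_state=0)
gp_lo.fit(X_lo, f_low(X_lo).ravel())

# Level 2: GP whose inputs are augmented with the level-1 posterior mean,
# so it learns a nonlinear map f_high(x) ~ g(x, f_low(x)) rather than a
# linear scaling between fidelities.
aug = lambda X: np.hstack([X, gp_lo.predict(X)[:, None]])
gp_hi = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=[1.0, 1.0]),  # ARD over (x, f_lo)
    alpha=1e-6, normalize_y=True, n_restarts_optimizer=5, random_state=0)
gp_hi.fit(aug(X_hi), f_high(X_hi).ravel())

# Evaluate the multi-fidelity surrogate on a dense test grid.
X_test = np.linspace(0.0, 1.0, 200)[:, None]
pred = gp_hi.predict(aug(X_test))
rmse = np.sqrt(np.mean((pred - f_high(X_test).ravel()) ** 2))
```

With only 14 high-fidelity samples, the augmented-input GP typically recovers the high-fidelity function far better than a single-fidelity GP trained on the same 14 points, since the oscillatory structure is supplied by the cheap level and only the smooth cross-correlation map must be learned from the expensive data.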

MSC:

62J02 General nonlinear regression
62M10 Time series, auto-correlation, regression, etc. in statistics (GARCH)
62H30 Classification and discrimination; cluster analysis (statistical aspects)
62P35 Applications of statistics to physics
