Fast approximate Bayesian computation for estimating parameters in differential equations. (English) Zbl 1505.62158

Summary: Approximate Bayesian computation (ABC) using a sequential Monte Carlo method provides a comprehensive platform for parameter estimation, model selection and sensitivity analysis in differential equations. However, this method, like other Monte Carlo methods, incurs a significant computational cost because it requires explicit numerical integration of the differential equations to carry out inference. In this paper we propose a novel method that circumvents the need for explicit integration by using derivatives of Gaussian processes to smooth the observations from which parameters are estimated. We evaluate our method on synthetic data generated from model biological systems described by ordinary and delay differential equations. Comparing its performance to existing ABC techniques, we demonstrate that it produces comparably reliable parameter estimates with a significant reduction in execution time.
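The core idea the summary describes, often called gradient matching, can be sketched in a few lines: fit a Gaussian process to the noisy observations, differentiate the GP posterior mean analytically, and then accept parameter proposals whose right-hand side matches that estimated derivative, so no ODE solver is ever invoked. The sketch below is illustrative only, not the paper's actual algorithm (which uses sequential Monte Carlo rather than plain rejection ABC); the logistic ODE, the fixed RBF hyperparameters `ell`, `sf`, `sn`, and the uniform prior are all assumptions chosen for a self-contained example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from the logistic ODE dx/dt = r * x * (1 - x), true r = 1.5.
r_true, x0 = 1.5, 0.1
t = np.linspace(0.0, 3.0, 25)
x_exact = x0 * np.exp(r_true * t) / (1.0 + x0 * (np.exp(r_true * t) - 1.0))
y = x_exact + rng.normal(0.0, 0.01, t.size)

# GP regression with an RBF kernel (hyperparameters fixed for simplicity;
# in practice they would be optimised by marginal likelihood).
ell, sf, sn = 0.7, 1.0, 0.05
def rbf(a, b):
    d = a[:, None] - b[None, :]
    return sf**2 * np.exp(-0.5 * (d / ell) ** 2)

K = rbf(t, t) + sn**2 * np.eye(t.size)
alpha = np.linalg.solve(K, y)

# GP posterior mean and its analytic time derivative at the observed times.
k_star = rbf(t, t)
m = k_star @ alpha
dk = -((t[:, None] - t[None, :]) / ell**2) * k_star  # d/dt* of the RBF kernel
dm = dk @ alpha

# Rejection ABC via gradient matching: the discrepancy compares the model's
# right-hand side, evaluated on the smoothed states, against the GP derivative.
def discrepancy(r):
    return np.mean((r * m * (1.0 - m) - dm) ** 2)

candidates = rng.uniform(0.0, 3.0, 20000)       # prior draws for r
dists = np.array([discrepancy(r) for r in candidates])
eps = np.quantile(dists, 0.01)                  # keep the closest 1%
posterior = candidates[dists <= eps]
print(posterior.mean())                         # concentrates near r_true
```

Because the discrepancy only needs the GP mean and its derivative, each of the 20,000 proposals costs one vectorised arithmetic pass instead of one numerical integration of the ODE, which is the source of the speed-up the summary reports.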

MSC:

62-08 Computational methods for problems pertaining to statistics
62F15 Bayesian inference
62P10 Applications of statistics to biology and medical sciences; meta analysis
65C05 Monte Carlo methods
68T05 Learning and adaptive systems in artificial intelligence
