# zbMATH — the first resource for mathematics

Why simple quadrature is just as good as Monte Carlo. (English) Zbl 07187402
Summary: We motivate and calculate Newton-Cotes quadrature integration variance and compare it directly with Monte Carlo (MC) integration variance. We find an equivalence between deterministic quadrature sampling and random MC sampling by noting that MC random sampling is statistically indistinguishable from a method that uses deterministic sampling on a randomly shuffled (permuted) function. We use this statistical equivalence to regularize the form of permissible Bayesian quadrature integration priors such that they are guaranteed to be objectively comparable with MC. This leads to the proof that simple quadrature methods have expected variances that are less than or equal to their corresponding theoretical MC integration variances. Separately, using Bayesian probability theory, we find that the theoretical standard deviations of the unbiased errors of simple Newton-Cotes composite quadrature integrations improve over their worst-case errors by an extra dimension-independent factor $$\propto N^{-\frac{1}{2}}$$. This dimension-independent factor is validated in our simulations.
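The variance comparison at the heart of the summary can be illustrated numerically. The following is a minimal sketch (not the paper's code): it compares the root-mean-square error of plain MC integration against the error of an equal-budget composite midpoint rule for a smooth one-dimensional integrand, where the integrand `math.exp` and sample size `n` are illustrative choices.

```python
import math
import random

def mc_estimate(f, n, rng):
    # Monte Carlo: average of f at n uniform random points on [0, 1]
    return sum(f(rng.random()) for _ in range(n)) / n

def midpoint_estimate(f, n):
    # Composite midpoint rule: average of f at n evenly spaced midpoints
    return sum(f((i + 0.5) / n) for i in range(n)) / n

f = math.exp              # illustrative integrand; exact integral on [0, 1] is e - 1
exact = math.e - 1.0
n = 1000                  # same evaluation budget for both methods

rng = random.Random(0)
trials = 200
mc_rmse = math.sqrt(sum((mc_estimate(f, n, rng) - exact) ** 2
                        for _ in range(trials)) / trials)
quad_err = abs(midpoint_estimate(f, n) - exact)

print(f"MC RMSE over {trials} trials: {mc_rmse:.2e}")
print(f"Midpoint-rule error:          {quad_err:.2e}")
```

For a smooth integrand the MC error scales as $N^{-1/2}$ while the composite midpoint error scales as $N^{-2}$, so at equal budget the deterministic rule comes out far ahead, consistent with the paper's thesis that simple quadrature is at least as good as MC.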
##### MSC:
 65C05 Monte Carlo methods
 65D32 Numerical quadrature and cubature formulas
 68W20 Randomized algorithms
 68W25 Approximation algorithms
##### Software:
DAKOTA; Stan
##### References:
 [1] B. Adams, L. Bauman, W. Bohnhoff, K. Dalbey, M. Ebeida, J. Eddy, M. Eldred, P. Hough, K. Hu, J. Jakeman, J. Stephens, L. Swiler, D. Vigil and T. Wildey, Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis: Version 6.3 user's manual, Sandia Labs Technical Report SAND2014-4633, New Mexico, 2015.
 [2] F. X. Briol, C. Oates, M. Girolami and M. A. Osborne, Frank-Wolfe Bayesian quadrature: Probabilistic integration with theoretical guarantees, Adv. Neural Inform. Process. Syst. 28 (2015), 1162-1170.
 [3] B. Carpenter, A. Gelman, M. D. Hoffman, D. Lee, B. Goodrich, M. Betancourt, M. Brubaker, J. Guo, P. Li and A. Riddell, Stan: A probabilistic programming language, J. Statist. Softw. 76 (2017), 1-31.
 [4] A. Caticha and A. Giffin, Updating probabilities, AIP Conf. Proc. 872 (2006), 31-42.
 [5] W. Gilks and P. Wild, Adaptive rejection sampling for Gibbs sampling, J. R. Stat. Soc. Ser. C. Appl. Stat. 41 (1992), 337-348. · Zbl 0825.62407
 [6] O. P. Le Maître and O. M. Knio, Spectral Methods for Uncertainty Quantification, Springer, New York, 2010. · Zbl 1193.76003
 [7] N. Metropolis, A. W. Rosenbluth, M. N. Rosenbluth, A. H. Teller and E. Teller, Equation of state calculations by fast computing machines, J. Chem. Phys. 21 (1953), 1087-1092. · Zbl 1431.65006
 [8] R. M. Neal, Slice sampling, Ann. Statist. 31 (2003), 705-741. · Zbl 1051.65007
 [9] A. O'Hagan, Monte Carlo is fundamentally unsound, J. R. Stat. Soc. Ser. D. Statist. 2/3 (1987), 247-249.
 [10] A. O'Hagan, Bayes-Hermite quadrature, J. Statist. Plann. Inference 3 (1991), 245-260. · Zbl 0829.65024
 [11] K. Vanslette, The inferential design of entropy and its application to quantum measurements, Ph.D. thesis, University at Albany SUNY, 2018.