Quasi-random sampling for multivariate distributions via generative neural networks. (English) Zbl 07499908

Summary: Generative moment matching networks (GMMNs) are introduced for generating approximate quasi-random samples from multivariate models with any underlying copula to compute estimates with variance reduction. So far, quasi-random sampling for multivariate distributions required a careful design, exploiting specific properties (such as conditional distributions) of the implied parametric copula or the underlying quasi-Monte Carlo (QMC) point set, and was only tractable for a small number of models. Using GMMNs allows one to construct approximate quasi-random samples for a much larger variety of multivariate distributions without such restrictions, including empirical ones from real data with dependence structures not well captured by parametric copulas. Once trained on pseudo-random samples from a parametric model or on real data, these neural networks only require a multivariate standard uniform randomized QMC point set as input and are thus fast in estimating expectations of interest under dependence with variance reduction. Numerical examples are considered to demonstrate the approach, including applications inspired by risk management practice.
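The pipeline described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: the trained GMMN generator is stood in for by a small random-weight MLP (the weights `W1`, `b1`, `W2`, `b2` would in practice come from training against samples of the target copula), and the randomized QMC input is a rank-1 lattice with a Cranley-Patterson random shift; the generating vector is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(42)
d, n = 2, 512  # dimension of the copula, number of RQMC points

# 1) Randomized QMC input: rank-1 lattice with a Cranley-Patterson shift.
gen_vec = np.array([1, 233])              # illustrative generating vector
shift = rng.random(d)                     # uniform random shift for randomization
i = np.arange(n)[:, None]
u = (i * gen_vec / n + shift) % 1.0       # n points in [0, 1)^d

# 2) Push the RQMC points through the generator network.
#    Hypothetical stand-in for a trained GMMN: one hidden layer, random weights.
W1 = rng.standard_normal((d, 16)); b1 = np.zeros(16)
W2 = rng.standard_normal((16, d)); b2 = np.zeros(d)
h = np.tanh(u @ W1 + b1)
v = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid keeps outputs in (0, 1)^d

# 3) Estimate an expectation under the approximate dependent sample,
#    here a toy payoff E[max(V1 + V2 - 1, 0)].
estimate = np.maximum(v.sum(axis=1) - 1.0, 0.0).mean()
```

Because the inputs are a randomized low-discrepancy point set rather than pseudo-random uniforms, repeating step 1 with independent shifts yields replicates of `estimate` whose variance is typically smaller than under plain Monte Carlo, which is the variance-reduction effect the summary refers to.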

MSC:

62-XX Statistics
