
Simultaneous inference for pairwise graphical models with generalized score matching. (English) Zbl 1502.62073

Summary: Probabilistic graphical models provide a flexible yet parsimonious framework for modeling dependencies among nodes in networks. There is a vast literature on parameter estimation and consistent model selection for graphical models. In many applications, however, scientists are also interested in quantifying the uncertainty associated with the estimated parameters and selected models, which the current literature has not addressed thoroughly. In this paper, we propose a novel estimator for statistical inference on edge parameters in pairwise graphical models based on the generalized Hyvärinen scoring rule. The Hyvärinen scoring rule is especially useful when the normalizing constant cannot be computed efficiently in closed form, a common problem for graphical models, including Ising models and truncated Gaussian graphical models. Our estimator allows us to perform statistical inference for general graphical models, whereas existing work mostly focuses on inference for Gaussian graphical models, where the normalizing constant is computationally tractable. Under mild conditions that are typically assumed in the literature for consistent estimation, we prove that our proposed estimator is \(\sqrt{n}\)-consistent and asymptotically normal, which allows us to construct confidence intervals and hypothesis tests for edge parameters. Moreover, we show how our proposed method can be applied to test hypotheses that involve a large number of model parameters simultaneously. We illustrate the validity of our estimator through extensive simulation studies on a diverse collection of data-generating processes.
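
As background for the summary, the following display is a minimal sketch of the standard Hyvärinen score matching objective on \(\mathbb{R}^d\); the generalized scoring rule used in the paper modifies it (for example, by weighting coordinate-wise derivatives) to handle truncated, non-negative, or discrete supports, so its exact form may differ from what is shown here. Given i.i.d. observations \(x^{(1)},\dots,x^{(n)}\) from a model known only up to its normalizing constant, \(p(x;\theta)\propto\exp\{-E(x;\theta)\}\), the empirical objective is
\[
\widehat{J}(\theta)=\frac{1}{n}\sum_{i=1}^{n}\sum_{j=1}^{d}\Big[\partial_j^2\log p\big(x^{(i)};\theta\big)+\tfrac{1}{2}\big(\partial_j\log p\big(x^{(i)};\theta\big)\big)^2\Big].
\]
Because only derivatives of \(\log p\) with respect to \(x\) appear, the intractable normalizing constant \(Z(\theta)\) cancels, which is what makes the approach attractive for Ising and truncated Gaussian graphical models.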

MSC:

62H22 Probabilistic graphical models
62A09 Graphical methods in statistics
62-08 Computational methods for problems pertaining to statistics
62R07 Statistical aspects of big data and data science

Software:

XBART; ROCKET
Full Text: arXiv Link

References:

[1] B. C. Arnold, E. Castillo, and J. M. Sarabia. Conditional specification of statistical models. Springer Series in Statistics. Springer-Verlag, New York, 1999. · Zbl 0932.62001
[2] R. R. Bahadur. A note on quantiles in large samples. Ann. Math. Statist., 37:577-580, 1966. · Zbl 0147.18805
[3] R. F. Barber and M. Kolar. ROCKET: Robust confidence intervals via Kendall's tau for transelliptical graphical models. Ann. Statist., 46(6B):3422-3450, 2018. · Zbl 1410.62059
[4] A. Belloni and V. Chernozhukov. Least squares after model selection in high-dimensional sparse models. Bernoulli, 19(2):521-547, 2013. · Zbl 1456.62066
[5] A. Belloni, V. Chernozhukov, and C. B. Hansen. Inference on treatment effects after selection amongst high-dimensional controls. Rev. Econ. Stud., 81(2):608-650, 2013. · Zbl 1409.62142
[6] Z. I. Botev. The normal law under linear restrictions: simulation and estimation via minimax tilting. J. R. Stat. Soc. Ser. B. Stat. Methodol., 79(1):125-148, 2017. · Zbl 1414.62102
[7] T. T. Cai, W. Liu, and X. Luo. A constrained \(\ell_1\) minimization approach to sparse precision matrix estimation. J. Am. Stat. Assoc., 106(494):594-607, 2011. · Zbl 1232.62087
[8] J. Cao and C. Dowd. Estimation and inference for synthetic control methods with spillover effects. arXiv preprint arXiv:1902.07343, 2019.
[9] J. Cao and S. Lu. Synthetic control inference for staggered adoption: Estimating the dynamic effects of board gender diversity policies. arXiv preprint arXiv:1912.06320, 2019.
[10] J. Chang, Y. Qiu, Q. Yao, and T. Zou. Confidence regions for entries of a large precision matrix. Journal of Econometrics, 206(1):57-82, 2018. · Zbl 1398.62068
[11] S. Chen, D. M. Witten, and A. Shojaie. Selection and estimation for mixed graphical models. Biometrika, 102(1):47-64, 2015. · Zbl 1345.62081
[12] J. Cheng, E. Levina, and J. Zhu. High-dimensional mixed graphical models. arXiv preprint arXiv:1304.2810, 2013.
[13] V. Chernozhukov, D. Chetverikov, and K. Kato. Gaussian approximations and multiplier bootstrap for maxima of sums of high-dimensional random vectors. Ann. Stat., 41(6):2786-2819, 2013. · Zbl 1292.62030
[14] P. Danaher, P. Wang, and D. M. Witten. The joint graphical lasso for inverse covariance estimation across multiple classes. J. R. Stat. Soc. B, 76(2):373-397, 2014. · Zbl 07555455
[15] A. d'Aspremont, O. Banerjee, and L. El Ghaoui. First-order methods for sparse covariance selection. SIAM J. Matrix Anal. Appl., 30(1):56-66, 2008. · Zbl 1156.90423
[16] A. P. Dawid. The geometry of proper scoring rules. Ann. Inst. Statist. Math., 59(1):77-93, 2007. · Zbl 1108.62009
[17] A. P. Dempster. Covariance selection. Biometrics, 28:157-175, 1972.
[18] R. Dezeure, P. Bühlmann, and C.-H. Zhang. High-dimensional simultaneous inference with the bootstrap. TEST, 26(4):685-719, 2017. · Zbl 06833591
[19] M. Drton and M. H. Maathuis. Structure learning in graphical modeling. Annual Review of Statistics and Its Application, 4(1):365-393, 2017.
[20] M. Drton and M. D. Perlman. Model selection for Gaussian concentration graphs. Biometrika, 91(3):591-602, 2004. · Zbl 1108.62098
[21] J. Fan, Y. Feng, and Y. Wu. Network exploration via the adaptive lasso and SCAD penalties. Ann. Appl. Stat., 3(2):521-541, 2009. · Zbl 1166.62040
[22] J. Fan, H. Liu, Y. Ning, and H. Zou. High dimensional semiparametric latent graphical model for mixed data. J. R. Stat. Soc. Ser. B. Stat. Methodol., 79(2):405-421, 2017. · Zbl 1414.62179
[23] P. G. M. Forbes and S. L. Lauritzen. Linear estimating equations for exponential families with application to Gaussian linear concentration models. Linear Algebra Appl., 473:261-283, 2015. · Zbl 1312.62068
[24] J. H. Friedman, T. J. Hastie, and R. J. Tibshirani. Sparse inverse covariance estimation with the graphical lasso. Biostatistics, 9(3):432-441, 2008. · Zbl 1143.62076
[25] A. Gelman and X.-L. Meng. A note on bivariate distributions that are conditionally normal. The American Statistician, 45(2):125-126, 1991.
[26] S. Geng, M. Kolar, and O. Koyejo. Joint nonparametric precision matrix estimation with confounding. CoRR, abs/1810.07147, 2018.
[27] S. Geng, M. Yan, M. Kolar, and S. Koyejo. Partially linear additive Gaussian graphical models. In K. Chaudhuri and R. Salakhutdinov, editors, Proceedings of the 36th International Conference on Machine Learning, volume 97 of Proceedings of Machine Learning Research, pages 2180-2190, Long Beach, California, USA, 2019. PMLR.
[28] J. Guo, E. Levina, G. Michailidis, and J. Zhu. Joint estimation of multiple graphical models. Biometrika, 98(1):1-15, 2011a. · Zbl 1214.62058
[29] J. Guo, E. Levina, G. Michailidis, and J. Zhu. Asymptotic properties of the joint neighborhood selection method for estimating categorical Markov networks. Technical report, University of Michigan, 2011b.
[30] P. R. Hahn, C. M. Carvalho, D. Puelz, J. He, et al. Regularization and confounding in linear regression for treatment effect estimation. Bayesian Analysis, 13(1):163-182, 2018. · Zbl 06873722
[31] P. R. Hahn, J. He, and H. F. Lopes. Efficient sampling for Gaussian linear regression with arbitrary priors. Journal of Computational and Graphical Statistics, 28(1):142-154, 2019. · Zbl 07499018
[32] J. He and P. R. Hahn. Stochastic tree ensembles for regularized nonlinear regression. arXiv preprint arXiv:2002.03375, 2020.
[33] J. He, S. Yalov, and P. R. Hahn. XBART: Accelerated Bayesian additive regression trees. arXiv preprint arXiv:1810.02215, 2018.
[34] H. Höfling and R. J. Tibshirani. Estimation of sparse binary pairwise Markov networks using pseudo-likelihoods. J. Mach. Learn. Res., 10:883-906, 2009. · Zbl 1245.62121
[35] A. Hyvärinen. Estimation of non-normalized statistical models by score matching. J. Mach. Learn. Res., 6:695-709, 2005. · Zbl 1222.62051
[36] A. Hyvärinen. Some extensions of score matching. Comput. Stat. Data Anal., 51(5):2499-2512, 2007. · Zbl 1161.62326
[37] D. Inouye, P. Ravikumar, and I. Dhillon. Square root graphical models: Multivariate generalizations of univariate exponential families that permit positive dependencies. In M. F. Balcan and K. Q. Weinberger, editors, Proceedings of The 33rd International Conference on Machine Learning, volume 48 of Proceedings of Machine Learning Research, pages 2445-2453, New York, New York, USA, 2016. PMLR.
[38] J. Janková and S. van de Geer. Confidence intervals for high-dimensional inverse covariance estimation. Electron. J. Stat., 9(1):1205-1229, 2015. · Zbl 1328.62458
[39] J. Janková and S. van de Geer. Inference in high-dimensional graphical models. In Handbook of graphical models, Chapman & Hall/CRC Handb. Mod. Stat. Methods, pages 325-349. CRC Press, Boca Raton, FL, 2019. · Zbl 1442.62143
[40] J. Janková and S. A. van de Geer. Honest confidence regions and optimality in high-dimensional precision matrix estimation. TEST, 26(1):143-162, 2017. · Zbl 1368.62204
[41] A. Javanmard and A. Montanari. Confidence intervals and hypothesis testing for high-dimensional regression. J. Mach. Learn. Res., 15(Oct):2869-2909, 2014. · Zbl 1319.62145
[42] B. Kim, S. Liu, and M. Kolar. Two-sample inference for high-dimensional Markov networks. arXiv preprint arXiv:1905.00466, 2019.
[43] M. Kolar and E. P. Xing. On time varying undirected graphs. In G. J. Gordon, D. B. Dunson, and M. Dudík, editors, Proceedings of the Fourteenth International Conference on Artificial Intelligence and Statistics, AISTATS 2011, Fort Lauderdale, USA, April 11-13, 2011, volume 15 of JMLR Proceedings, pages 407-415. JMLR.org, 2011.
[44] M. Kolar and E. P. Xing. Estimating networks with jumps. Electron. J. Stat., 6:2069-2106, 2012. · Zbl 1295.62032
[45] M. Kolar, A. P. Parikh, and E. P. Xing. On sparse nonparametric conditional covariance selection. In J. Fürnkranz and T. Joachims, editors, Proceedings of the 27th International Conference on Machine Learning (ICML-10), June 21-24, 2010, Haifa, Israel, pages 559-566. Omnipress, 2010a.
[46] M. Kolar, L. Song, A. Ahmed, and E. P. Xing. Estimating time-varying networks. Ann. Appl. Stat., 4(1):94-123, 2010b. · Zbl 1189.62142
[47] M. Kolar, H. Liu, and E. Xing. Markov network estimation from multi-attribute data. In S. Dasgupta and D. McAllester, editors, Proceedings of the 30th International Conference on Machine Learning, volume 28 of Proceedings of Machine Learning Research, pages 73-81, Atlanta, Georgia, USA, 2013. PMLR.
[48] M. Kolar, H. Liu, and E. P. Xing. Graph estimation from multi-attribute data. J. Mach. Learn. Res., 15(1):1713-1750, 2014. · Zbl 1319.62113
[49] C. Lam and J. Fan. Sparsistency and rates of convergence in large covariance matrix estimation. Ann. Stat., 37:4254-4278, 2009. · Zbl 1191.62101
[50] S. L. Lauritzen. Graphical Models, volume 17 of Oxford Statistical Science Series. The Clarendon Press Oxford University Press, New York, 1996. Oxford Science Publications. · Zbl 0907.62001
[51] J. D. Lee and T. J. Hastie. Learning the structure of mixed graphical models. J. Comput. Graph. Statist., 24(1):230-253, 2015.
[52] H. Leeb and B. M. Pötscher. Can one estimate the unconditional distribution of post-model-selection estimators? Econ. Theory, 24(02):338-376, 2007. · Zbl 1284.62152
[53] K. T. Li. Statistical inference for average treatment effects estimated by synthetic control methods. Journal of the American Statistical Association. · Zbl 1453.62330
[54] K.-C. Li, A. Palotie, S. Yuan, D. Bronnikov, D. Chen, X. Wei, O.-W. Choi, J. Saarela, and L. Peltonen. Finding disease candidate genes by liquid association. Genome Biology, 8(10):R205, 2007.
[55] L. Lin, M. Drton, and A. Shojaie. Estimation of high-dimensional graphical models using regularized score matching. Electron. J. Stat., 10(1):806-854, 2016. · Zbl 1336.62130
[56] H. Liu and L. Wang. TIGER: a tuning-insensitive approach for optimally estimating Gaussian graphical models. Electron. J. Stat., 11(1):241-294, 2017. · Zbl 1395.62007
[57] H. Liu, J. D. Lafferty, and L. A. Wasserman. The nonparanormal: Semiparametric estimation of high dimensional undirected graphs. J. Mach. Learn. Res., 10:2295-2328, 2009. · Zbl 1235.62035
[58] H. Liu, F. Han, M. Yuan, J. D. Lafferty, and L. A. Wasserman. High-dimensional semiparametric Gaussian copula graphical models. Ann. Stat., 40(4):2293-2326, 2012a. · Zbl 1297.62073
[59] H. Liu, F. Han, and C. Zhang. Transelliptical graphical models. In P. L. Bartlett, F. C. N. Pereira, C. J. C. Burges, L. Bottou, and K. Q. Weinberger, editors, Advances in Neural Information Processing Systems 25: 26th Annual Conference on Neural Information Processing Systems 2012. Proceedings of a meeting held December 3-6, 2012, Lake Tahoe, Nevada, United States, pages 809-817, 2012b.
[60] W. Liu. Gaussian graphical model estimation with false discovery rate control. Ann. Stat., 41(6):2948-2978, 2013. · Zbl 1288.62094
[61] W. Liu and X. Luo. Fast and adaptive sparse precision matrix estimation in high dimensions. J. Multivar. Anal., 135:153-162, 2015. · Zbl 1307.62148
[62] W. Liu and Q.-M. Shao. A Cramér moderate deviation theorem for Hotelling's \(T^2\)-statistic with applications to global tests. Ann. Stat., 41(1):296-322, 2013. · Zbl 1347.62032
[63] J. Lu, M. Kolar, and H. Liu. Post-regularization inference for time-varying nonparanormal graphical models. Journal of Machine Learning Research, 18(203):1-78, 2018. · Zbl 1473.62198
[64] C. Ma, J. Lu, and H. Liu. Inter-subject analysis: Inferring sparse interactions with dense intra-graphs. arXiv preprint arXiv:1709.07036, 2017.
[65] N. Meinshausen and P. Bühlmann. High dimensional graphs and variable selection with the lasso. Ann. Stat., 34(3):1436-1462, 2006. · Zbl 1113.62082
[66] S. Na, M. Kolar, and O. Koyejo. Estimating differential latent variable graphical models with applications to brain connectivity. arXiv preprint arXiv:1909.05892, 2019.
[67] S. N. Negahban, P. Ravikumar, M. J. Wainwright, and B. Yu. A unified framework for high-dimensional analysis of \(M\)-estimators with decomposable regularizers. Stat. Sci., 27(4):538-557, 2012. · Zbl 1331.62350
[68] J. Neyman. Optimal asymptotic tests of composite statistical hypotheses. Probability and Statistics, 57:213, 1959. · Zbl 0104.12602
[69] Y. Ning and H. Liu. A general theory of hypothesis tests and confidence regions for sparse high dimensional models. Ann. Statist., 45(1):158-195, 2017. · Zbl 1364.62128
[70] M. Parry, A. P. Dawid, and S. L. Lauritzen. Proper local scoring rules. Ann. Stat., 40(1):561-592, 2012. · Zbl 1246.62011
[71] S. L. Portnoy. Asymptotic behavior of likelihood methods for exponential families when the number of parameters tends to infinity. Ann. Stat., 16(1):356-366, 1988. · Zbl 0637.62026
[72] B. M. Pötscher. Confidence sets based on sparse estimators are necessarily large. Sankhyā, 71(1, Ser. A):1-18, 2009. · Zbl 1192.62096
[73] P. Ravikumar, M. J. Wainwright, G. Raskutti, and B. Yu. High-dimensional covariance estimation by minimizing \(\ell_1\)-penalized log-determinant divergence. Electron. J. Stat., 5:935-980, 2011. · Zbl 1274.62190
[74] P. Ravikumar, M. J. Wainwright, and J. D. Lafferty. High-dimensional Ising model selection using \(\ell_1\)-regularized logistic regression. Ann. Stat., 38(3):1287-1319, 2010. · Zbl 1189.62115
[75] Z. Ren, T. Sun, C.-H. Zhang, and H. H. Zhou. Asymptotic normality and optimalities in estimation of large Gaussian graphical models. Ann. Stat., 43(3):991-1026, 2015. · Zbl 1328.62342
[76] A. J. Rothman, P. J. Bickel, E. Levina, and J. Zhu. Sparse permutation invariant covariance estimation. Electron. J. Stat., 2:494-515, 2008. · Zbl 1320.62135
[77] K. Sachs, O. Perez, D. Pe'er, D. A. Lauffenburger, and G. P. Nolan. Causal protein-signaling networks derived from multiparameter single-cell data. Science, 308(5721):523-529, 2005.
[78] B. Sriperumbudur, K. Fukumizu, A. Gretton, A. Hyvärinen, and R. Kumar. Density estimation in infinite dimensional exponential families. J. Mach. Learn. Res., 18: Paper No. 57, 59, 2017. · Zbl 1440.62125
[79] A. S. Suggala, M. Kolar, and P. Ravikumar. The Expxorcist: Nonparametric graphical models via conditional exponential densities. In I. Guyon, U. von Luxburg, S. Bengio, H. M. Wallach, R. Fergus, S. V. N. Vishwanathan, and R. Garnett, editors, Advances in Neural Information Processing Systems 30: Annual Conference on Neural Information Processing Systems 2017, 4-9 December 2017, Long Beach, CA, USA, pages 4449-4459, 2017.
[80] S. Sun, M. Kolar, and J. Xu. Learning structured densities via infinite dimensional exponential families. In C. Cortes, N. D. Lawrence, D. D. Lee, M. Sugiyama, and R. Garnett, editors, Advances in Neural Information Processing Systems 28, pages 2287-2295. Curran Associates, Inc., 2015.
[81] T. Sun and C.-H. Zhang. Sparse matrix inversion with scaled lasso. J. Mach. Learn. Res., 14:3385-3418, 2013. · Zbl 1318.62184
[82] J. E. Taylor, R. Lockhart, R. J. Tibshirani, and R. J. Tibshirani. Exact post-selection inference for forward stepwise and least angle regression. arXiv preprint arXiv:1401.3889, 2014.
[83] S. A. van de Geer. High-dimensional generalized linear models and the lasso. Ann. Stat., 36(2):614-645, 2008. · Zbl 1138.62323
[84] S. A. van de Geer, P. Bühlmann, Y. Ritov, and R. Dezeure. On asymptotically optimal confidence regions and tests for high-dimensional models. Ann. Stat., 42(3):1166-1202, 2014. · Zbl 1305.62259
[85] A. W. van der Vaart. Asymptotic statistics, volume 3 of Cambridge Series in Statistical and Probabilistic Mathematics. Cambridge University Press, Cambridge, 1998. · Zbl 0910.62001
[86] J. Wang and M. Kolar. Inference for sparse conditional precision matrices. arXiv preprint arXiv:1412.7638, 2014.
[87] J. Wang and M. Kolar. Inference for high-dimensional exponential family graphical models. In A. Gretton and C. C. Robert, editors, Proceedings of the 19th International Conference on Artificial Intelligence and Statistics, volume 51 of Proceedings of Machine Learning Research, pages 1042-1050, Cadiz, Spain, 2016. PMLR.
[88] L. A. Wasserman, M. Kolar, and A. Rinaldo. Berry-Esseen bounds for estimating undirected graphs. Electron. J. Stat., 8:1188-1224, 2014. · Zbl 1298.62089
[89] Y. Xia, T. Cai, and T. T. Cai. Testing differential networks with applications to the detection of gene-gene interactions. Biometrika, 102(2):247-266, 2015. · Zbl 1452.62392
[90] L. Xue and H. Zou. Regularized rank-based estimation of high-dimensional nonparanormal graphical models. Ann. Stat., 40(5):2541-2571, 2012. · Zbl 1373.62138
[91] L. Xue, H. Zou, and T. Cai. Nonconcave penalized composite conditional likelihood estimation of sparse Ising models. Ann. Stat., 40(3):1403-1429, 2012. · Zbl 1284.62451
[92] E. Yang, G. I. Allen, Z. Liu, and P. Ravikumar. Graphical models via generalized linear models. In F. Pereira, C. Burges, L. Bottou, and K. Weinberger, editors, Advances in Neural Information Processing Systems 25, pages 1358-1366. Curran Associates, Inc., 2012.
[93] E. Yang, Y. Baker, P. Ravikumar, G. I. Allen, and Z. Liu. Mixed graphical models via exponential families. In Proc. 17th Int. Conf. Artif. Intell. Stat., pages 1042-1050, 2014.
[94] E. Yang, P. Ravikumar, G. I. Allen, and Z. Liu. Graphical models via univariate exponential family distributions. J. Mach. Learn. Res., 16:3813-3847, 2015. · Zbl 1351.62111
[95] F. Yang, R. F. Barber, P. Jain, and J. Lafferty. Selective inference for group-sparse linear models. In Advances in Neural Information Processing Systems, pages 2469-2477, 2016.
[96] M. Yu, V. Gupta, and M. Kolar. Statistical inference for pairwise graphical models using score matching. In Advances in Neural Information Processing Systems 29. Curran Associates, Inc., 2016.
[97] M. Yu, V. Gupta, and M. Kolar. Constrained high dimensional statistical inference. arXiv preprint arXiv:1911.07319, 2020.
[98] S. Yu, M. Drton, and A. Shojaie. Graphical models for non-negative data using generalized score matching. In A. Storkey and F. Perez-Cruz, editors, Proceedings of the Twenty-First International Conference on Artificial Intelligence and Statistics, volume 84 of Proceedings of Machine Learning Research, pages 1781-1790, Playa Blanca, Lanzarote, Canary Islands, 2018. PMLR.
[99] S. Yu, M. Drton, and A. Shojaie. Generalized score matching for non-negative data. J. Mach. Learn. Res., 20: Paper No. 76, 70, 2019. · Zbl 1489.62082
[100] M. Yuan. High dimensional inverse covariance matrix estimation via linear programming. J. Mach. Learn. Res., 11:2261-2286, 2010. · Zbl 1242.62043
[101] M. Yuan and Y. Lin. Model selection and estimation in the Gaussian graphical model. Biometrika, 94(1):19-35, 2007. · Zbl 1142.62408
[102] C.-H. Zhang and S. S. Zhang. Confidence intervals for low dimensional parameters in high dimensional linear models. J. R. Stat. Soc. B, 76(1):217-242, 2013. · Zbl 1411.62196
[103] X. Zhang and G. Cheng. Simultaneous inference for high-dimensional linear models. J. Amer. Statist. Assoc., 112(518):757-768, 2017.
[104] B. Zhao, Y. S. Wang, and M. Kolar. Direct estimation of differential functional graphical models. In H. M. Wallach, H. Larochelle, A. Beygelzimer, F. d'Alché-Buc, E. B. Fox, and R. Garnett, editors, Advances in Neural Information Processing Systems 32: Annual Conference on Neural Information Processing Systems 2019, NeurIPS 2019, 8-14 December 2019, Vancouver, BC, Canada, pages 2571-2581, 2019.
[105] B. Zhao, Y. S. Wang, and M. Kolar. Fudge: Functional differential graph estimation with fully and discretely observed curves. arXiv preprint arXiv:2003.05402, 2020.
[106] T. Zhao, M. Kolar, and H. Liu. A general framework for robust testing and confidence regions in high-dimensional quantile regression. arXiv preprint arXiv:1412.8724, 2014.
[107] T. Zhao and H. Liu. Calibrated precision matrix estimation for high dimensional elliptical distributions. IEEE Trans. Inf. Theory, pages 1-1, 2014.
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases, these data have been complemented or enhanced with data from zbMATH Open. The list attempts to reflect the references in the original paper as accurately as possible without claiming completeness or perfect matching.