Consistency of adaptive importance sampling and recycling schemes. (English) Zbl 1466.62157

Summary: Among Monte Carlo techniques, importance sampling requires fine tuning of a proposal distribution, which is now commonly addressed through iterative schemes. Sequential adaptive algorithms have been proposed to calibrate the sampling distribution. J.-M. Cornuet et al. [Scand. J. Stat. 39, No. 4, 798–812 (2012; Zbl 1319.62059)] achieve a significant improvement in stability and effective sample size by introducing a recycling procedure. However, the consistency of such algorithms has rarely been tackled because of their complexity. Moreover, the recycling strategy of the AMIS estimator adds another difficulty, and its consistency remains largely open. In this work, we prove the convergence of sequential adaptive sampling, with finite Monte Carlo sample size at each iteration, and the consistency of recycling procedures. In contrast to R. Douc et al. [Ann. Stat. 35, No. 1, 420–448 (2007; Zbl 1132.60022)], the results here are obtained in the asymptotic regime where the number of iterations goes to infinity while the number of draws per iteration follows a fixed but growing sequence of integers. Hence, some of the results shed new light on adaptive population Monte Carlo algorithms in this regime and give advice on how the sample sizes should be chosen.
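The scheme studied here can be illustrated with a minimal sketch of adaptive importance sampling with AMIS-style recycling: at each iteration, samples are drawn from the current proposal, all past samples are reweighted against the deterministic mixture of all proposals used so far (the recycling step), and the proposal is re-calibrated by weighted moment matching. The target, proposal family, sample sizes, and function names below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

def normal_pdf(x, mu, sigma):
    """Density of N(mu, sigma^2), used for both target and proposals."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def amis_mean_estimate(target_pdf, n_iters=20, n_per_iter=500, seed=0):
    """Toy AMIS-style sampler: Gaussian proposals adapted by moment matching,
    with all past draws recycled via deterministic-mixture weights."""
    rng = np.random.default_rng(seed)
    mus, sigmas = [0.0], [5.0]   # over-dispersed initial proposal (assumed)
    samples = []
    mu_new = mus[0]
    for _ in range(n_iters):
        # draw a finite Monte Carlo sample from the current proposal
        samples.append(rng.normal(mus[-1], sigmas[-1], size=n_per_iter))
        all_x = np.concatenate(samples)
        # recycling: weight every past draw against the equal mixture
        # of all proposals used so far (balance heuristic)
        mix = np.mean([normal_pdf(all_x, m, s)
                       for m, s in zip(mus, sigmas)], axis=0)
        w = target_pdf(all_x) / mix
        w /= w.sum()
        # adapt the proposal by weighted moment matching
        mu_new = np.sum(w * all_x)
        sigma_new = np.sqrt(np.sum(w * (all_x - mu_new) ** 2))
        mus.append(mu_new)
        sigmas.append(max(sigma_new, 1e-3))
    return mu_new

# estimate the mean of an assumed N(2, 1) target
est = amis_mean_estimate(lambda x: normal_pdf(x, 2.0, 1.0))
```

The paper's asymptotic regime corresponds to letting `n_iters` grow while `n_per_iter` follows a fixed, growing sequence; the sketch above fixes both for simplicity.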


62-08 Computational methods for problems pertaining to statistics
60J22 Computational methods in Markov chains
62F12 Asymptotic properties of parametric estimators
65C05 Monte Carlo methods
Full Text: DOI Euclid


[1] Andrieu, C., Doucet, A. and Holenstein, R. (2010). Particle Markov chain Monte Carlo methods. J. R. Stat. Soc. Ser. B. Stat. Methodol.72 269-342. · Zbl 1411.65020
[2] Billingsley, P. (1995). Probability and Measure, 3rd ed. Wiley Series in Probability and Mathematical Statistics. New York: Wiley. · Zbl 0822.60002
[3] Bugallo, M.F., Martino, L. and Corander, J. (2015). Adaptive importance sampling in signal processing. Digit. Signal Process.47 36-49.
[4] Cameron, E. and Pettitt, A. (2014). Recursive pathways to marginal likelihood estimation with prior-sensitivity analysis. Statist. Sci.29 397-419. · Zbl 1331.62128
[5] Cappé, O., Guillin, A., Marin, J.M. and Robert, C.P. (2004). Population Monte Carlo. J. Comput. Graph. Statist.13 907-929.
[6] Cappé, O., Guillin, A., Marin, J.-M. and Robert, C.P. (2008). Adaptive importance sampling in general mixture classes. Stat. Comput.18 587-600.
[7] Cornuet, J.-M., Marin, J.-M., Mira, A. and Robert, C.P. (2012). Adaptive multiple importance sampling. Scand. J. Stat.39 798-812. · Zbl 1319.62059
[8] Del Moral, P., Doucet, A. and Jasra, A. (2006). Sequential Monte Carlo samplers. J. R. Stat. Soc. Ser. B. Stat. Methodol.68 411-436. · Zbl 1105.62034
[9] Douc, R., Guillin, A., Marin, J.-M. and Robert, C.P. (2007). Convergence of adaptive mixtures of importance sampling schemes. Ann. Statist.35 420-448. · Zbl 1132.60022
[10] Douc, R., Guillin, A., Marin, J.-M. and Robert, C.P. (2007). Minimum variance importance sampling via population Monte Carlo. ESAIM Probab. Stat.11 427-447. · Zbl 1181.60028
[11] Feroz, F., Hobson, M., Cameron, E. and Pettitt, A. (2013). Importance nested sampling and the MultiNest algorithm. Preprint. Available at arXiv:1306.2144.
[12] Forbes, F. and Fort, G. (2007). Combining Monte Carlo and mean-field-like methods for inference in hidden Markov random fields. IEEE Trans. Image Process.16 824-837.
[13] He, H.Y. and Owen, A.B. (2014). Optimal mixture weights in multiple importance sampling. Preprint. Available at arXiv:1411.3954.
[14] Hesterberg, T. (1995). Weighted average importance sampling and defensive mixture distributions. Technometrics37 185-194. · Zbl 0822.62002
[15] Liu, J.S. (2008). Monte Carlo Strategies in Scientific Computing. Springer Series in Statistics. New York: Springer. · Zbl 1132.65003
[16] Martino, L., Elvira, V., Luengo, D. and Corander, J. (2015). An adaptive population importance sampler: Learning from uncertainty. IEEE Trans. Signal Process.63 4422-4437. · Zbl 1394.94827
[17] Martino, L., Elvira, V., Luengo, D. and Corander, J. (2017). Layered adaptive importance sampling. Stat. Comput.27 599-623. · Zbl 1505.62276
[18] McLachlan, G.J. and Krishnan, T. (2007). The EM Algorithm and Extensions. New York: Wiley. · Zbl 1165.62019
[19] Owen, A. and Zhou, Y. (2000). Safe and effective importance sampling. J. Amer. Statist. Assoc.95 135-143. · Zbl 0998.65003
[20] Ripley, B.D. (1987). Stochastic Simulation. Wiley Series in Probability and Mathematical Statistics: Applied Probability and Statistics. New York: Wiley. · Zbl 0613.65006
[21] Robert, C.P. and Casella, G. (2004). Monte Carlo Statistical Methods, 2nd ed. Springer Texts in Statistics. New York: Springer. · Zbl 1096.62003
[22] Schuster, I. (2015). Gradient importance sampling. Preprint. Available at arXiv:1507.05781.
[23] Schuster, I. (2015). Consistency of importance sampling estimates based on dependent sample sets and an application to models with factorizing likelihoods. Preprint. Available at arXiv:1503.00357.
[24] Sirén, J., Marttinen, P. and Corander, J. (2010). Reconstructing population histories from single-nucleotide polymorphism data. Mol. Biol. Evol.28 673-683.
[25] Šmídl, V. and Hofman, R. (2014). Efficient sequential Monte Carlo sampling for continuous monitoring of a radiation situation. Technometrics56 514-528.
[26] Van der Vaart, A.W. (2000). Asymptotic Statistics. Cambridge: Cambridge University Press. · Zbl 0910.62001
[27] Veach, E. and Guibas, L.J. (1995). Optimally combining sampling techniques for Monte Carlo rendering. In SIGGRAPH'95 Proceedings 419-428. Addison-Wesley.
[28] Xiong, X., Šmídl, V. and Filippone, M. (2017). Adaptive multiple importance sampling for Gaussian processes. J. Stat. Comput. Simul.87 1644-1665. · Zbl 07192021