
Augmentation schemes for particle MCMC. (English) Zbl 1356.65025

Summary: Particle MCMC embeds a particle filter within an MCMC algorithm. For inference in a model involving an unobserved stochastic process, the standard implementation uses the particle filter to propose new values for the stochastic process, and MCMC moves to propose new values for the parameters. We show how particle MCMC can be generalised beyond this. Our key idea is to introduce new latent variables. We then use MCMC moves to update the latent variables, and the particle filter to propose new values for the parameters and the stochastic process given the latent variables. A generic way of defining these latent variables is to model them as pseudo-observations of the parameters or of the stochastic process. By choosing how much information these latent variables carry about the parameters and the stochastic process, we can often improve the mixing of the particle MCMC algorithm by trading off the Monte Carlo error of the particle filter against the mixing of the MCMC moves. We show that using pseudo-observations within particle MCMC can improve its efficiency in certain scenarios: dealing with initialisation problems of the particle filter; speeding up the mixing of particle Gibbs when there is strong dependence between the parameters and the stochastic process; and enabling further MCMC steps to be used within the particle filter.
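The alternation described above can be illustrated with a toy sketch: a pseudo-marginal sampler on a one-dimensional linear-Gaussian state-space model, in which the latent variable `u` is a Gaussian pseudo-observation of the parameter `theta` and the scale `tau` controls how informative `u` is. This is a minimal illustration of the general idea under assumed model choices (flat prior on `theta`, Gaussian pseudo-observation kernel, bootstrap filter), not the paper's exact algorithm; all function names are hypothetical.

```python
import numpy as np

# Toy model (an assumption for illustration):
#   x_t = theta * x_{t-1} + N(0, 1),   y_t = x_t + N(0, 1),
# with a pseudo-observation of the parameter:  u | theta ~ N(theta, tau^2).
# Larger tau -> less informative u -> the particle-filter block proposes
# theta more freely; smaller tau -> behaviour closer to a fixed-theta filter.

rng = np.random.default_rng(1)

def simulate(theta, T):
    """Simulate observations y_1..y_T from the toy state-space model."""
    x = np.zeros(T)
    y = np.zeros(T)
    for t in range(T):
        x[t] = theta * (x[t - 1] if t else 0.0) + rng.normal()
        y[t] = x[t] + rng.normal()
    return y

def log_lik_pf(theta, y, N=200):
    """Bootstrap particle filter estimate of log p(y | theta)."""
    parts = rng.normal(size=N)                 # x_0 ~ N(0, 1) (assumed prior)
    ll = 0.0
    for yt in y:
        parts = theta * parts + rng.normal(size=N)      # propagate
        logw = -0.5 * (yt - parts) ** 2                 # Gaussian observation density
        m = logw.max()
        w = np.exp(logw - m)
        ll += m + np.log(w.mean())                      # running log-likelihood estimate
        parts = parts[rng.choice(N, N, p=w / w.sum())]  # multinomial resampling
    return ll

def pmcmc_with_pseudo_obs(y, tau=0.3, iters=200):
    """Alternate an MCMC move on the pseudo-observation u with a
    particle-filter block that proposes theta guided by u."""
    theta = 0.5
    ll = log_lik_pf(theta, y)
    out = []
    for _ in range(iters):
        # MCMC move: refresh the pseudo-observation given the current theta.
        u = theta + tau * rng.normal()
        # Particle-filter block: propose theta from its conditional given u
        # and accept with a pseudo-marginal ratio. The Gaussian proposal
        # q(theta' | u) and pseudo-observation density p(u | theta) cancel
        # by symmetry, leaving only the estimated likelihood ratio.
        theta_new = u + tau * rng.normal()
        ll_new = log_lik_pf(theta_new, y)
        if np.log(rng.uniform()) < ll_new - ll:
            theta, ll = theta_new, ll_new
        out.append(theta)
    return np.array(out)
```

Tuning `tau` here plays the role described in the summary: it trades off the Monte Carlo error of the particle filter against the mixing of the MCMC moves on the latent variable.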

MSC:

65C35 Stochastic particle methods
65C05 Monte Carlo methods
65C20 Probabilistic models, generic numerical methods in probability and statistics
62M09 Non-Markovian processes: estimation
