
Central limit theorem for sequential Monte Carlo methods and its application to Bayesian inference. (English) Zbl 1079.65006

A particle system is a collection \((\vartheta^{(j,H)},w^{(j,H)})_{j\leq H}\), where \(\vartheta^{(j,H)}\in\Theta\) are “particles” and \(w^{(j,H)}>0\) are their “weights”. The system targets a distribution \(\pi\) on \(\Theta\) if, for any measurable \(\varphi\) with \(| \mathbf{E}_\pi (\varphi)| <\infty\), \[ \hat E_H(\varphi)= \frac{ \sum_{j=1}^H w^{(j,H)}\varphi(\vartheta^{(j,H)}) }{ \sum_{j=1}^H w^{(j,H)} } \to \mathbf{E}_\pi (\varphi). \] A sequential Monte Carlo algorithm (a particle filter) recursively produces, via a mutation-correction-resampling scheme, a sequence of particle systems that target a sequence of distributions \(\pi_t\) on spaces \(\Theta_t\). In Bayesian estimation problems, \(\Theta_t=\Theta\) is the parameter space and \(\pi_t\) is the posterior distribution of the parameter \(\vartheta\) given a sample of size \(t\). In state-space filtering or smoothing, \(\Theta_t\) is the space of state trajectories and \(\pi_t\) is the conditional distribution of the trajectory given the data.
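
To make the mutation-correction-resampling recursion and the self-normalized estimator \(\hat E_H(\varphi)\) concrete, here is a minimal Python sketch for a toy static Bayesian problem. The Gaussian location model, the effective-sample-size trigger, the multinomial resampling step and the random-walk Metropolis mutation kernel are all choices made for this illustration, not constructions taken from the paper.

    # Minimal sketch (not from the paper) of the mutation-correction-resampling
    # recursion for a static Bayesian target: toy model y_t ~ N(theta, 1) with
    # prior theta ~ N(0, 10).  All modelling choices below are assumptions made
    # for this illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    data = rng.normal(1.0, 1.0, size=50)          # simulated observations
    H = 1000                                      # number of particles

    def loglik(th, y):
        return -0.5 * (y - th) ** 2               # Gaussian log-likelihood, up to a constant

    def logpost(th, ys):                          # log pi_t(theta), up to a constant
        return -th ** 2 / 20.0 + loglik(th[:, None], ys).sum(axis=1)

    theta = rng.normal(0.0, np.sqrt(10.0), H)     # initial particles drawn from the prior
    logw = np.zeros(H)                            # log-weights

    for t in range(len(data)):
        # Correction: multiply each weight by the incremental likelihood p(y_t | theta).
        logw += loglik(theta, data[t])
        w = np.exp(logw - logw.max()); w /= w.sum()

        # Resampling and mutation when the effective sample size degenerates.
        if 1.0 / np.sum(w ** 2) < H / 2:
            theta = theta[rng.choice(H, size=H, p=w)]         # multinomial resampling
            logw = np.zeros(H); w = np.full(H, 1.0 / H)
            # Mutation: one random-walk Metropolis step leaving pi_t invariant.
            prop = theta + 0.3 * rng.normal(size=H)
            accept = (np.log(rng.uniform(size=H))
                      < logpost(prop, data[:t + 1]) - logpost(theta, data[:t + 1]))
            theta = np.where(accept, prop, theta)

    # Self-normalized estimate  \hat E_H(phi)  for  phi(theta) = theta.
    print(np.sum(w * theta))
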
The author obtains conditions under which a central limit theorem of the form \(\sqrt{H}(\hat E_H(\varphi)-\mathbf{E}_{\pi_t} (\varphi)) \Rightarrow N(0,V_t(\varphi))\) holds, where the asymptotic variance \(V_t(\varphi)\) is described by recursive formulae. These conditions hold for many sequential Monte Carlo algorithms, including the resample-move algorithm and the residual resampling scheme. The asymptotics of \(V_t(\varphi)\) as \(t\to\infty\) are investigated for Bayesian problems.
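
The recursive formulae for \(V_t(\varphi)\) are not reproduced in this review, but the \(\sqrt{H}\) scaling can be checked numerically in the simplest special case (one correction step, no resampling), i.e. self-normalized importance sampling, where the asymptotic variance has the classical form \(\mathbf{E}_q[w(\vartheta)^2(\varphi(\vartheta)-\mathbf{E}_\pi\varphi)^2]\) with \(w=d\pi/dq\). The Gaussian target and proposal in the sketch below are arbitrary choices for the illustration, and this variance is the textbook single-step expression, not the paper's recursive \(V_t\).

    # Numerical check (not the paper's recursive formula) of the sqrt(H) scaling
    # in the simplest special case: one correction step and no resampling, i.e.
    # self-normalized importance sampling.  Target pi = N(1, 1), proposal
    # q = N(0, 4), phi(theta) = theta; all arbitrary choices for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    mu_pi, s_pi = 1.0, 1.0                        # target   pi = N(1, 1)
    mu_q,  s_q  = 0.0, 2.0                        # proposal q  = N(0, 4)
    phi = lambda x: x                             # test function, E_pi(phi) = 1

    def log_w(x):                                 # log importance weight log(d pi / d q)
        return (-0.5 * ((x - mu_pi) / s_pi) ** 2 - np.log(s_pi)
                + 0.5 * ((x - mu_q) / s_q) ** 2 + np.log(s_q))

    H, M = 500, 4000                              # particles per run, number of runs
    est = np.empty(M)
    for m in range(M):
        x = rng.normal(mu_q, s_q, H)
        w = np.exp(log_w(x))
        est[m] = np.sum(w * phi(x)) / np.sum(w)   # self-normalized estimate

    # Empirical variance of sqrt(H) * (hat E_H - E_pi(phi)) over the M runs ...
    print("empirical :", H * np.mean((est - mu_pi) ** 2))

    # ... against the classical asymptotic variance E_q[w^2 (phi - E_pi(phi))^2],
    # itself estimated from one large sample drawn from q.
    x = rng.normal(mu_q, s_q, 10 ** 6)
    w = np.exp(log_w(x))
    print("asymptotic:", np.mean(w ** 2 * (phi(x) - mu_pi) ** 2))
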

MSC:

65C05 Monte Carlo methods
62F15 Bayesian inference
60F05 Central limit and other weak theorems
62L10 Sequential statistical analysis
