
Outperforming the Gibbs sampler empirical estimator for nearest-neighbor random fields. (English) Zbl 0871.62083

Summary: Given a Markov chain sampling scheme, does the standard empirical estimator make best use of the data? We show that this is not so and construct better estimators. We restrict attention to nearest-neighbor random fields and to Gibbs samplers with deterministic sweep, but our approach applies to any sampler that uses reversible variable-at-a-time updating with deterministic sweep. The structure of the transition distribution of the sampler is exploited to construct further empirical estimators that are combined with the standard empirical estimator to reduce asymptotic variance. The extra computational cost is negligible. When the random field is spatially homogeneous, symmetrizations of our estimator lead to further variance reduction. The performance of the estimators is evaluated in a simulation study of the Ising model.
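The idea of exploiting the sampler's transition structure can be illustrated with a minimal sketch. This is not the paper's estimator, only the classical conditional-mean (Rao-Blackwell-style) variant it generalizes: at each site update of a deterministic-sweep Gibbs sampler for the Ising model, the conditional expectation of the spin given its neighbors, tanh(βΣ neighbors), is available at no extra cost and can replace the raw spin in the empirical average. The lattice size, inverse temperature, and sweep counts below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
L, beta = 8, 0.3            # 8x8 lattice, high-temperature regime (illustrative)
n_sweeps, burn_in = 2000, 200

spins = rng.choice([-1, 1], size=(L, L))

def neighbor_sum(s, i, j):
    # Sum of the four nearest neighbors with periodic boundary conditions.
    return (s[(i - 1) % L, j] + s[(i + 1) % L, j]
            + s[i, (j - 1) % L] + s[i, (j + 1) % L])

standard, conditional = [], []
for sweep in range(n_sweeps):
    cond_sum = 0.0
    for i in range(L):                  # deterministic raster-scan sweep
        for j in range(L):
            h = beta * neighbor_sum(spins, i, j)
            p_up = 1.0 / (1.0 + np.exp(-2.0 * h))   # P(s_ij = +1 | neighbors)
            spins[i, j] = 1 if rng.random() < p_up else -1
            cond_sum += np.tanh(h)                   # E[s_ij | neighbors]
    if sweep >= burn_in:
        standard.append(spins.mean())                # standard empirical estimator
        conditional.append(cond_sum / (L * L))       # conditional-mean estimator

est_standard = float(np.mean(standard))
est_conditional = float(np.mean(conditional))
print(est_standard, est_conditional)
```

Both quantities estimate the mean spin (zero by symmetry at this temperature); the conditional-mean average typically fluctuates less from run to run because each term integrates out the single-site noise analytically. The paper's construction goes further, combining several such estimators to provably reduce asymptotic variance.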

MSC:

62M40 Random fields; image analysis
60J05 Discrete-time Markov processes on general state spaces
65C99 Probabilistic methods, stochastic differential equations
62G20 Asymptotic properties of nonparametric inference
62M05 Markov processes: estimation; hidden Markov models

References:

[1] Amit, Y. and Grenander, U. (1991). Comparing sweep strategies for stochastic relaxation. J. Multivariate Anal. 37 197-222. · Zbl 0735.60035 · doi:10.1016/0047-259X(91)90080-L
[2] Athreya, K. B., Doss, H. and Sethuraman, J. (1996). On the convergence of the Markov chain simulation method. Ann. Statist. 24 69-100. · Zbl 0860.60057 · doi:10.1214/aos/1033066200
[3] Besag, J. and Green, P. J. (1993). Spatial statistics and Bayesian inference. J. Roy. Statist. Soc. Ser. B 55 25-37. JSTOR: · Zbl 0800.62572
[4] Bickel, P. J., Klaassen, C. A. J., Ritov, Y. and Wellner, J. A. (1993). Efficient and Adaptive Estimation for Semiparametric Models. Johns Hopkins Univ. Press. · Zbl 0786.62001
[5] Binder, K., ed. (1992). The Monte Carlo Method in Condensed Matter Physics. Springer, Berlin.
[6] Brillinger, D. R. (1963). A note on re-use of samples. Ann. Math. Statist. 34 341-343. · Zbl 0111.15804 · doi:10.1214/aoms/1177704276
[7] Chan, K. S. (1993). Asymptotic behavior of the Gibbs sampler. J. Amer. Statist. Assoc. 88 320-326. JSTOR: · Zbl 0779.62024 · doi:10.2307/2290727
[8] Chan, K. S. and Geyer, C. J. (1994). Comment on "Markov chains for exploring posterior distributions," by L. Tierney. Ann. Statist. 22 1747-1758. · Zbl 0829.62080 · doi:10.1214/aos/1176325750
[9] Frigessi, A., Hwang, C.-R., Sheu, S. J. and Di Stefano, P. (1993). Convergence rates of the Gibbs sampler, the Metropolis algorithm, and other single-site updating dynamics. J. Roy. Statist. Soc. Ser. B 55 205-220. JSTOR: · Zbl 0781.60039
[10] Frigessi, A., Hwang, C.-R. and Younes, L. (1992). Optimal spectral structure of reversible stochastic matrices, Monte Carlo methods and the simulation of Markov random fields. Ann. Appl. Probab. 2 610-628. · Zbl 0756.60057 · doi:10.1214/aoap/1177005652
[11] Gelfand, A. E., Hills, S. E., Racine-Poon, A. and Smith, A. F. M. (1990). Illustration of Bayesian inference in normal data models using Gibbs sampling. J. Amer. Statist. Assoc. 85 972-985.
[12] Gelfand, A. E. and Smith, A. F. M. (1990). Sampling-based approaches to calculating marginal densities. J. Amer. Statist. Assoc. 85 398-409. JSTOR: · Zbl 0702.62020 · doi:10.2307/2289776
[13] Geman, S. and Geman, D. (1984). Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images. IEEE Trans. Pattern Anal. Mach. Intell. 6 721-741. · Zbl 0573.62030 · doi:10.1109/TPAMI.1984.4767596
[14] George, E. I. and Robert, C. P. (1992). Capture-recapture estimation via Gibbs sampling. Biometrika 79 677-683. JSTOR: · Zbl 0764.62028
[15] Geweke, J. (1992). Evaluating the accuracy of sampling-based approaches to the calculation of posterior moments (with discussion). In Bayesian Statistics (J. M. Bernardo, J. O. Berger, A. P. Dawid and A. F. M. Smith, eds.) 4 169-193. Oxford Univ. Press.
[16] Geyer, C. J. (1992). Practical Markov chain Monte Carlo. Statist. Sci. 7 473-483.
[17] Graham, J. (1994). Monte Carlo Markov chain likelihood ratio test and Wald test for binary spatial lattice data.
[18] Green, P. J. and Han, X.-L. (1992). Metropolis methods, Gaussian proposals and antithetic variables. In Stochastic Models, Statistical Methods, and Algorithms in Image Analysis. Lecture Notes in Statist. (P. Barone, A. Frigessi and M. Piccioni, eds.) 74 142-164. Springer, Berlin.
[19] Greenwood, P. E., McKeague, I. W. and Wefelmeyer, W. (1995). Information bounds for Gibbs samplers. Unpublished manuscript.
[20] Grenander, U. (1983). Tutorial in pattern theory. Lecture Notes, Division Appl. Math., Brown Univ. · Zbl 0542.68072
[21] Grenander, U. (1993). General Pattern Theory. A Mathematical Study of Regular Structures. Clarendon, Oxford. · Zbl 0827.68098
[22] Guerra, F., Rosen, L. and Simon, B. (1975). The P(φ)₂ Euclidean quantum field theory as classical statistical mechanics. Ann. Math. 101 111-259. JSTOR: · doi:10.2307/1970988
[23] Heermann, D. W. and Burkitt, A. N. (1992). Parallel algorithms for statistical physics problems. In The Monte Carlo Method in Condensed Matter Physics (K. Binder, ed.) 53-74. Springer, Berlin.
[24] Ingrassia, S. (1994). On the rate of convergence of the Metropolis algorithm and Gibbs sampler by geometric bounds. Ann. Appl. Probab. 4 347-389. · Zbl 0802.60061 · doi:10.1214/aoap/1177005064
[25] Israel, R. B. (1979). Convexity in the Theory of Lattice Gases. Princeton Univ. Press. · Zbl 0399.46055
[26] Kindermann, R. and Snell, J. L. (1980). Markov Random Fields and Their Applications. Amer. Math. Soc., Providence. · Zbl 1229.60003
[27] Levit, B. Ya. (1974). On optimality of some statistical estimates. In Proceedings of the Prague Symposium on Asymptotic Statistics (J. Hájek, ed.) 2 215-238. Charles Univ., Prague. · Zbl 0351.62023
[28] Meyn, S. P. and Tweedie, R. L. (1993). Markov Chains and Stochastic Stability. Springer, London. · Zbl 0925.60001
[29] Pearl, J. (1987). Evidential reasoning using stochastic simulation. Artificial Intelligence 32 245-257. · Zbl 0642.68177 · doi:10.1016/0004-3702(87)90012-9
[30] Penev, S. (1991). Efficient estimation of the stationary distribution for exponentially ergodic Markov chains. J. Statist. Plann. Inference 27 105-123. · Zbl 0727.62079 · doi:10.1016/0378-3758(91)90085-S
[31] Peskun, P. H. (1973). Optimum Monte Carlo sampling using Markov chains. Biometrika 60 607-612. JSTOR: · Zbl 0271.62041 · doi:10.1093/biomet/60.3.607
[32] Schervish, M. J. and Carlin, B. P. (1992). On the convergence of successive substitution sampling. J. Comput. Graph. Statist. 1 111-127. JSTOR: · doi:10.2307/1390836
[33] Simon, B. (1974). The P(φ)₂ Euclidean (Quantum) Field Theory. Princeton Univ. Press. · Zbl 1175.81146
[34] Smith, A. F. M. and Roberts, G. O. (1993). Bayesian computation via the Gibbs sampler and related Markov chain Monte Carlo methods. J. Roy. Statist. Soc. Ser. B 55 3-23. JSTOR: · Zbl 0779.62030
[35] Spiegelhalter, D. J., Dawid, A. P., Lauritzen, S. L. and Cowell, R. G. (1993). Bayesian analysis in expert systems. Statist. Sci. 8 219-283. · Zbl 0955.62523 · doi:10.1214/ss/1177010888
[36] Swendsen, R. H. and Wang, J.-S. (1987). Nonuniversal critical dynamics in Monte Carlo simulations. Phys. Rev. Lett. 58 86-88.
[37] Tanner, M. A. and Wong, W. H. (1987). The calculation of posterior distributions by data augmentation. J. Amer. Statist. Assoc. 82 528-540. JSTOR: · Zbl 0619.62029 · doi:10.2307/2289457
[38] Tierney, L. (1994). Markov chains for exploring posterior distributions (with discussion). Ann. Statist. 22 1701-1762. · Zbl 0829.62080 · doi:10.1214/aos/1176325750
[39] Winkler, G. (1995). Image Analysis, Random Fields and Dynamic Monte Carlo Methods. Springer, Berlin. · Zbl 0821.68125