zbMATH — the first resource for mathematics

Gibbs sampling for a Bayesian hierarchical general linear model. (English) Zbl 1329.62336
Summary: We consider a Bayesian hierarchical version of the normal theory general linear model, which is practically relevant: it is general enough to cover many applications, yet it is not straightforward to sample directly from the corresponding posterior distribution. Thus we study a block Gibbs sampler that has the posterior as its invariant distribution. In particular, we establish that the Gibbs sampler converges at a geometric rate. This allows us to establish conditions for a central limit theorem for the ergodic averages used to estimate features of the posterior. Geometric ergodicity is also a key requirement for using batch means methods to consistently estimate the variance of the asymptotic normal distribution. Together, our results give practitioners the tools to be as confident in inferences based on the observations from the Gibbs sampler as they would be with inferences based on random samples from the posterior. Our theoretical results are illustrated with an application to data on the cost of health plans issued by health maintenance organizations.
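To illustrate the two ingredients the summary mentions, here is a minimal sketch (not the paper's actual model) of a two-block Gibbs sampler for a simple normal hierarchical model with known variances, together with a batch means standard error for the ergodic average. All model choices, variable names, and numerical settings below are hypothetical and chosen only for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: m groups with n observations each,
# y_ij ~ N(theta_i, sigma2), theta_i ~ N(mu, tau2), flat prior on mu.
m, n = 10, 25
sigma2, tau2 = 1.0, 2.0          # assumed known for this sketch
true_theta = rng.normal(0.0, np.sqrt(tau2), m)
y = true_theta[:, None] + rng.normal(0.0, np.sqrt(sigma2), (m, n))
ybar = y.mean(axis=1)

def block_gibbs(iters, mu0=0.0):
    """Two-block Gibbs: draw (theta_1, ..., theta_m) jointly, then mu."""
    mu = mu0
    mus = np.empty(iters)
    for t in range(iters):
        # theta_i | mu, y ~ N(post_mean_i, post_var)  (conjugate normal update)
        post_var = 1.0 / (n / sigma2 + 1.0 / tau2)
        post_mean = post_var * (n * ybar / sigma2 + mu / tau2)
        theta = rng.normal(post_mean, np.sqrt(post_var))
        # mu | theta ~ N(mean(theta), tau2 / m)  under the flat prior on mu
        mu = rng.normal(theta.mean(), np.sqrt(tau2 / m))
        mus[t] = mu
    return mus

def batch_means_se(x, n_batches=30):
    """Batch means standard error for the ergodic average of x."""
    b = len(x) // n_batches
    means = x[: b * n_batches].reshape(n_batches, b).mean(axis=1)
    var_hat = b * means.var(ddof=1)   # estimate of the asymptotic variance
    return np.sqrt(var_hat / len(x))

draws = block_gibbs(20_000)
est, se = draws.mean(), batch_means_se(draws)
```

The batch means estimate is only known to be consistent under conditions such as the geometric ergodicity established in the paper; without such a guarantee, the reported standard error `se` may be unreliable.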

62J12 Generalized linear models (logistic models)
62F15 Bayesian inference
62F12 Asymptotic properties of parametric estimators
60J20 Applications of Markov chains and discrete-time Markov processes on general state spaces (social mobility, learning theory, industrial processes, etc.)
BayesDA; WinBUGS
Full Text: DOI Euclid arXiv
[1] Bednorz, W. and Latuszynski, K. (2007). A few remarks on “Fixed-width output analysis for Markov chain Monte Carlo” by Jones et al., Journal of the American Statistical Association , 102 1485-1486. · doi:10.1198/016214507000000914
[2] Chan, K. S. and Geyer, C. J. (1994). Comment on “Markov chains for exploring posterior distributions”., The Annals of Statistics , 22 1747-1758. · Zbl 0829.62080 · doi:10.1214/aos/1176325750
[3] Flegal, J. M., Haran, M. and Jones, G. L. (2008). Markov chain Monte Carlo: Can we trust the third significant figure?, Statistical Science , 23 250-260. · Zbl 1327.62017 · doi:10.1214/08-STS257
[4] Flegal, J. M. and Jones, G. L. (2010). Batch means and spectral variance estimators in Markov chain Monte Carlo., The Annals of Statistics , 38 1034-1070. · Zbl 1184.62161 · doi:10.1214/09-AOS735
[5] Gelman, A., Carlin, J. B., Stern, H. S. and Rubin, D. B. (2004)., Bayesian Data Analysis, Second edition . Chapman & Hall/CRC. · Zbl 1039.62018
[6] Glynn, P. W. and Whitt, W. (1992). The asymptotic validity of sequential stopping rules for stochastic simulations., The Annals of Applied Probability , 2 180-198. · Zbl 0792.68200 · doi:10.1214/aoap/1177005777
[7] Henderson, H. V. and Searle, S. R. (1981). On deriving the inverse of a sum of matrices., SIAM Review , 23 53-60. · Zbl 0451.15005 · doi:10.1137/1023004
[8] Hobert, J. P. and Geyer, C. J. (1998). Geometric ergodicity of Gibbs and block Gibbs samplers for a hierarchical random effects model., Journal of Multivariate Analysis , 67 414-430. · Zbl 0922.60069 · doi:10.1006/jmva.1998.1778
[9] Hobert, J. P., Jones, G. L., Presnell, B. and Rosenthal, J. S. (2002). On the applicability of regenerative simulation in Markov chain Monte Carlo., Biometrika , 89 731-743. · Zbl 1035.60080 · doi:10.1093/biomet/89.4.731
[10] Hobert, J. P., Jones, G. L. and Robert, C. P. (2006). Using a Markov chain to construct a tractable approximation of an intractable probability distribution., Scandinavian Journal of Statistics , 33 37-51. · Zbl 1122.65013 · doi:10.1111/j.1467-9469.2006.00467.x
[11] Hodges, J. S. (1998). Some algebra and geometry for hierarchical models, applied to diagnostics., Journal of the Royal Statistical Society, Series B , 60 497-536. · Zbl 0909.62072 · doi:10.1111/1467-9868.00137
[12] Jones, G. L., Haran, M., Caffo, B. S. and Neath, R. (2006). Fixed-width output analysis for Markov chain Monte Carlo., Journal of the American Statistical Association , 101 1537-1547. · Zbl 1171.62316 · doi:10.1198/016214506000000492
[13] Jones, G. L. and Hobert, J. P. (2001). Honest exploration of intractable probability distributions via Markov chain Monte Carlo., Statistical Science , 16 312-334. · Zbl 1127.60309 · doi:10.1214/ss/1015346317
[14] Jones, G. L. and Hobert, J. P. (2004). Sufficient burn-in for Gibbs samplers for a hierarchical random effects model., The Annals of Statistics , 32 784-817. · Zbl 1048.62069 · doi:10.1214/009053604000000184
[15] Meyn, S. P. and Tweedie, R. L. (1993)., Markov Chains and Stochastic Stability . Springer, London. · Zbl 0925.60001
[16] Mykland, P., Tierney, L. and Yu, B. (1995). Regeneration in Markov chain samplers., Journal of the American Statistical Association , 90 233-241. · Zbl 0819.62082 · doi:10.2307/2291148
[17] Papaspiliopoulos, O. and Roberts, G. (2008). Stability of the Gibbs sampler for Bayesian hierarchical models., The Annals of Statistics , 36 95-117. · Zbl 1144.65007 · doi:10.1214/009053607000000749 · euclid:aos/1201877295
[18] Roberts, G. O. and Rosenthal, J. S. (2001). Markov chains and de-initializing processes., Scandinavian Journal of Statistics , 28 489-504. · Zbl 0985.60067 · doi:10.1111/1467-9469.00250
[19] Rosenthal, J. S. (1995). Rates of convergence for Gibbs sampling for variance component models., The Annals of Statistics , 23 740-761. · Zbl 0841.62074 · doi:10.1214/aos/1176324619
[20] Spiegelhalter, D., Thomas, A., Best, N. and Lunn, D. (2005). WinBUGS version 2.10. Tech. rep., MRC Biostatistics Unit, Cambridge, UK.
[21] Tan, A. and Hobert, J. P. (2009). Block Gibbs sampling for Bayesian random effects models with improper priors: convergence and regeneration., Journal of Computational and Graphical Statistics , 18 861-878. · doi:10.1198/jcgs.2009.08153