Abrahamsen, Tavis; Hobert, James P. Convergence analysis of block Gibbs samplers for Bayesian linear mixed models with \(p>N\). (English) Zbl 1368.62061
Bernoulli 23, No. 1, 459-478 (2017).

This paper studies Markov chain Monte Carlo sampling for Bayesian inference in the general linear mixed model \(Y=X\beta + \sum_{i=1}^{r}Z_{i}u_{i} + e\), where \(Y\) is an observable data vector, \(X\) is the design matrix, \(\{Z_{i}\}_{i=1}^{r}\) are known matrices, \(\beta\) is an unknown vector of regression coefficients, \(u=(u_{1}^{T}\;u_{2}^{T}\; \cdots\; u_{r}^{T})^{T}\) is a vector of Gaussian random effects, and \(e\) is a Gaussian noise vector. As the main proposition of the paper, the authors establish sufficient conditions for geometric ergodicity of the block Gibbs sampler that alternates between \(\theta=(\beta^{T}, u^{T})^{T}\) and the precision parameters of \(e\) and \(u\). Two corollaries of this proposition establish convergence of the block Gibbs sampler under a proper prior and under an improper prior, respectively. The corollaries also dispense with rank conditions on the design matrix \(X\) that were required in earlier convergence analyses.

Reviewer: Kazuho Watanabe (Toyohashi)

Cited in 2 Documents

MSC:
62F15 Bayesian inference
60J22 Computational methods in Markov chains
62J05 Linear regression; mixed models

Keywords: conditionally conjugate prior; convergence rate; geometric ergodicity; improper prior
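The two-block scheme described above (a joint multivariate-normal draw of \(\theta=(\beta^{T}, u^{T})^{T}\), followed by conditionally conjugate gamma draws for the precisions of \(e\) and \(u\)) can be sketched as follows for the case \(r=1\). This is a minimal illustrative sketch, not the authors' exact construction: the Gamma(shape, rate) priors on the precisions, the proper \(N(0,\tau_\beta^{-1}I)\) prior on \(\beta\) (kept proper here so the \(\theta\)-block precision matrix remains invertible when \(p>N\)), and all parameter names are assumptions for the example.

```python
import numpy as np

def block_gibbs_lmm(y, X, Z, n_iter=2000, a_e=2.0, b_e=1.0,
                    a_u=2.0, b_u=1.0, tau_beta=1e-2, seed=0):
    """Two-block Gibbs sampler sketch for y = X beta + Z u + e (r = 1).

    Illustrative assumptions: beta ~ N(0, tau_beta^{-1} I) (proper, so the
    theta-block precision is invertible even when p > N), u ~ N(0, lam_u^{-1} I),
    and Gamma(shape a, rate b) priors on the precisions lam_e, lam_u.
    """
    rng = np.random.default_rng(seed)
    N = y.shape[0]
    p, q = X.shape[1], Z.shape[1]
    W = np.hstack([X, Z])                    # combined design for theta = (beta, u)
    lam_e, lam_u = 1.0, 1.0                  # initial precisions
    draws = np.empty((n_iter, p + q))
    for t in range(n_iter):
        # Block 1: theta | precisions, y  ~  N(m, Q^{-1}) with
        # Q = lam_e W'W + diag(tau_beta I_p, lam_u I_q),  m = lam_e Q^{-1} W'y.
        d = np.concatenate([np.full(p, tau_beta), np.full(q, lam_u)])
        Q = lam_e * (W.T @ W) + np.diag(d)
        L = np.linalg.cholesky(Q)            # Q = L L', L lower triangular
        m = np.linalg.solve(Q, lam_e * (W.T @ y))
        z = rng.standard_normal(p + q)
        theta = m + np.linalg.solve(L.T, z)  # exact N(m, Q^{-1}) draw
        # Block 2: precisions | theta (conditionally conjugate gamma draws;
        # NumPy's gamma takes a scale, so rate b becomes scale 1/b).
        resid = y - W @ theta
        lam_e = rng.gamma(a_e + N / 2, 1.0 / (b_e + resid @ resid / 2))
        u = theta[p:]
        lam_u = rng.gamma(a_u + q / 2, 1.0 / (b_u + u @ u / 2))
        draws[t] = theta
    return draws
```

The joint draw of the whole \(\theta\)-block, rather than updating \(\beta\) and \(u\) one at a time, is what makes this a *block* Gibbs sampler; the geometric ergodicity analyzed in the paper concerns the chain produced by exactly this kind of two-block alternation.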