
Bayesian analysis of the logit model and comparison of two Metropolis-Hastings strategies. (English) Zbl 1132.62311

Summary: We examine Markov chain Monte Carlo (MCMC) methods for a generalized nonlinear regression model, the logit model. We first show that MCMC algorithms may be used, since the posterior is proper under the chosen non-informative priors. We then compare two non-standard MCMC methods: a Metropolis-Hastings algorithm whose bivariate normal proposal results from an approximation, and a Metropolis-Hastings algorithm with an adaptive proposal. The results are illustrated by simulations, which show the good behavior of both methods and the superior performance of the adaptive-proposal method in terms of convergence to the stationary distribution and exploration of the posterior surface.
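The first of the two strategies can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a flat (non-informative) prior on the two logit coefficients, simulated data, and a generic random-walk bivariate normal proposal in place of the authors' approximation-based proposal; all function names and tuning values here are hypothetical.

```python
import numpy as np

def log_posterior(beta, X, y):
    # Log posterior of the logit coefficients under a flat prior:
    # sum of Bernoulli log-likelihoods with p = sigmoid(X @ beta).
    eta = X @ beta
    # Numerically stable form: y*eta - log(1 + exp(eta)).
    return np.sum(y * eta - np.logaddexp(0.0, eta))

def metropolis_hastings(X, y, n_iter=5000, step=0.2, seed=0):
    # Metropolis-Hastings with a bivariate normal proposal centred at
    # the current state (a stand-in for the approximation-based proposal).
    rng = np.random.default_rng(seed)
    d = X.shape[1]
    beta = np.zeros(d)
    lp = log_posterior(beta, X, y)
    chain = np.empty((n_iter, d))
    accepted = 0
    for t in range(n_iter):
        prop = beta + step * rng.standard_normal(d)   # symmetric proposal
        lp_prop = log_posterior(prop, X, y)
        # Symmetric proposal => acceptance ratio is the posterior ratio.
        if np.log(rng.uniform()) < lp_prop - lp:
            beta, lp = prop, lp_prop
            accepted += 1
        chain[t] = beta
    return chain, accepted / n_iter

if __name__ == "__main__":
    # Simulate logit data with intercept -0.5 and slope 1.0.
    rng = np.random.default_rng(1)
    n = 200
    X = np.column_stack([np.ones(n), rng.standard_normal(n)])
    true_beta = np.array([-0.5, 1.0])
    y = (rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-X @ true_beta))).astype(float)
    chain, acc = metropolis_hastings(X, y)
    print("posterior mean:", chain[2500:].mean(axis=0), "acceptance rate:", acc)
```

The adaptive variant the paper compares against would replace the fixed `step * standard_normal` proposal with one re-estimated from the chain's past output; the accept/reject step is otherwise the same.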

MSC:

62F15 Bayesian inference
65C40 Numerical analysis or methods applied to Markov chains

Software:

Gibbsit; CODA

References:

[1] Best, N. G.; Cowles, M. K.; Vines, K., 1995. CODA: convergence diagnosis and output analysis software for Gibbs sampling output, Version 0.30. Technical Report, MRC Biostatistics Unit, University of Cambridge.
[2] Brooks, S. P.; Roberts, G.: Assessing convergence of Markov chain Monte Carlo algorithms. Statist. Comput. 8, 319-335 (1998)
[3] Chauveau, D.; Vandekerkhove, P.: Un algorithme de Hastings-Metropolis avec apprentissage séquentiel. C. R. Acad. Sci. Paris, Série I 329, 173-176 (1999) · Zbl 0946.60062
[4] Chauveau, D.; Vandekerkhove, P., 2001. Improving convergence of the Hastings-Metropolis algorithm with an adaptive proposal. Scand. J. Statist., to appear. · Zbl 1023.65003
[5] Cowles, M. K.; Carlin, B. P.: Markov chain Monte Carlo convergence diagnostics: a comparative study. J. Amer. Statist. Assoc. 91, 883-904 (1996) · Zbl 0869.62066
[6] Gelman, A.; Gilks, W. R.; Roberts, G. O.: Efficient Metropolis jumping rules. Bayesian Statistics, Vol. 5, 599-608 (1996)
[7] Geman, S.; Geman, D.: Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images. IEEE Trans. Pattern Anal. Mach. Intell. 6, 721-741 (1984) · Zbl 0573.62030
[8] Geweke, J.: Evaluating the accuracy of sampling-based approaches to the calculation of posterior moments (with discussion). Bayesian Statistics, Vol. 4, 169-193 (1992)
[9] Hastings, W. K.: Monte Carlo sampling methods using Markov chains and their application. Biometrika 57, 97-109 (1970) · Zbl 0219.65008
[10] Holden, L.: Geometric convergence of the Metropolis-Hastings simulation algorithm. Statist. Probab. Lett. 39, No. 4, 371-377 (1998) · Zbl 0914.60043
[11] Mengersen, K. L.; Robert, C. P.; Guihenneuc-Jouyaux, C.: MCMC convergence diagnostics: a "review". Bayesian Statistics, Vol. 6, 415-441 (1999) · Zbl 0957.62019
[12] Mengersen, K. L.; Tweedie, R. L.: Rates of convergence of the Hastings and Metropolis algorithms. Ann. Statist. 24, 101-121 (1996) · Zbl 0854.60065
[13] Raftery, A. E.; Lewis, S.: How many iterations in the Gibbs sampler? Bayesian Statistics, Vol. 4, 763-773 (1992)
[14] Raftery, A. E.; Lewis, S., 1992b. The number of iterations, convergence diagnostics and generic Metropolis algorithms. Technical Report, Department of Statistics, University of Washington, Seattle.
[15] Robert, C. P.: Méthodes de Monte Carlo par chaînes de Markov. (1996) · Zbl 0917.60007
[16] Robert, C. P.: Inference in mixture models. Markov Chain Monte Carlo in Practice, 441-464 (1996) · Zbl 0849.62013
[17] Tanner, M.; Wong, W.: The calculation of posterior distributions by data augmentation. J. Amer. Statist. Assoc. 82, 528-550 (1987) · Zbl 0619.62029
This reference list is based on information provided by the publisher or from digital mathematics libraries. Its items are heuristically matched to zbMATH identifiers and may contain data conversion errors. In some cases that data have been complemented/enhanced by data from zbMATH Open. This attempts to reflect the references listed in the original paper as accurately as possible without claiming completeness or a perfect matching.