Park, Trevor; Casella, George
The Bayesian Lasso. (English) Zbl 1330.62292
J. Am. Stat. Assoc. 103, No. 482, 681-686 (2008).

Summary: The Lasso estimate for linear regression parameters can be interpreted as a Bayesian posterior mode estimate when the regression parameters have independent Laplace (i.e., double-exponential) priors. Gibbs sampling from this posterior is possible using an expanded hierarchy with conjugate normal priors for the regression parameters and independent exponential priors on their variances. A connection with the inverse-Gaussian distribution provides tractable full conditional distributions. The Bayesian Lasso provides interval estimates (Bayesian credible intervals) that can guide variable selection. Moreover, the structure of the hierarchical model provides both Bayesian and likelihood methods for selecting the Lasso parameter. Slight modifications lead to Bayesian versions of other Lasso-related estimation methods, including bridge regression and a robust variant.

Cited in 9 Reviews; cited in 426 Documents.

MSC:
62J07 Ridge regression; shrinkage estimators (Lasso)
62F15 Bayesian inference
60J22 Computational methods in Markov chains
65C60 Computational problems in statistics (MSC2010)

Keywords: empirical Bayes; Gibbs sampler; hierarchical model; inverse Gaussian; linear regression; penalized regression; scale mixture of normals
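The hierarchy summarized above admits a simple Gibbs sampler: conditionally normal regression coefficients, an inverse-gamma conditional for the error variance, and inverse-Gaussian conditionals for the reciprocal latent variances. A minimal Python sketch follows, assuming the standard full conditionals of this scale-mixture representation; the function name is illustrative, and the Lasso parameter `lam` is held fixed here rather than selected by the empirical Bayes or hyperprior methods the paper discusses.

```python
import numpy as np

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=1000, seed=0):
    """Sketch of a Bayesian Lasso Gibbs sampler (fixed Lasso parameter lam).

    Hierarchy: beta_j | sigma2, tau_j^2 ~ N(0, sigma2 * tau_j^2),
               tau_j^2 ~ Exp(lam^2 / 2), improper prior on sigma2.
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    y = y - y.mean()                      # center the response
    XtX = X.T @ X
    Xty = X.T @ y
    beta = np.zeros(p)
    sigma2 = 1.0
    inv_tau2 = np.ones(p)                 # latent 1 / tau_j^2
    draws = np.empty((n_iter, p))
    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}),  A = X'X + D_tau^{-1}
        A = XtX + np.diag(inv_tau2)
        L = np.linalg.cholesky(A)         # A = L L'
        mean = np.linalg.solve(A, Xty)
        z = rng.standard_normal(p)
        beta = mean + np.sqrt(sigma2) * np.linalg.solve(L.T, z)
        # sigma2 | rest ~ InvGamma((n-1+p)/2, (||y - X beta||^2 + beta' D^{-1} beta)/2)
        resid = y - X @ beta
        rate = 0.5 * (resid @ resid + beta**2 @ inv_tau2)
        sigma2 = 1.0 / rng.gamma(0.5 * (n - 1 + p), 1.0 / rate)
        # 1/tau_j^2 | rest ~ InverseGaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
        mu = np.sqrt(lam**2 * sigma2 / np.maximum(beta**2, 1e-12))
        inv_tau2 = rng.wald(mu, lam**2)
        draws[it] = beta
    return draws
```

The inverse-Gaussian draws use NumPy's `Generator.wald`, which parameterizes the distribution by its mean and shape; the credible intervals mentioned in the summary would be read off as quantiles of the retained `draws` after burn-in.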