The Bayesian Lasso

Trevor Park, George Casella

Research output: Contribution to journal › Article

Abstract

The Lasso estimate for linear regression parameters can be interpreted as a Bayesian posterior mode estimate when the regression parameters have independent Laplace (i.e., double-exponential) priors. Gibbs sampling from this posterior is possible using an expanded hierarchy with conjugate normal priors for the regression parameters and independent exponential priors on their variances. A connection with the inverse-Gaussian distribution provides tractable full conditional distributions. The Bayesian Lasso provides interval estimates (Bayesian credible intervals) that can guide variable selection. Moreover, the structure of the hierarchical model provides both Bayesian and likelihood methods for selecting the Lasso parameter. Slight modifications lead to Bayesian versions of other Lasso-related estimation methods, including bridge regression and a robust variant.
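The three-step Gibbs sampler the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' code: it holds the Lasso parameter `lam` fixed (rather than selecting it by the Bayesian or likelihood methods the abstract mentions), uses the scale-invariant prior on the error variance, and draws the reciprocal latent variances via NumPy's inverse-Gaussian (`wald`) generator, exploiting the inverse-Gaussian connection noted above. The function name and all tuning defaults are hypothetical.

```python
import numpy as np

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=2000, burn=500, seed=0):
    """Gibbs sampler sketch for the Bayesian Lasso hierarchy:
    beta_j | sigma2, tau2_j ~ N(0, sigma2 * tau2_j), tau2_j ~ Exp(lam^2 / 2)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    beta, sigma2, tau2 = np.zeros(p), 1.0, np.ones(p)
    draws = []
    for it in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}),  A = X'X + diag(1/tau2)
        A_inv = np.linalg.inv(XtX + np.diag(1.0 / tau2))
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
        # sigma2 | rest ~ Inverse-Gamma((n-1)/2 + p/2, RSS/2 + beta'D^{-1}beta/2)
        resid = y - X @ beta
        shape = (n - 1) / 2 + p / 2
        scale = resid @ resid / 2 + beta @ (beta / tau2) / 2
        sigma2 = scale / rng.gamma(shape)  # inverse-gamma via reciprocal gamma
        # 1/tau2_j | rest ~ Inverse-Gaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2)
        mu = np.sqrt(lam**2 * sigma2 / np.maximum(beta**2, 1e-12))
        tau2 = 1.0 / rng.wald(mu, lam**2)
        if it >= burn:
            draws.append(beta.copy())
    return np.array(draws)  # posterior draws of beta after burn-in
```

Posterior medians of the returned draws play the role of the Lasso point estimate, and componentwise quantiles give the credible intervals that can guide variable selection.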

Original language: English (US)
Pages (from-to): 681-686
Number of pages: 6
Journal: Journal of the American Statistical Association
Volume: 103
Issue number: 482
DOIs
State: Published - Jun 2008
Externally published: Yes

Keywords

  • Empirical Bayes
  • Gibbs sampler
  • Hierarchical model
  • Inverse Gaussian
  • Linear regression
  • Penalized regression
  • Scale mixture of normals

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
