The Bayesian Lasso

Trevor H. Park, George Casella

Research output: Contribution to journal › Article

Abstract

The Lasso estimate for linear regression parameters can be interpreted as a Bayesian posterior mode estimate when the regression parameters have independent Laplace (i.e., double-exponential) priors. Gibbs sampling from this posterior is possible using an expanded hierarchy with conjugate normal priors for the regression parameters and independent exponential priors on their variances. A connection with the inverse-Gaussian distribution provides tractable full conditional distributions. The Bayesian Lasso provides interval estimates (Bayesian credible intervals) that can guide variable selection. Moreover, the structure of the hierarchical model provides both Bayesian and likelihood methods for selecting the Lasso parameter. Slight modifications lead to Bayesian versions of other Lasso-related estimation methods, including bridge regression and a robust variant.
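The expanded hierarchy described in the abstract leads to a three-block Gibbs sampler: draw the regression coefficients from a multivariate normal, the error variance from an inverse gamma, and the reciprocal latent variances from inverse-Gaussian full conditionals. A minimal NumPy sketch of that scheme follows; the function name is hypothetical, the Lasso parameter λ is held fixed for simplicity (the paper also treats it via empirical Bayes or a hyperprior), and `y` is assumed centered:

```python
import numpy as np

def bayesian_lasso_gibbs(X, y, lam=1.0, n_iter=2000, seed=0):
    """Gibbs sampler sketch for the Bayesian Lasso hierarchy:
    beta_j | sigma2, tau2_j ~ N(0, sigma2 * tau2_j),  tau2_j ~ Exp(lam^2 / 2)."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX, Xty = X.T @ X, X.T @ y
    beta, sigma2, tau2 = np.zeros(p), 1.0, np.ones(p)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        # beta | rest ~ N(A^{-1} X'y, sigma2 * A^{-1}),  A = X'X + D_tau^{-1}
        A_inv = np.linalg.inv(XtX + np.diag(1.0 / tau2))
        beta = rng.multivariate_normal(A_inv @ Xty, sigma2 * A_inv)
        # sigma2 | rest ~ inverse gamma (sampled as scale / Gamma(shape, 1))
        resid = y - X @ beta
        shape = (n - 1) / 2 + p / 2
        scale = resid @ resid / 2 + beta @ (beta / tau2) / 2
        sigma2 = scale / rng.gamma(shape)
        # 1/tau2_j | rest ~ inverse Gaussian(sqrt(lam^2 sigma2 / beta_j^2), lam^2);
        # NumPy's `wald` is the inverse-Gaussian sampler (mean, scale)
        mu = np.sqrt(lam**2 * sigma2 / np.maximum(beta**2, 1e-12))
        tau2 = 1.0 / rng.wald(mu, lam**2)
        draws[t] = beta
    return draws
```

Posterior medians of the `draws` columns give the point estimates, and empirical quantiles give the credible intervals mentioned above.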

Original language: English (US)
Pages (from-to): 681-686
Number of pages: 6
Journal: Journal of the American Statistical Association
Volume: 103
Issue number: 482
DOI: 10.1198/016214508000000337
State: Published - Jun 1 2008


Keywords

  • Empirical Bayes
  • Gibbs sampler
  • Hierarchical model
  • Inverse Gaussian
  • Linear regression
  • Penalized regression
  • Scale mixture of normals

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty

Cite this

Park, Trevor H.; Casella, George. The Bayesian Lasso. In: Journal of the American Statistical Association, Vol. 103, No. 482, 01.06.2008, pp. 681-686.
@article{9ed081d0a72841a999059e91695aea44,
  title     = "The Bayesian Lasso",
  author    = "Park, {Trevor H.} and George Casella",
  journal   = "Journal of the American Statistical Association",
  volume    = "103",
  number    = "482",
  pages     = "681--686",
  year      = "2008",
  month     = "6",
  day       = "1",
  doi       = "10.1198/016214508000000337",
  issn      = "0162-1459",
  publisher = "Taylor \& Francis",
  language  = "English (US)"
}
