A Bayesian Lasso via reversible-jump MCMC

Xiaohui Chen, Z. Jane Wang, Martin J. McKeown

Research output: Contribution to journal › Article

Abstract

Variable selection is a topic of great importance in high-dimensional statistical modeling and has a wide range of real-world applications. Many variable selection techniques have been proposed in the context of linear regression, and the Lasso model is probably one of the most popular penalized regression techniques. In this paper, we propose a new, fully hierarchical, Bayesian version of the Lasso model by employing flexible sparsity-promoting priors. To obtain the Bayesian Lasso estimate, a reversible-jump MCMC algorithm is developed for joint posterior inference over both discrete and continuous parameter spaces. Simulations demonstrate that the proposed RJ-MCMC-based Bayesian Lasso yields smaller estimation errors and more accurate sparsity pattern detection when compared with state-of-the-art optimization-based Lasso-type methods, a standard Gibbs sampler-based Bayesian Lasso, and the Binomial-Gaussian prior model. To demonstrate the applicability and estimation stability of the proposed Bayesian Lasso, we examine a benchmark diabetes data set and real functional magnetic resonance imaging (fMRI) data. As an extension of the proposed RJ-MCMC framework, we also develop an MCMC-based algorithm for the Binomial-Gaussian prior model and illustrate its improved performance over the non-Bayesian estimate via simulations.
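For readers unfamiliar with the optimization-based baseline the paper compares against, the sketch below shows the classical Lasso performing variable selection on simulated sparse data. This is a minimal illustration using scikit-learn, not the paper's method: the proposed approach instead samples from the joint posterior of a hierarchical Bayesian model via RJ-MCMC. The data dimensions, penalty weight, and noise level here are arbitrary choices for illustration.

```python
# Minimal sketch: sparse recovery with the classical (non-Bayesian) Lasso,
# i.e. minimize (1/2n)||y - Xb||^2 + alpha * ||b||_1. The L1 penalty shrinks
# many coefficients exactly to zero, performing variable selection.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 100, 20                             # arbitrary illustrative sizes
beta_true = np.zeros(p)
beta_true[:3] = [3.0, -2.0, 1.5]           # only 3 of 20 predictors are active
X = rng.standard_normal((n, p))
y = X @ beta_true + 0.5 * rng.standard_normal(n)

lasso = Lasso(alpha=0.1).fit(X, y)
support = np.flatnonzero(np.abs(lasso.coef_) > 1e-8)  # estimated sparsity pattern
```

A fully Bayesian treatment, as in the paper, replaces the single penalized point estimate above with posterior samples over both which variables are active (a discrete space) and their coefficient values (a continuous space), which is what motivates the reversible-jump sampler.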

Original language: English (US)
Pages (from-to): 1920-1932
Number of pages: 13
Journal: Signal Processing
Volume: 91
Issue number: 8
DOIs
State: Published - Aug 1 2011

Keywords

  • Dimensionality reduction
  • Fully Bayesian modeling
  • Lasso
  • Reversible-jump Markov chain Monte Carlo (RJ-MCMC)
  • Sparse signal recovery
  • Variable selection

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
