Abstract
Variable selection is a topic of great importance in high-dimensional statistical modeling and has a wide range of real-world applications. Many variable selection techniques have been proposed in the context of linear regression, and the Lasso model is probably one of the most popular penalized regression techniques. In this paper, we propose a new, fully hierarchical, Bayesian version of the Lasso model by employing flexible sparsity-promoting priors. To obtain the Bayesian Lasso estimate, a reversible-jump MCMC (RJ-MCMC) algorithm is developed for joint posterior inference over both discrete and continuous parameter spaces. Simulations demonstrate that the proposed RJ-MCMC-based Bayesian Lasso yields smaller estimation errors and more accurate sparsity pattern detection when compared with state-of-the-art optimization-based Lasso-type methods, a standard Gibbs-sampler-based Bayesian Lasso, and the Binomial-Gaussian prior model. To demonstrate the applicability and estimation stability of the proposed Bayesian Lasso, we examine a benchmark diabetes data set and real functional Magnetic Resonance Imaging (fMRI) data. As an extension of the proposed RJ-MCMC framework, we also develop an MCMC-based algorithm for the Binomial-Gaussian prior model and illustrate its improved performance over the non-Bayesian estimate via simulations.
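The abstract describes trans-dimensional MCMC over sparsity patterns in linear regression. The sketch below is not the authors' algorithm: it is a minimal, collapsed illustration of the idea, assuming a Binomial-Gaussian (spike-and-slab) prior with known noise variance `sigma2` and slab variance `tau2`, so that each model's coefficients can be integrated out in closed form and a simple birth/death (bit-flip) Metropolis move explores the discrete model space. The function names `log_marginal` and `rjmcmc` and all parameter values are illustrative choices, not from the paper.

```python
# Illustrative trans-dimensional sampler for variable selection (assumed
# spike-and-slab setup; NOT the paper's exact RJ-MCMC, which also samples
# the continuous coefficients jointly with the model indicator).
import numpy as np

def log_marginal(y, X, gamma, sigma2=1.0, tau2=10.0):
    """Log marginal likelihood of y under the submodel whose columns are
    flagged in the binary inclusion vector `gamma`, after integrating out
    the included coefficients under a N(0, tau2) slab prior."""
    n = len(y)
    Xg = X[:, gamma.astype(bool)]
    # Marginally, y ~ N(0, sigma2*I + tau2 * Xg Xg^T).
    C = sigma2 * np.eye(n) + tau2 * Xg @ Xg.T
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + y @ np.linalg.solve(C, y) + n * np.log(2 * np.pi))

def rjmcmc(y, X, n_iter=2000, pi=0.2, seed=0):
    """Birth/death Metropolis sampler over inclusion vectors gamma, with
    independent Bernoulli(pi) priors on the inclusion indicators.
    Returns Monte Carlo estimates of posterior inclusion probabilities."""
    rng = np.random.default_rng(seed)
    p = X.shape[1]
    gamma = np.zeros(p, dtype=int)
    ll = log_marginal(y, X, gamma)
    counts = np.zeros(p)
    for _ in range(n_iter):
        j = rng.integers(p)            # propose flipping one inclusion bit
        prop = gamma.copy()
        prop[j] = 1 - prop[j]
        ll_prop = log_marginal(y, X, prop)
        # Symmetric proposal, so only likelihood and prior ratios appear.
        log_prior_ratio = (np.log(pi) - np.log(1 - pi)) * (prop[j] - gamma[j])
        if np.log(rng.uniform()) < ll_prop - ll + log_prior_ratio:
            gamma, ll = prop, ll_prop
        counts += gamma
    return counts / n_iter
```

On simulated data with two truly active predictors, the estimated inclusion probabilities concentrate on the correct sparsity pattern, which is the behavior the abstract attributes to the Bayesian approach:

```python
rng = np.random.default_rng(1)
X = rng.standard_normal((60, 6))
y = X @ np.array([3.0, -2.0, 0, 0, 0, 0]) + 0.5 * rng.standard_normal(60)
pip = rjmcmc(y, X, n_iter=3000, seed=2)  # pip[0], pip[1] high; rest low
```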
| Original language | English (US) |
|---|---|
| Pages (from-to) | 1920-1932 |
| Number of pages | 13 |
| Journal | Signal Processing |
| Volume | 91 |
| Issue number | 8 |
| DOIs | |
| State | Published - Aug 2011 |
| Externally published | Yes |
Keywords
- Dimensionality reduction
- Fully Bayesian modeling
- Lasso
- Reversible-jump Markov chain Monte Carlo (RJ-MCMC)
- Sparse signal recovery
- Variable selection
ASJC Scopus subject areas
- Control and Systems Engineering
- Software
- Signal Processing
- Computer Vision and Pattern Recognition
- Electrical and Electronic Engineering