Abstract
We propose a new method under the Bayesian framework to perform valid inference for low dimensional parameters in high dimensional linear models under sparsity constraints. Our approach uses surrogate Bayesian posteriors based on partial regression models to remove the effect of high dimensional nuisance variables. We name the final distribution used to conduct inference the “conditional Bayesian posterior,” as it is a surrogate posterior constructed conditional on quasi-posterior distributions of the other parameters and does not admit a fully Bayesian interpretation. Unlike existing Bayesian regularization methods, our method can quantify the estimation uncertainty for arbitrarily small signals and therefore does not require variable selection consistency to guarantee its validity. Theoretically, we show that the resulting Bayesian credible intervals achieve the desired coverage probabilities in the frequentist sense. Methodologically, our proposed Bayesian framework can easily incorporate popular Bayesian regularization procedures, such as those based on spike and slab priors and horseshoe priors, to facilitate high accuracy estimation and inference. Numerically, our proposed method rectifies the uncertainty underestimation of Bayesian shrinkage approaches and achieves empirical performance comparable to state-of-the-art frequentist methods in extensive simulation studies and a real data analysis.
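The abstract describes the partial-regression construction only at a high level. As a rough illustration of the generic "partialling-out" idea it alludes to (not the authors' conditional Bayesian posterior), the sketch below substitutes cross-validated lasso fits for the quasi-posteriors of the nuisance parameters and forms a normal surrogate interval for a single target coefficient; the function name, tuning choices, and synthetic data are all hypothetical.

```python
# Illustrative sketch only: generic partialling-out inference for one
# coefficient beta_j in a sparse high dimensional linear model y = X beta + eps.
# This is NOT the paper's conditional Bayesian posterior; it replaces the
# quasi-posteriors for the nuisance parameters with point estimates from
# cross-validated lasso fits (an assumption made here for brevity).
import numpy as np
from scipy import stats
from sklearn.linear_model import LassoCV

def partialling_out_interval(X, y, j, alpha=0.05):
    """Approximate (1 - alpha) interval for the j-th coefficient."""
    n, p = X.shape
    X_j = X[:, j]
    X_rest = np.delete(X, j, axis=1)

    # Remove the effect of the high dimensional nuisance variables
    # from both the response y and the target column X_j.
    r_y = y - LassoCV(cv=5).fit(X_rest, y).predict(X_rest)
    r_x = X_j - LassoCV(cv=5).fit(X_rest, X_j).predict(X_rest)

    # Residual-on-residual least squares gives the target estimate.
    beta_hat = np.dot(r_x, r_y) / np.dot(r_x, r_x)
    resid = r_y - beta_hat * r_x
    se = np.sqrt(np.sum(resid ** 2) / (n - 1)) / np.sqrt(np.dot(r_x, r_x))

    z = stats.norm.ppf(1 - alpha / 2)
    return beta_hat, (beta_hat - z * se, beta_hat + z * se)

# Example on synthetic sparse data (n = 100 observations, p = 200 features).
rng = np.random.default_rng(1)
n, p = 100, 200
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[0], beta[1] = 1.0, 0.5
y = X @ beta + rng.standard_normal(n)
print(partialling_out_interval(X, y, j=0))
```

The paper's contribution is to carry out this kind of nuisance removal within a Bayesian framework (e.g., with spike and slab or horseshoe priors) so that credible intervals, rather than the plug-in normal interval above, attain frequentist coverage.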
Original language | English (US) |
---|---|
Pages (from-to) | 769-797 |
Number of pages | 29 |
Journal | Electronic Journal of Statistics |
Volume | 17 |
Issue number | 1 |
DOIs | |
State | Published - 2023 |
Keywords
- Bayesian inference
- Bayesian regularization
- high dimensional linear model
- sparsity
- uncertainty quantification
ASJC Scopus subject areas
- Statistics and Probability
- Statistics, Probability and Uncertainty