Abstract
The celebrated Bernstein–von Mises theorem ensures that credible regions from a Bayesian posterior are well-calibrated when the model is correctly specified, in the frequentist sense that their coverage probabilities tend to the nominal values as data accrue. However, this conventional Bayesian framework is known to lack robustness when the model is misspecified or only partly specified, for example, in quantile regression, in risk-minimization-based supervised/unsupervised learning, and in robust estimation. To alleviate this limitation, we propose a new Bayesian inferential approach that substitutes the (misspecified or partly specified) likelihoods with proper exponentially tilted empirical likelihoods plus a regularization term. Our surrogate empirical likelihood is carefully constructed by using the first-order optimality condition of empirical risk minimization as the moment condition. We show that the Bayesian posterior obtained by combining this surrogate empirical likelihood and a prior is asymptotically close to a normal distribution centered at the empirical risk minimizer with an appropriate sandwich-form covariance matrix. Consequently, the resulting Bayesian credible regions are automatically calibrated to deliver valid uncertainty quantification. Computationally, the proposed method can be easily implemented by Markov chain Monte Carlo sampling algorithms. Our numerical results show that the proposed method tends to be more accurate than existing state-of-the-art competitors.
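For readers who want the construction spelled out, the following is a minimal sketch in our own notation (the loss $\ell$, moment function $g$, tilting parameter $\lambda$, and matrices $H$, $V$ are our choices, not necessarily the paper's, and the regularization term mentioned above is omitted):

```latex
% Loss \ell(\theta; x), i.i.d. data x_1, \dots, x_n, empirical risk minimizer
%   \hat\theta_n = \arg\min_\theta \tfrac{1}{n} \sum_{i=1}^n \ell(\theta; x_i).
% The first-order optimality condition supplies the moment restriction:
\[
  \mathbb{E}\bigl[g(\theta; X)\bigr] = 0,
  \qquad g(\theta; x) = \nabla_\theta \ell(\theta; x).
\]
% Exponentially tilted empirical likelihood: the weights are the exponential
% tilting of the empirical distribution that re-centers the moments at zero,
\[
  \hat{w}_i(\theta) = \frac{\exp\{\lambda(\theta)^\top g(\theta; x_i)\}}
                           {\sum_{j=1}^n \exp\{\lambda(\theta)^\top g(\theta; x_j)\}},
  \qquad \sum_{i=1}^n \hat{w}_i(\theta)\, g(\theta; x_i) = 0,
\]
% yielding the surrogate likelihood and the posterior
\[
  \pi_n(\theta) \;\propto\; \pi(\theta) \prod_{i=1}^n \hat{w}_i(\theta).
\]
% The Bernstein--von Mises-type result in the abstract: \pi_n is asymptotically
% close to a normal with the familiar sandwich covariance,
\[
  N\!\Bigl(\hat\theta_n,\; n^{-1} H^{-1} V H^{-1}\Bigr),
  \quad H = \mathbb{E}\bigl[\nabla_\theta\, g(\theta^*; X)\bigr],
  \quad V = \operatorname{Var}\bigl[g(\theta^*; X)\bigr],
\]
% which is why the resulting credible regions attain frequentist coverage.
```

As a loose illustration of the claim that the method is implementable by standard MCMC, here is a hedged Python sketch for a Huber-loss location model: the data-generating process, prior, tuning constants, and the Huber example itself are all our assumptions, not the authors' reference code.

```python
# Hypothetical sketch: random-walk Metropolis on an ETEL posterior for a
# Huber-loss location parameter. Illustrative only; omits the paper's
# regularization term and any careful MCMC tuning.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)
x = rng.standard_t(df=3, size=200) + 1.0   # heavy-tailed data, true location 1.0

def psi(u, c=1.345):
    # Huber score function, i.e. the derivative of the Huber loss.
    return np.clip(u, -c, c)

def log_etel(theta):
    # Moment function g_i = psi(x_i - theta): the first-order condition of the
    # empirical Huber-risk minimizer. The tilting parameter lambda solves the
    # convex dual problem  min_lambda  mean(exp(lambda * g_i)).
    g = psi(x - theta)
    res = minimize_scalar(lambda lam: np.mean(np.exp(lam * g)),
                          bounds=(-10.0, 10.0), method="bounded")
    w = np.exp(res.x * g)
    w /= w.sum()                           # tilted weights summing to one
    return np.sum(np.log(w))               # log surrogate likelihood

def log_post(theta):
    # Vague N(0, 10^2) prior on the location parameter.
    return log_etel(theta) - 0.5 * theta**2 / 100.0

# Random-walk Metropolis targeting the ETEL posterior.
theta, lp = 0.0, log_post(0.0)
draws = []
for _ in range(5000):
    prop = theta + 0.2 * rng.standard_normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    draws.append(theta)

draws = np.asarray(draws[1000:])           # discard burn-in
print("posterior mean:", draws.mean())
print("95% credible interval:", np.percentile(draws, [2.5, 97.5]))
```

Per the paper's result, the 95% credible interval from such a sampler should approximately match the sandwich-based frequentist confidence interval for the Huber location estimator as the sample size grows.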
| Original language | English (US) |
| --- | --- |
| Pages (from-to) | 1257-1286 |
| Number of pages | 30 |
| Journal | Journal of the Royal Statistical Society. Series B: Statistical Methodology |
| Volume | 84 |
| Issue number | 4 |
| DOIs | |
| State | Published - Sep 2022 |
Keywords
- Bayesian inference
- Gibbs posterior
- exponentially tilted empirical likelihood
- misspecified model
- risk minimization
- robust estimation
ASJC Scopus subject areas
- Statistics and Probability
- Statistics, Probability and Uncertainty