α-variational inference with statistical guarantees

Yun Yang, Debdeep Pati, Anirban Bhattacharya

Research output: Contribution to journal › Article › peer-review

Abstract

We provide statistical guarantees for a family of variational approximations to Bayesian posterior distributions, called α-VB, which has close connections with variational approximations of tempered posteriors in the literature. The standard variational approximation is a special case of α-VB with α = 1. For α ∈ (0, 1], we develop a novel class of variational inequalities linking the Bayes risk under the variational approximation to the objective function of the variational optimization problem, implying that maximizing the evidence lower bound in variational inference has the effect of minimizing the Bayes risk within the variational density family. Operating in a frequentist setup, the variational inequalities imply that point estimates constructed from the α-VB procedure converge at an optimal rate to the true parameter in a wide range of problems. We illustrate our general theory with a number of examples, including the mean-field variational approximation to low- and high-dimensional Bayesian linear regression with spike and slab priors, Gaussian mixture models and latent Dirichlet allocation.
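
For readers unfamiliar with the tempered-posterior formulation, the following LaTeX sketch spells out the α-VB objective referenced in the abstract. The notation is illustrative rather than quoted from the paper: ℓ_n denotes the log-likelihood of n observations, π the prior, and Γ the variational family.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Sketch of the alpha-VB objective under the tempered-posterior setup
% described in the abstract. Notation is ours: \ell_n is the log-likelihood
% of n observations, \pi the prior, \Gamma the variational family.
\[
  \widehat{q}_{\alpha} \in
  \operatorname*{arg\,min}_{q \in \Gamma}
  \Bigl\{ -\mathbb{E}_{q}\bigl[\ell_n(\theta)\bigr]
        + \tfrac{1}{\alpha}\, D_{\mathrm{KL}}(q \,\|\, \pi) \Bigr\}.
\]
% Up to an additive constant and the positive factor \alpha, this objective
% equals the KL divergence from q to the tempered posterior
\[
  \pi_{n,\alpha}(\theta) \propto e^{\alpha\, \ell_n(\theta)}\, \pi(\theta),
\]
% so alpha = 1 recovers the standard variational approximation: minimizing
% the KL divergence to the usual posterior, i.e., maximizing the evidence
% lower bound over \Gamma.
\end{document}

Written in this form, taking α < 1 simply inflates the KL regularization toward the prior relative to the expected log-likelihood, which is the regime in which the variational inequalities of the abstract apply.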

Original language: English (US)
Pages (from-to): 886-905
Number of pages: 20
Journal: Annals of Statistics
Volume: 48
Issue number: 2
DOIs
State: Published - 2020

Keywords

  • Bayes risk
  • Evidence lower bound
  • Latent variable models
  • Rényi divergence
  • Variational inference

ASJC Scopus subject areas

  • Statistics and Probability
  • Statistics, Probability and Uncertainty
