Structured Variational Inference in Bayesian State-Space Models

Honggang Wang, Yun Yang, Debdeep Pati, Anirban Bhattacharya

Research output: Contribution to journal › Conference article › peer-review


Variational inference is routinely deployed in Bayesian state-space models as an efficient computational technique. Motivated by the inconsistency issue observed by Wang and Titterington (2004) for the mean-field approximation in linear state-space models, we consider a more expressive variational family for approximating the joint posterior of the latent variables that retains their dependence, while maintaining the mean-field (i.e., independence) structure between latent variables and parameters. In state-space models, such a latent-structure-adapted mean-field approximation can be efficiently computed using the belief propagation algorithm. Theoretically, we show that this adapted mean-field approximation leads to consistent variational estimates. Furthermore, we derive a non-asymptotic risk bound for an averaged α-divergence from the true data-generating model, suggesting that the posterior mean of the best variational approximation for the static parameters exhibits optimal concentration. From a broader perspective, we add to the growing literature on the statistical accuracy of variational approximations by allowing dependence between the latent variables, and the techniques developed here should be useful in related contexts.
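To make the latent-variable update concrete: in a linear-Gaussian state-space model, the belief propagation step the abstract refers to reduces to Kalman filtering followed by Rauch-Tung-Striebel smoothing, which yields the optimal dependence-preserving q(x_{1:T}) marginals when the static parameters are held fixed. The sketch below is a hypothetical minimal illustration of that single step for a 1-D model (it is not the paper's algorithm; the full structured scheme would alternate this update with a parameter update):

```python
import numpy as np

def kalman_smoother(y, a, q, r, m0=0.0, p0=1.0):
    """Forward filter + RTS backward smoother for the 1-D model
    x_t = a*x_{t-1} + N(0, q),  y_t = x_t + N(0, r).
    Returns smoothed means/variances of the latent states given fixed
    parameters -- the belief-propagation step of a structured
    mean-field scheme that keeps dependence among the latents."""
    T = len(y)
    fm = np.zeros(T)                  # filtered means
    fv = np.zeros(T)                  # filtered variances
    m, p = m0, p0
    for t in range(T):
        # predict one step ahead through the latent dynamics
        m, p = a * m, a * a * p + q
        # update with observation y_t (Kalman gain k)
        k = p / (p + r)
        m = m + k * (y[t] - m)
        p = (1.0 - k) * p
        fm[t], fv[t] = m, p
    sm, sv = fm.copy(), fv.copy()     # smoothed means/variances
    for t in range(T - 2, -1, -1):
        pp = a * a * fv[t] + q        # predicted variance at t+1
        g = a * fv[t] / pp            # RTS smoother gain
        sm[t] = fm[t] + g * (sm[t + 1] - a * fm[t])
        sv[t] = fv[t] + g * g * (sv[t + 1] - pp)
    return sm, sv

# Simulate from the model, then recover the latent path.
rng = np.random.default_rng(0)
a_true, q_true, r_true, T = 0.9, 0.1, 0.5, 200
x = np.zeros(T)
x[0] = rng.normal(0.0, 1.0)
for t in range(1, T):
    x[t] = a_true * x[t - 1] + rng.normal(0.0, np.sqrt(q_true))
y = x + rng.normal(0.0, np.sqrt(r_true), size=T)

sm, sv = kalman_smoother(y, a_true, q_true, r_true)
```

Because the smoother propagates information along the whole chain, the resulting q(x_{1:T}) captures temporal dependence that a fully factorized mean-field family would discard, which is the source of the inconsistency the abstract cites.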

Original language: English (US)
Pages (from-to): 8884-8905
Number of pages: 22
Journal: Proceedings of Machine Learning Research
State: Published - 2022
Event: 25th International Conference on Artificial Intelligence and Statistics, AISTATS 2022 - Virtual, Online, Spain
Duration: Mar 28 2022 - Mar 30 2022

ASJC Scopus subject areas

  • Artificial Intelligence
  • Software
  • Control and Systems Engineering
  • Statistics and Probability

