Coupled variational Bayes via optimization embedding

Bo Dai, Hanjun Dai, Niao He, Weiyang Liu, Zhen Liu, Jianshu Chen, Lin Xiao, Le Song

Research output: Contribution to journal › Conference article

Abstract

Variational inference plays a vital role in learning graphical models, especially on large-scale datasets. Much of its success depends on a proper choice of auxiliary distribution class for posterior approximation. However, how to pursue an auxiliary distribution class that achieves both good approximation ability and computational efficiency remains a core challenge. In this paper, we propose coupled variational Bayes, which exploits the primal-dual view of the ELBO with a variational distribution class generated by an optimization procedure, termed optimization embedding. This flexible function class couples the variational distribution with the original parameters in the graphical model, allowing end-to-end learning of the graphical model by back-propagation through the variational distribution. Theoretically, we establish an interesting connection to gradient flow and demonstrate the extreme flexibility of this implicit distribution family in the limiting sense. Empirically, we demonstrate the effectiveness of the proposed method on multiple graphical models with either continuous or discrete latent variables, compared to state-of-the-art methods.
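The abstract compresses several ideas; the following informal sketches, which are not part of the paper or this record, may help unpack them. The primal-dual view mentioned above starts from the standard evidence lower bound (ELBO): maximizing it over an unrestricted variational distribution q recovers the exact posterior, so learning can be read as a nested optimization over the model parameters theta and q.

```latex
% ELBO and the nested learning problem it induces (standard material,
% not copied from the paper). Equality holds iff q(z) = p(z | x; \theta).
\log p(x;\theta) \;\ge\; \mathcal{L}(q,\theta)
  \;=\; \mathbb{E}_{q(z)}\big[\log p(x,z;\theta)\big] + H(q),
\qquad
\max_{\theta}\,\max_{q}\ \mathcal{L}(q,\theta).
```

Optimization embedding, as the abstract describes it, replaces the inner maximization with a finite optimization procedure whose output defines the variational sample, so that q becomes a differentiable function of theta. Below is a minimal, hypothetical JAX sketch of that unrolling idea; the toy Gaussian model, step size, and step count are illustrative assumptions, and the entropy term (which the paper treats through its primal-dual formulation) is deliberately omitted.

```python
# A minimal sketch of "optimization embedding" (not the authors' code):
# the variational sample z_T is produced by unrolling T gradient-ascent
# steps on the log-joint, starting from a base sample z_0, and the whole
# procedure stays differentiable in theta for end-to-end learning.
import jax
import jax.numpy as jnp

def log_joint(theta, z, x):
    # Toy model (an illustrative assumption): z ~ N(0, 1), x | z ~ N(theta * z, 1).
    return -0.5 * z ** 2 - 0.5 * (x - theta * z) ** 2

def embed(theta, z0, x, n_steps=10, eta=0.1):
    # The "embedded" optimizer: T differentiable gradient steps on the
    # log-joint turn z0 into a variational sample that depends on theta.
    grad_z = jax.grad(log_joint, argnums=1)
    z = z0
    for _ in range(n_steps):
        z = z + eta * grad_z(theta, z, x)
    return z

def objective(theta, z0_batch, x):
    # Monte Carlo surrogate of the ELBO's log-joint term over transported
    # particles; the entropy/dual part of the paper's objective is omitted.
    z_batch = jax.vmap(lambda z0: embed(theta, z0, x))(z0_batch)
    return jnp.mean(jax.vmap(lambda z: log_joint(theta, z, x))(z_batch))

key = jax.random.PRNGKey(0)
z0_batch = jax.random.normal(key, (64,))  # samples from the base q_0
x, theta = 1.5, 0.0
grad_theta = jax.grad(objective)          # gradients flow through `embed`
for _ in range(100):
    theta = theta + 0.05 * grad_theta(theta, z0_batch, x)
```

Back-propagating through `embed` is what couples the variational distribution to theta; as the number of steps grows, the unrolled updates can be read as a discretized gradient flow, which is the connection the abstract alludes to.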

Original language: English (US)
Pages (from-to): 9690-9700
Number of pages: 11
Journal: Advances in Neural Information Processing Systems
Volume: 2018-December
State: Published - Jan 1 2018
Event: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada
Duration: Dec 2 2018 - Dec 8 2018

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

Cite this

Dai, B., Dai, H., He, N., Liu, W., Liu, Z., Chen, J., ... Song, L. (2018). Coupled variational Bayes via optimization embedding. Advances in Neural Information Processing Systems, 2018-December, 9690-9700.

Coupled variational Bayes via optimization embedding. / Dai, Bo; Dai, Hanjun; He, Niao; Liu, Weiyang; Liu, Zhen; Chen, Jianshu; Xiao, Lin; Song, Le.

In: Advances in Neural Information Processing Systems, Vol. 2018-December, 01.01.2018, p. 9690-9700.

Research output: Contribution to journal › Conference article

Dai, B, Dai, H, He, N, Liu, W, Liu, Z, Chen, J, Xiao, L & Song, L 2018, 'Coupled variational Bayes via optimization embedding', Advances in Neural Information Processing Systems, vol. 2018-December, pp. 9690-9700.
Dai B, Dai H, He N, Liu W, Liu Z, Chen J et al. Coupled variational Bayes via optimization embedding. Advances in Neural Information Processing Systems. 2018 Jan 1;2018-December:9690-9700.
Dai, Bo ; Dai, Hanjun ; He, Niao ; Liu, Weiyang ; Liu, Zhen ; Chen, Jianshu ; Xiao, Lin ; Song, Le. / Coupled variational Bayes via optimization embedding. In: Advances in Neural Information Processing Systems. 2018 ; Vol. 2018-December. pp. 9690-9700.
@article{640abad1cd394f5fbc58c5da69f1286a,
title = "Coupled variational bayes via optimization embedding",
abstract = "Variational inference plays a vital role in learning graphical models, especially on large-scale datasets. Much of its success depends on a proper choice of auxiliary distribution class for posterior approximation. However, how to pursue an auxiliary distribution class that achieves both good approximation ability and computation efficiency remains a core challenge. In this paper, we proposed coupled variational Bayes which exploits the primal-dual view of the ELBO with the variational distribution class generated by an optimization procedure, which is termed optimization embedding. This flexible function class couples the variational distribution with the original parameters in the graphical models, allowing end-to-end learning of the graphical models by back-propagation through the variational distribution. Theoretically, we establish an interesting connection to gradient flow and demonstrate the extreme flexibility of this implicit distribution family in the limit sense. Empirically, we demonstrate the effectiveness of the proposed method on multiple graphical models with either continuous or discrete latent variables comparing to state-of-the-art methods.",
author = "Bo Dai and Hanjun Dai and Niao He and Weiyang Liu and Zhen Liu and Jianshu Chen and Lin Xiao and Le Song",
year = "2018",
month = "1",
day = "1",
language = "English (US)",
volume = "2018-December",
pages = "9690--9700",
journal = "Advances in Neural Information Processing Systems",
issn = "1049-5258",

}

TY - JOUR

T1 - Coupled variational Bayes via optimization embedding

AU - Dai, Bo

AU - Dai, Hanjun

AU - He, Niao

AU - Liu, Weiyang

AU - Liu, Zhen

AU - Chen, Jianshu

AU - Xiao, Lin

AU - Song, Le

PY - 2018/1/1

Y1 - 2018/1/1

N2 - Variational inference plays a vital role in learning graphical models, especially on large-scale datasets. Much of its success depends on a proper choice of auxiliary distribution class for posterior approximation. However, how to pursue an auxiliary distribution class that achieves both good approximation ability and computational efficiency remains a core challenge. In this paper, we propose coupled variational Bayes, which exploits the primal-dual view of the ELBO with a variational distribution class generated by an optimization procedure, termed optimization embedding. This flexible function class couples the variational distribution with the original parameters in the graphical model, allowing end-to-end learning of the graphical model by back-propagation through the variational distribution. Theoretically, we establish an interesting connection to gradient flow and demonstrate the extreme flexibility of this implicit distribution family in the limiting sense. Empirically, we demonstrate the effectiveness of the proposed method on multiple graphical models with either continuous or discrete latent variables, compared to state-of-the-art methods.

AB - Variational inference plays a vital role in learning graphical models, especially on large-scale datasets. Much of its success depends on a proper choice of auxiliary distribution class for posterior approximation. However, how to pursue an auxiliary distribution class that achieves both good approximation ability and computational efficiency remains a core challenge. In this paper, we propose coupled variational Bayes, which exploits the primal-dual view of the ELBO with a variational distribution class generated by an optimization procedure, termed optimization embedding. This flexible function class couples the variational distribution with the original parameters in the graphical model, allowing end-to-end learning of the graphical model by back-propagation through the variational distribution. Theoretically, we establish an interesting connection to gradient flow and demonstrate the extreme flexibility of this implicit distribution family in the limiting sense. Empirically, we demonstrate the effectiveness of the proposed method on multiple graphical models with either continuous or discrete latent variables, compared to state-of-the-art methods.

UR - http://www.scopus.com/inward/record.url?scp=85064826080&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85064826080&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:85064826080

VL - 2018-December

SP - 9690

EP - 9700

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

ER -