Provable Bayesian inference via particle mirror descent

Bo Dai, Niao He, Hanjun Dai, Le Song

Research output: Contribution to conference › Paper

Abstract

Bayesian methods are appealing for their flexibility in modeling complex data and their ability to capture uncertainty in parameters. However, when Bayes’ rule does not yield a tractable closed form, most approximate inference algorithms lack either scalability or rigorous guarantees. To tackle this challenge, we propose a simple yet provable algorithm, Particle Mirror Descent (PMD), to iteratively approximate the posterior density. PMD is inspired by stochastic functional mirror descent, where one descends in the density space using a small batch of data points at each iteration, and by particle filtering, where one uses samples to approximate a function. We prove a result of the first kind: with m particles, PMD provides a posterior density estimator that converges to the true posterior in KL divergence at a rate of O(1/√m). We demonstrate the competitive empirical performance of PMD compared to several approximate inference algorithms on mixture models, logistic regression, sparse Gaussian processes, and latent Dirichlet allocation over large-scale datasets.
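To give a concrete sense of the density-space update described in the abstract, below is a minimal, illustrative Python sketch of the entropic (multiplicative) mirror descent step behind PMD on a toy conjugate model. The toy model (a 1-D Gaussian mean with known variance), the N(0, 3²) particle proposal, the step-size schedule, and all variable names are assumptions made for illustration only; the actual PMD algorithm maintains and resamples particles via kernel density estimation and carries the O(1/√m) guarantee stated above.

import numpy as np

rng = np.random.default_rng(0)

# Toy conjugate model (an assumption for illustration):
#   prior      theta ~ N(0, 1)
#   likelihood x_i | theta ~ N(theta, 1),  i = 1..N
N = 1000
data = rng.normal(2.0, 1.0, size=N)           # observations with true mean 2

def log_prior(theta):
    return -0.5 * theta ** 2                  # log N(0, 1) density, constants dropped

def log_lik(theta, x):
    # Sum of Gaussian log-likelihoods of x for each particle in the vector theta.
    return -0.5 * np.sum((x[None, :] - theta[:, None]) ** 2, axis=1)

# Fixed particle locations drawn from a broad proposal q0 = N(0, 3^2);
# log_q tracks the (unnormalised) log of the evolving density at those locations.
m = 2000
particles = rng.normal(0.0, 3.0, size=m)

def log_q0(theta):
    return -0.5 * (theta / 3.0) ** 2

log_q = log_q0(particles)

batch_size = 100
for t in range(1, 501):
    gamma = 1.0 / np.sqrt(t)                  # decaying step size (an assumption)
    batch = rng.choice(data, size=batch_size, replace=False)
    # Unbiased minibatch estimate of the unnormalised log-posterior at each particle.
    log_target = log_prior(particles) + (N / batch_size) * log_lik(particles, batch)
    # Entropic mirror descent step in density space:
    #   q_{t+1}(theta)  propto  q_t(theta)^(1 - gamma) * exp(gamma * log_target(theta))
    log_q = (1.0 - gamma) * log_q + gamma * log_target

# Self-normalised importance weights of the final density against the proposal q0.
log_w = log_q - log_q0(particles)
log_w -= log_w.max()                          # numerical stabilisation
w = np.exp(log_w)
w /= w.sum()

particle_mean = np.sum(w * particles)
exact_mean = data.sum() / (N + 1)             # closed-form posterior mean for this model
print(f"weighted-particle estimate: {particle_mean:.3f}  exact posterior mean: {exact_mean:.3f}")

In this sketch, increasing the number of particles m tightens the weighted estimate around the exact posterior mean, in line with the O(1/√m) dependence on the number of particles highlighted in the abstract.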

Original language: English (US)
Pages: 985-994
Number of pages: 10
State: Published - Jan 1 2016
Event: 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016 - Cadiz, Spain
Duration: May 9 2016 - May 11 2016

Conference

Conference: 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016
Country: Spain
City: Cadiz
Period: 5/9/16 - 5/11/16

Fingerprint

Bayesian inference
Descent
Mirror
Mirrors
Bayes Rule
Particle Filtering
Density Estimator
Bayesian Methods
Logistic Regression
Mixture Model
Gaussian Process
Dirichlet
Batch
Logistics
Scalability
Divergence
Closed-form
Flexibility
Converge
Iteration

ASJC Scopus subject areas

  • Artificial Intelligence
  • Statistics and Probability

Cite this

Dai, B., He, N., Dai, H., & Song, L. (2016). Provable Bayesian inference via particle mirror descent. 985-994. Paper presented at 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016, Cadiz, Spain.

Provable Bayesian inference via particle mirror descent. / Dai, Bo; He, Niao; Dai, Hanjun; Song, Le.

2016. 985-994. Paper presented at 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016, Cadiz, Spain.

Research output: Contribution to conference › Paper

Dai, B, He, N, Dai, H & Song, L 2016, 'Provable Bayesian inference via particle mirror descent', Paper presented at 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016, Cadiz, Spain, 5/9/16 - 5/11/16, pp. 985-994.
Dai B, He N, Dai H, Song L. Provable Bayesian inference via particle mirror descent. 2016. Paper presented at 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016, Cadiz, Spain.
Dai, Bo ; He, Niao ; Dai, Hanjun ; Song, Le. / Provable Bayesian inference via particle mirror descent. Paper presented at 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016, Cadiz, Spain. 10 p.
@conference{51dcdbf3ffd74eb899e000709121f4ec,
title = "Provable Bayesian inference via particle mirror descent",
abstract = "Bayesian methods are appealing for their flexibility in modeling complex data and their ability to capture uncertainty in parameters. However, when Bayes’ rule does not yield a tractable closed form, most approximate inference algorithms lack either scalability or rigorous guarantees. To tackle this challenge, we propose a simple yet provable algorithm, Particle Mirror Descent (PMD), to iteratively approximate the posterior density. PMD is inspired by stochastic functional mirror descent, where one descends in the density space using a small batch of data points at each iteration, and by particle filtering, where one uses samples to approximate a function. We prove a result of the first kind: with m particles, PMD provides a posterior density estimator that converges to the true posterior in KL divergence at a rate of O(1/√m). We demonstrate the competitive empirical performance of PMD compared to several approximate inference algorithms on mixture models, logistic regression, sparse Gaussian processes, and latent Dirichlet allocation over large-scale datasets.",
author = "Bo Dai and Niao He and Hanjun Dai and Le Song",
year = "2016",
month = "1",
day = "1",
language = "English (US)",
pages = "985--994",
note = "19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016 ; Conference date: 09-05-2016 Through 11-05-2016",

}

TY - CONF

T1 - Provable Bayesian inference via particle mirror descent

AU - Dai, Bo

AU - He, Niao

AU - Dai, Hanjun

AU - Song, Le

PY - 2016/1/1

Y1 - 2016/1/1

N2 - Bayesian methods are appealing for their flexibility in modeling complex data and their ability to capture uncertainty in parameters. However, when Bayes’ rule does not yield a tractable closed form, most approximate inference algorithms lack either scalability or rigorous guarantees. To tackle this challenge, we propose a simple yet provable algorithm, Particle Mirror Descent (PMD), to iteratively approximate the posterior density. PMD is inspired by stochastic functional mirror descent, where one descends in the density space using a small batch of data points at each iteration, and by particle filtering, where one uses samples to approximate a function. We prove a result of the first kind: with m particles, PMD provides a posterior density estimator that converges to the true posterior in KL divergence at a rate of O(1/√m). We demonstrate the competitive empirical performance of PMD compared to several approximate inference algorithms on mixture models, logistic regression, sparse Gaussian processes, and latent Dirichlet allocation over large-scale datasets.

AB - Bayesian methods are appealing for their flexibility in modeling complex data and their ability to capture uncertainty in parameters. However, when Bayes’ rule does not yield a tractable closed form, most approximate inference algorithms lack either scalability or rigorous guarantees. To tackle this challenge, we propose a simple yet provable algorithm, Particle Mirror Descent (PMD), to iteratively approximate the posterior density. PMD is inspired by stochastic functional mirror descent, where one descends in the density space using a small batch of data points at each iteration, and by particle filtering, where one uses samples to approximate a function. We prove a result of the first kind: with m particles, PMD provides a posterior density estimator that converges to the true posterior in KL divergence at a rate of O(1/√m). We demonstrate the competitive empirical performance of PMD compared to several approximate inference algorithms on mixture models, logistic regression, sparse Gaussian processes, and latent Dirichlet allocation over large-scale datasets.

UR - http://www.scopus.com/inward/record.url?scp=85047007442&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85047007442&partnerID=8YFLogxK

M3 - Paper

AN - SCOPUS:85047007442

SP - 985

EP - 994

ER -