### Abstract

Bayesian methods are appealing for their flexibility in modeling complex data and their ability to capture parameter uncertainty. However, when Bayes’ rule does not yield a tractable closed form, most approximate inference algorithms lack either scalability or rigorous guarantees. To tackle this challenge, we propose a simple yet provable algorithm, Particle Mirror Descent (PMD), to iteratively approximate the posterior density. PMD is inspired by stochastic functional mirror descent, where one descends in the density space using a small batch of data points at each iteration, and by particle filtering, where one uses samples to approximate a function. We prove a result of the first kind: with m particles, PMD provides a posterior density estimator that converges to the true posterior in KL-divergence at rate O(1/√m). We demonstrate the competitive empirical performance of PMD against several approximate inference algorithms on mixture models, logistic regression, sparse Gaussian processes, and latent Dirichlet allocation on large-scale datasets.
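The two ingredients the abstract names — a stochastic mirror descent step in density space and a particle approximation of the density — can be illustrated on a toy conjugate-Gaussian problem. Everything below (the model, the constants, and the particular multiplicative weight update) is an illustrative sketch of the general idea, not the paper's algorithm or its experiments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem (illustrative assumption): infer the mean theta of a
# Gaussian with known unit variance, so the exact posterior is available.
N = 500
true_theta = 1.5
data = rng.normal(true_theta, 1.0, size=N)

# Prior: theta ~ N(0, prior_var). Conjugacy gives the reference posterior.
prior_var = 4.0
post_var = 1.0 / (1.0 / prior_var + N)
post_mean = post_var * data.sum()

def log_prior(theta):
    return -0.5 * theta**2 / prior_var

def avg_log_lik(theta, batch):
    # average log-likelihood over a minibatch, per data point (up to a constant)
    return -0.5 * np.mean((batch[None, :] - theta[:, None]) ** 2, axis=1)

# Particle approximation: m particles drawn once from the prior, with
# log-weights updated by a mirror-descent-style multiplicative rule
#   log w <- (1 - gamma) * log w + gamma * (log prior + N * avg batch log-lik),
# using only a small minibatch of data per iteration.
m = 2000
particles = rng.normal(0.0, np.sqrt(prior_var), size=m)
log_w = np.zeros(m)

batch_size = 50
for t in range(1, 201):
    gamma = 1.0 / t  # decaying step size (our choice, for illustration)
    batch = rng.choice(data, size=batch_size, replace=False)
    target = log_prior(particles) + N * avg_log_lik(particles, batch)
    log_w = (1.0 - gamma) * log_w + gamma * target

w = np.exp(log_w - log_w.max())
w /= w.sum()

est_mean = float(np.sum(w * particles))
print(est_mean, post_mean)  # the weighted particle mean tracks the exact posterior mean
```

With the step size gamma = 1/t, the log-weights are a running average of noisy evaluations of the log-posterior at each particle, so the weighted particles concentrate on the true posterior as iterations accumulate; the paper's contribution is a rigorous O(1/√m) KL guarantee for this style of scheme, which this toy script does not attempt to verify.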

| Original language | English (US) |
|---|---|
| Pages | 985-994 |
| Number of pages | 10 |
| State | Published - Jan 1 2016 |
| Event | 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016 - Cadiz, Spain. Duration: May 9 2016 → May 11 2016 |

### Conference

| Conference | 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016 |
|---|---|
| Country | Spain |
| City | Cadiz |
| Period | 5/9/16 → 5/11/16 |


### ASJC Scopus subject areas

- Artificial Intelligence
- Statistics and Probability

### Cite this

Dai, Bo; He, Niao; Dai, Hanjun; Song, Le. *Provable Bayesian inference via particle mirror descent*. Paper presented at the 19th International Conference on Artificial Intelligence and Statistics, AISTATS 2016, Cadiz, Spain, pp. 985-994.

Research output: Contribution to conference › Paper

TY - CONF

T1 - Provable Bayesian inference via particle mirror descent

AU - Dai, Bo

AU - He, Niao

AU - Dai, Hanjun

AU - Song, Le

PY - 2016/1/1

Y1 - 2016/1/1

N2 - Bayesian methods are appealing for their flexibility in modeling complex data and their ability to capture parameter uncertainty. However, when Bayes’ rule does not yield a tractable closed form, most approximate inference algorithms lack either scalability or rigorous guarantees. To tackle this challenge, we propose a simple yet provable algorithm, Particle Mirror Descent (PMD), to iteratively approximate the posterior density. PMD is inspired by stochastic functional mirror descent, where one descends in the density space using a small batch of data points at each iteration, and by particle filtering, where one uses samples to approximate a function. We prove a result of the first kind: with m particles, PMD provides a posterior density estimator that converges to the true posterior in KL-divergence at rate O(1/√m). We demonstrate the competitive empirical performance of PMD against several approximate inference algorithms on mixture models, logistic regression, sparse Gaussian processes, and latent Dirichlet allocation on large-scale datasets.

UR - http://www.scopus.com/inward/record.url?scp=85047007442&partnerID=8YFLogxK

UR - http://www.scopus.com/inward/citedby.url?scp=85047007442&partnerID=8YFLogxK

M3 - Paper

AN - SCOPUS:85047007442

SP - 985

EP - 994

ER -