Quantile Stein variational gradient descent for batch Bayesian optimization

Chengyue Gong, Jian Peng, Qiang Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Batch Bayesian optimization has been shown to be an efficient and successful approach for black-box function optimization, especially when the evaluation of the cost function is highly expensive but can be efficiently parallelized. In this paper, we introduce a novel variational framework for batch query optimization, based on the argument that the query batch should be selected to have both high diversity and good worst-case performance. This motivates us to introduce a variational objective that combines a quantile-based risk measure (for worst-case performance) and entropy regularization (for enforcing diversity). We derive a gradient-based particle optimization algorithm for solving our quantile-based variational objective, which generalizes Stein variational gradient descent (SVGD) by Liu & Wang (2016). We evaluate our method on a number of real-world applications, and show that it consistently outperforms other recent state-of-the-art batch Bayesian optimization methods.
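The abstract builds on the SVGD update of Liu & Wang (2016), which the paper generalizes with a quantile-based objective. As background, a minimal sketch of the standard SVGD particle update (not the paper's quantile variant) is shown below; the function names, the RBF kernel, the bandwidth `h`, and the step size `eps` are illustrative assumptions, not the authors' implementation:

```python
import numpy as np

def rbf_kernel(X, h=0.5):
    # Pairwise RBF kernel k(x_j, x_i) = exp(-||x_j - x_i||^2 / (2 h^2))
    # and its gradient with respect to the first argument x_j.
    diffs = X[:, None, :] - X[None, :, :]        # (n, n, d): diffs[j, i] = x_j - x_i
    sq_dists = np.sum(diffs ** 2, axis=-1)       # (n, n)
    K = np.exp(-sq_dists / (2 * h ** 2))         # (n, n)
    grad_K = -diffs / h ** 2 * K[:, :, None]     # (n, n, d): grad_{x_j} k(x_j, x_i)
    return K, grad_K

def svgd_step(X, score, eps=0.2, h=0.5):
    """One SVGD update on a particle set X of shape (n, d).

    phi(x_i) = (1/n) * sum_j [ k(x_j, x_i) * score(x_j) + grad_{x_j} k(x_j, x_i) ]

    where score(x) = grad log p(x). The first (attractive) term pulls particles
    toward high-density regions; the second (repulsive) term spreads them out,
    which is the diversity mechanism the batch-selection argument relies on.
    """
    n = X.shape[0]
    K, grad_K = rbf_kernel(X, h)
    S = score(X)                                 # (n, d) score at each particle
    phi = (K @ S + grad_K.sum(axis=0)) / n       # kernel-averaged update direction
    return X + eps * phi
```

Running this with the score of a standard normal target, `score = lambda X: -X`, drives an initially offset particle cloud toward the target while the kernel-gradient term keeps the particles from collapsing onto a single point.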

Original language: English (US)
Title of host publication: 36th International Conference on Machine Learning, ICML 2019
Publisher: International Machine Learning Society (IMLS)
Pages: 4212-4221
Number of pages: 10
ISBN (Electronic): 9781510886988
State: Published - 2019
Event: 36th International Conference on Machine Learning, ICML 2019 - Long Beach, United States
Duration: Jun 9 2019 - Jun 15 2019

Publication series

Name: 36th International Conference on Machine Learning, ICML 2019
Volume: 2019-June

Conference

Conference: 36th International Conference on Machine Learning, ICML 2019
Country: United States
City: Long Beach
Period: 6/9/19 - 6/15/19

ASJC Scopus subject areas

  • Education
  • Computer Science Applications
  • Human-Computer Interaction


  • Cite this

Gong, C., Peng, J., & Liu, Q. (2019). Quantile Stein variational gradient descent for batch Bayesian optimization. In 36th International Conference on Machine Learning, ICML 2019 (pp. 4212-4221). (36th International Conference on Machine Learning, ICML 2019; Vol. 2019-June). International Machine Learning Society (IMLS).