On Bayesian bounds

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


We show that several important Bayesian bounds studied in machine learning, in both the batch and the online setting, arise from an application of a simple compression lemma. In particular, we use the compression lemma to derive (i) PAC-Bayesian bounds in the batch setting, (ii) Bayesian log-loss bounds, and (iii) Bayesian bounded-loss bounds in the online setting. Although each setting has different semantics for the prior, the posterior, and the loss, we show that the core bound argument is the same. The paper simplifies our understanding of several important and apparently disparate results, and brings to light a powerful tool for developing similar arguments for other methods.
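The compression lemma underlying these bounds is commonly stated in its Donsker–Varadhan variational form; the sketch below is a standard statement of that inequality, offered as an illustration (the paper's exact formulation may differ in notation):

```latex
% Donsker--Varadhan form of the compression lemma (standard statement;
% an assumption about the form used, not a quote from the paper).
% For any measurable function \phi and distributions q, p with
% \mathrm{KL}(q \,\|\, p) finite:
\[
  \mathbb{E}_{x \sim q}\bigl[\phi(x)\bigr]
  \;-\;
  \log \mathbb{E}_{x \sim p}\bigl[e^{\phi(x)}\bigr]
  \;\le\;
  \mathrm{KL}(q \,\|\, p).
\]
```

Taking \(p\) as the prior, \(q\) as the posterior, and \(\phi\) as a suitably scaled (negated) loss is the usual route by which this single inequality yields PAC-Bayesian, log-loss, and bounded-loss bounds in their respective settings.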

Original language: English (US)
Title of host publication: ACM International Conference Proceeding Series - Proceedings of the 23rd International Conference on Machine Learning, ICML 2006
Number of pages: 8
State: Published - 2006
Externally published: Yes
Event: 23rd International Conference on Machine Learning, ICML 2006 - Pittsburgh, PA, United States
Duration: Jun 25, 2006 – Jun 29, 2006

Publication series

Name: ACM International Conference Proceeding Series


Other: 23rd International Conference on Machine Learning, ICML 2006
Country/Territory: United States
City: Pittsburgh, PA

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
  • Computer Vision and Pattern Recognition
  • Computer Networks and Communications

