Boosted Bayesian network classifiers

Yushi Jing, Vladimir Pavlović, James M. Rehg

Research output: Contribution to journal › Article › peer-review


The use of Bayesian networks for classification problems has received a significant amount of recent attention. Although computationally efficient, the standard maximum likelihood learning method tends to be suboptimal due to the mismatch between its optimization criterion (data likelihood) and the actual goal of classification (label prediction accuracy). Recent approaches to optimizing classification performance during parameter or structure learning show promise, but lack the favorable computational properties of maximum likelihood learning. In this paper we present boosted Bayesian network classifiers, a framework that combines discriminative data-weighting with generative training of intermediate models. We show that boosted Bayesian network classifiers encompass the basic generative models in isolation, but improve their classification performance when the model structure is suboptimal. We also demonstrate that structure learning is beneficial in the construction of boosted Bayesian network classifiers. On a large suite of benchmark datasets, this approach outperforms generative graphical models such as naive Bayes and TAN in classification accuracy. Boosted Bayesian network classifiers have comparable or better performance than other discriminatively trained graphical models, including ELR and BNC. Furthermore, boosted Bayesian networks require significantly less training time than the ELR and BNC algorithms.
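The core idea the abstract describes, generatively trained intermediate models combined through discriminative data-weighting, can be sketched as AdaBoost with a weighted naive Bayes weak learner. This is a minimal illustration, not the paper's implementation: all function names are hypothetical, features and labels are assumed binary, and the generative step is plain weighted maximum likelihood with Laplace smoothing.

```python
import math

def weighted_nb_train(X, y, w):
    # Generative step: weighted maximum-likelihood naive Bayes
    # (binary features/labels, Laplace-smoothed). The weights w
    # come from the boosting loop and sum to 1.
    n_feat = len(X[0])
    stats = {}
    for c in (0, 1):
        wc = sum(wi for wi, yi in zip(w, y) if yi == c)
        prior = wc / sum(w)
        cond = []
        for j in range(n_feat):
            w1 = sum(wi for wi, xi, yi in zip(w, X, y)
                     if yi == c and xi[j] == 1)
            cond.append((w1 + 1.0) / (wc + 2.0))  # P(x_j = 1 | y = c)
        stats[c] = (prior, cond)
    return stats

def nb_predict(stats, x):
    # MAP prediction under the naive Bayes factorization.
    scores = {}
    for c, (prior, cond) in stats.items():
        s = math.log(prior)
        for xj, p1 in zip(x, cond):
            s += math.log(p1 if xj == 1 else 1.0 - p1)
        scores[c] = s
    return max(scores, key=scores.get)

def boost_nb(X, y, rounds=5):
    # Discriminative step: AdaBoost-style reweighting between
    # generatively trained intermediate models.
    n = len(X)
    w = [1.0 / n] * n
    ensemble = []
    for _ in range(rounds):
        model = weighted_nb_train(X, y, w)
        preds = [nb_predict(model, x) for x in X]
        err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
        err = min(max(err, 1e-10), 1.0 - 1e-10)
        alpha = 0.5 * math.log((1.0 - err) / err)
        ensemble.append((alpha, model))
        # Upweight misclassified examples, then renormalize.
        w = [wi * math.exp(alpha if p != yi else -alpha)
             for wi, p, yi in zip(w, preds, y)]
        z = sum(w)
        w = [wi / z for wi in w]
    return ensemble

def boosted_predict(ensemble, x):
    # Weighted vote of the intermediate models.
    vote = sum(a * (1 if nb_predict(m, x) == 1 else -1)
               for a, m in ensemble)
    return 1 if vote > 0 else 0
```

The same loop would accept any generatively trained Bayesian network (e.g. TAN) as the weak learner, provided its parameter estimation accepts sample weights; that substitution is the setting the paper studies.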

Original language: English (US)
Pages (from-to): 155-184
Number of pages: 30
Journal: Machine Learning
Issue number: 2
State: Published - Nov 2008
Externally published: Yes


Keywords

  • AdaBoost
  • Bayesian network classifiers
  • Ensemble models
  • Structure learning

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence


