Greedy algorithms for classification - Consistency, convergence rates, and adaptivity

Shie Mannor, Ron Meir, Tong Zhang

Research output: Contribution to journal › Article › peer-review

Abstract

Many regression and classification algorithms proposed over the years can be described as greedy procedures for the stagewise minimization of an appropriate cost function. Some examples include additive models, matching pursuit, and boosting. In this work we focus on the classification problem, for which many recent algorithms have been proposed and applied successfully. For a specific regularized form of greedy stagewise optimization, we prove consistency of the approach under rather general conditions. Focusing on specific classes of problems we provide conditions under which our greedy procedure achieves the (nearly) minimax rate of convergence, implying that the procedure cannot be improved in a worst case setting. We also construct a fully adaptive procedure, which, without knowing the smoothness parameter of the decision boundary, converges at the same rate as if the smoothness parameter were known.
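The greedy stagewise scheme the abstract describes can be illustrated with a minimal sketch: at each round, pick the weak classifier (here, threshold "stumps" on 1-D data) and step size that most reduce the exponential loss, and shrink the step as a simple stand-in for regularization. This is a generic boosting-style illustration, not the paper's specific regularized procedure; all function names and the shrinkage choice are assumptions for the example.

```python
import numpy as np

def stump(threshold, sign):
    """Weak classifier: sign * (+1 if x > threshold else -1). Illustrative only."""
    return lambda x: sign * np.where(x > threshold, 1.0, -1.0)

def greedy_stagewise_fit(x, y, n_rounds=10, shrinkage=0.5):
    """Greedily minimize the exponential loss sum_i exp(-y_i * F(x_i)),
    adding one weighted weak learner per round (a boosting-style sketch;
    shrinkage stands in for the regularization discussed in the abstract)."""
    candidates = [stump(t, s) for t in np.unique(x) for s in (-1.0, 1.0)]
    F = np.zeros_like(y, dtype=float)          # current additive model on the data
    model = []                                  # list of (weight, weak_learner)
    for _ in range(n_rounds):
        best = None
        w = np.exp(-y * F)                      # per-example exponential-loss weights
        for h in candidates:
            hx = h(x)
            err = np.clip(np.sum(w[hx != y]) / np.sum(w), 1e-12, 1 - 1e-12)
            alpha = 0.5 * np.log((1 - err) / err)   # closed-form line search
            loss = np.sum(w * np.exp(-y * alpha * hx))
            if best is None or loss < best[0]:
                best = (loss, alpha, h)
        _, alpha, h = best
        alpha *= shrinkage                      # damp the greedy step
        model.append((alpha, h))
        F += alpha * h(x)
    return model

def predict(model, x):
    """Sign of the learned additive combination of weak classifiers."""
    return np.sign(sum(a * h(x) for a, h in model))
```

On a toy separable sample such as `x = [-2, -1, 1, 2]` with labels `[-1, -1, 1, 1]`, a few rounds of this loop recover a perfect classifier, since one stump already separates the data.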

Original language: English (US)
Pages (from-to): 713-742
Number of pages: 30
Journal: Journal of Machine Learning Research
Volume: 4
Issue number: 4
State: Published - May 15 2004
Externally published: Yes

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Statistics and Probability
  • Artificial Intelligence
