Covering Number Bounds of Certain Regularized Linear Function Classes

Research output: Contribution to journal › Article › peer-review

Abstract

Recently, sample complexity bounds have been derived for problems involving linear functions such as neural networks and support vector machines. In many of these theoretical studies, the concept of covering numbers played an important role. It is thus useful to study covering numbers for linear function classes. In this paper, we investigate two closely related methods to derive upper bounds on these covering numbers. The first method, already employed in some earlier studies, relies on the so-called Maurey's lemma; the second method uses techniques from the mistake bound framework in online learning. We compare results from these two methods, as well as their consequences in some learning formulations.
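For reference (not part of the original abstract), the covering number studied in the paper can be illustrated with the standard definition below; the symbols F, d, and ε are generic notation chosen here for illustration, and conventions vary slightly (e.g., whether the centers must lie in F):

\[
  N(\epsilon, \mathcal{F}, d)
  \;=\;
  \min\Bigl\{ m \in \mathbb{N} \;:\;
    \exists\, f_1, \dots, f_m \text{ such that }
    \sup_{f \in \mathcal{F}} \, \min_{1 \le i \le m} d(f, f_i) \le \epsilon
  \Bigr\},
\]

that is, the smallest number of ε-balls in the metric d needed to cover the class F. The paper derives upper bounds on this quantity for regularized (norm-constrained) linear function classes.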

Original language: English (US)
Pages (from-to): 527-550
Number of pages: 24
Journal: Journal of Machine Learning Research
Volume: 2
Issue number: 3
DOIs:
State: Published - 2002
Externally published: Yes

Keywords

  • Covering Numbers
  • Learning Sample Complexity
  • Mistake Bounds
  • Sparse Approximation

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Statistics and Probability
  • Artificial Intelligence
