Agnostic active learning without constraints

Alina Beygelzimer, Daniel Hsu, John Langford, Tong Zhang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We present and analyze an agnostic active learning algorithm that works without keeping a version space. This is unlike all previous approaches where a restricted set of candidate hypotheses is maintained throughout learning, and only hypotheses from this set are ever returned. By avoiding this version space approach, our algorithm sheds the computational burden and brittleness associated with maintaining version spaces, yet still allows for substantial improvements over supervised learning for classification.
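The paper's algorithm belongs to the importance-weighted active-learning family: rather than pruning a version space, each unlabeled example is queried with some probability, and queried labels are reweighted so the resulting sample stays unbiased. The following is a minimal illustrative sketch of that general idea only, not the paper's specific algorithm; the `query_prob` function and `label_oracle` are hypothetical stand-ins.

```python
import random

def importance_weighted_stream(stream, query_prob, label_oracle):
    """Generic importance-weighted label-querying loop (illustrative sketch).

    For each unlabeled example x, request its label with probability
    p = query_prob(x); a queried example gets weight 1/p, so the
    weighted labeled sample is an unbiased stand-in for labeling
    the entire stream. No version space is maintained.
    """
    labeled = []
    for x in stream:
        p = query_prob(x)  # assumed to lie in (0, 1]
        if random.random() < p:
            # weight 1/p corrects for the reduced sampling rate
            labeled.append((x, label_oracle(x), 1.0 / p))
    return labeled

# toy usage: query every point with probability 0.5
random.seed(0)
sample = importance_weighted_stream(
    range(100),
    query_prob=lambda x: 0.5,
    label_oracle=lambda x: x % 2,
)
```

In the paper itself, the query probability is chosen adaptively from observed errors so that informative points are queried more often, which is where the label-complexity savings over supervised learning come from.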

Original language: English (US)
Title of host publication: Advances in Neural Information Processing Systems 23
Subtitle of host publication: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010
Publisher: Neural Information Processing Systems
ISBN (Print): 9781617823800
State: Published - 2010
Externally published: Yes
Event: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010 - Vancouver, BC, Canada
Duration: Dec 6 2010 - Dec 9 2010

Publication series

Name: Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010

Conference

Conference: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010
Country/Territory: Canada
City: Vancouver, BC
Period: 12/6/10 - 12/9/10

ASJC Scopus subject areas

  • Information Systems
