TY - GEN
T1 - Agnostic active learning without constraints
AU - Beygelzimer, Alina
AU - Hsu, Daniel
AU - Langford, John
AU - Zhang, Tong
PY - 2010
Y1 - 2010
N2 - We present and analyze an agnostic active learning algorithm that works without keeping a version space. This is unlike all previous approaches where a restricted set of candidate hypotheses is maintained throughout learning, and only hypotheses from this set are ever returned. By avoiding this version space approach, our algorithm sheds the computational burden and brittleness associated with maintaining version spaces, yet still allows for substantial improvements over supervised learning for classification.
UR - http://www.scopus.com/inward/record.url?scp=85161966389&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85161966389&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85161966389
SN - 9781617823800
T3 - Advances in Neural Information Processing Systems 23: 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010
BT - Advances in Neural Information Processing Systems 23
PB - Neural Information Processing Systems
T2 - 24th Annual Conference on Neural Information Processing Systems 2010, NIPS 2010
Y2 - 6 December 2010 through 9 December 2010
ER -