TY - GEN
T1 - Lower bounds for passive and active learning
AU - Raginsky, Maxim
AU - Rakhlin, Alexander
PY - 2011
Y1 - 2011
N2 - We develop unified information-theoretic machinery for deriving lower bounds for passive and active learning schemes. Our bounds involve the so-called Alexander's capacity function. The supremum of this function has recently been rediscovered by Hanneke in the context of active learning under the name of "disagreement coefficient." For passive learning, our lower bounds match the upper bounds of Giné and Koltchinskii up to constants and generalize analogous results of Massart and Nédélec. For active learning, we provide the first known lower bounds based on the capacity function rather than the disagreement coefficient.
AB - We develop unified information-theoretic machinery for deriving lower bounds for passive and active learning schemes. Our bounds involve the so-called Alexander's capacity function. The supremum of this function has recently been rediscovered by Hanneke in the context of active learning under the name of "disagreement coefficient." For passive learning, our lower bounds match the upper bounds of Giné and Koltchinskii up to constants and generalize analogous results of Massart and Nédélec. For active learning, we provide the first known lower bounds based on the capacity function rather than the disagreement coefficient.
UR - http://www.scopus.com/inward/record.url?scp=85162323750&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85162323750&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85162323750
SN - 9781618395993
T3 - Advances in Neural Information Processing Systems 24: 25th Annual Conference on Neural Information Processing Systems 2011, NIPS 2011
BT - Advances in Neural Information Processing Systems 24
PB - Neural Information Processing Systems
T2 - 25th Annual Conference on Neural Information Processing Systems 2011, NIPS 2011
Y2 - 12 December 2011 through 14 December 2011
ER -