Hierarchical active transfer learning

David Kale, Marjan Ghazvininejad, Anil Ramakrishna, Jingrui He, Yan Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


We describe a unified active transfer learning framework called Hierarchical Active Transfer Learning (HATL). HATL exploits cluster structure shared between different data domains to perform transfer learning by imputing labels for unlabeled target data and to generate effective label queries during active learning. The resulting framework is flexible enough to perform not only adaptive transfer learning and accelerated active learning but also unsupervised and semi-supervised transfer learning. We derive an intuitive and useful upper bound on HATL's error when used to infer labels for unlabeled target points. We also present results on synthetic data that confirm both intuition and our analysis. Finally, we demonstrate HATL's empirical effectiveness on a benchmark data set for sentiment classification.
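The paper itself does not include code, but the core idea sketched in the abstract (exploit cluster structure shared across domains, impute labels inside agreeing clusters, and query labels where clusters are mixed) can be illustrated with a minimal, hypothetical sketch. This is not the authors' algorithm: the clustering method (Ward linkage), the cluster count, and the query rule here are illustrative assumptions.

```python
# Hypothetical sketch in the spirit of HATL, NOT the authors' implementation.
# Pool source and target points, hierarchically cluster them, impute labels
# onto target points in clusters whose labeled (source) members agree, and
# queue a label query from clusters that remain mixed or purely unlabeled.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def hatl_sketch(X_src, y_src, X_tgt, n_clusters=4):
    X = np.vstack([X_src, X_tgt])
    Z = linkage(X, method="ward")                     # hierarchical clustering
    cluster_ids = fcluster(Z, t=n_clusters, criterion="maxclust")
    n_src = len(X_src)
    y_tgt = np.full(len(X_tgt), -1)                   # -1 marks "unlabeled"
    queries = []                                      # target indices to query
    for c in np.unique(cluster_ids):
        members = np.where(cluster_ids == c)[0]
        src_labels = {y_src[i] for i in members if i < n_src}
        tgt_members = members[members >= n_src] - n_src
        if len(src_labels) == 1:
            # "Pure" cluster: transfer the shared source label to target points.
            y_tgt[tgt_members] = src_labels.pop()
        elif len(tgt_members) > 0:
            # Mixed or source-free cluster: request a label for one target point.
            queries.append(int(tgt_members[0]))
    return y_tgt, queries
```

On well-separated synthetic data (as in the paper's synthetic experiments), pure clusters let the target set inherit labels without any queries, while overlapping regions generate queries instead of imputed labels.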

Original language: English (US)
Title of host publication: SIAM International Conference on Data Mining 2015, SDM 2015
Editors: Suresh Venkatasubramanian, Jieping Ye
Publisher: Society for Industrial and Applied Mathematics Publications
Number of pages: 9
ISBN (Electronic): 9781510811522
State: Published - 2015
Externally published: Yes
Event: SIAM International Conference on Data Mining 2015, SDM 2015 - Vancouver, Canada
Duration: Apr 30, 2015 - May 2, 2015

Publication series

Name: SIAM International Conference on Data Mining 2015, SDM 2015

ASJC Scopus subject areas

  • Computational Theory and Mathematics
  • Computer Vision and Pattern Recognition
  • Software

