Active Learning on Heterogeneous Information Networks: A Multi-armed Bandit Approach

Doris Xin, Ahmed El-Kishky, De Liao, Brandon Norick, Jiawei Han

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Active learning exploits inherent structures in the unlabeled data to minimize the number of labels required to train an accurate model. It enables effective machine learning in applications with high labeling cost, such as document classification and drug response prediction. We investigate active learning on heterogeneous information networks, with the objective of obtaining accurate node classifications while minimizing the number of labeled nodes. Our proposed algorithm harnesses a multi-armed bandit (MAB) algorithm to determine network structures that identify the nodes most important to the classification task, accounting for node types and without assuming label assortativity. Evaluations on real-world network classification tasks demonstrate that our algorithm outperforms existing methods independent of the underlying classification model.
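To make the abstract's idea concrete, here is a minimal sketch of bandit-driven active learning: each bandit arm is a node-selection strategy (standing in for the paper's network structures), and an arm's reward is the change in model accuracy after labeling the node it proposed. This is an illustration under simplifying assumptions, not the paper's actual algorithm; `UCB1`, `run_active_learning`, `threshold_model`, and both strategies are hypothetical names for this sketch, and the graph-structure scoring is replaced by a scalar feature.

```python
import math
import random

class UCB1:
    """UCB1 multi-armed bandit: plays each arm once, then balances
    exploration against the arm with the highest average reward."""
    def __init__(self, n_arms):
        self.counts = [0] * n_arms    # times each arm was played
        self.values = [0.0] * n_arms  # running mean reward per arm
        self.total = 0                # total plays

    def select(self):
        for arm, c in enumerate(self.counts):
            if c == 0:  # ensure every arm is tried at least once
                return arm
        return max(range(len(self.counts)),
                   key=lambda a: self.values[a]
                   + math.sqrt(2 * math.log(self.total) / self.counts[a]))

    def update(self, arm, reward):
        self.total += 1
        self.counts[arm] += 1
        # incremental mean update
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

def run_active_learning(features, labels, strategies, budget, eval_model):
    """Each arm is a node-selection strategy; its reward is the change
    in accuracy after labeling the node that strategy picked."""
    bandit = UCB1(len(strategies))
    labeled, unlabeled = [], list(range(len(features)))
    prev_acc = 0.0
    for _ in range(budget):
        arm = bandit.select()
        node = strategies[arm](unlabeled, features)
        unlabeled.remove(node)
        labeled.append(node)
        acc = eval_model(labeled, features, labels)
        bandit.update(arm, acc - prev_acc)  # reward = accuracy gain
        prev_acc = acc
    return labeled, bandit

# --- synthetic demo: scalar node features, label = feature > 0.5 ---
random.seed(7)
feats = [random.random() for _ in range(200)]
labs = [int(f > 0.5) for f in feats]

def threshold_model(labeled, features, labels):
    """Toy classifier: threshold halfway between the labeled classes."""
    pos = [features[i] for i in labeled if labels[i] == 1]
    neg = [features[i] for i in labeled if labels[i] == 0]
    if not pos or not neg:
        return 0.5  # cannot fit a threshold yet
    t = (min(pos) + max(neg)) / 2
    preds = [int(f > t) for f in features]
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

# Two hypothetical strategies: uncertainty-style (nearest the decision
# boundary) versus uniform random selection.
strats = [
    lambda un, fe: min(un, key=lambda i: abs(fe[i] - 0.5)),
    lambda un, fe: random.choice(un),
]
chosen, bandit = run_active_learning(feats, labs, strats, 20, threshold_model)
```

In the paper's setting the arms would instead correspond to type-aware network structures that rank nodes by importance, but the bandit loop above, pick an arm, label its node, reward by model improvement, is the same shape.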

Original language: English (US)
Title of host publication: 2018 IEEE International Conference on Data Mining, ICDM 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 1350-1355
Number of pages: 6
ISBN (Electronic): 9781538691588
DOIs
State: Published - Dec 27, 2018
Event: 18th IEEE International Conference on Data Mining, ICDM 2018 - Singapore, Singapore
Duration: Nov 17, 2018 - Nov 20, 2018

Publication series

Name: Proceedings - IEEE International Conference on Data Mining, ICDM
Volume: 2018-November
ISSN (Print): 1550-4786

Conference

Conference: 18th IEEE International Conference on Data Mining, ICDM 2018
Country/Territory: Singapore
City: Singapore
Period: 11/17/18 - 11/20/18

Keywords

  • Active learning
  • Heterogeneous information networks
  • Multi-armed bandit

ASJC Scopus subject areas

  • General Engineering
