Dynamic transfer learning with progressive meta-task scheduler

Jun Wu, Jingrui He

Research output: Contribution to journal › Article › peer-review


Dynamic transfer learning refers to transferring knowledge from a static source task with adequate label information to a dynamic target task with little or no label information. However, most existing theoretical studies and practical algorithms for dynamic transfer learning assume that the target task is continuously and smoothly evolving over time. This strong assumption is often violated in real-world applications, where the target distribution may instead change abruptly at some time stamp. To address this problem, we propose L2S, a novel meta-learning framework built on a progressive meta-task scheduler for dynamic transfer learning. The key idea of L2S is to incrementally learn to schedule meta-pairs of tasks and then learn an optimal model initialization from those meta-pairs for fast adaptation to the newest target task. The effectiveness of our L2S framework is verified both theoretically and empirically.
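The abstract's two-step recipe (schedule meta-pairs of tasks, then learn an initialization for fast adaptation) can be illustrated with a minimal sketch. Note the assumptions: the paper does not publish its scheduler here, so `schedule_meta_pairs` below is a simple recency-window heuristic standing in for L2S's learned scheduler, the task stream is a toy sequence of 1-D linear regression tasks whose slope drifts and then jumps abruptly, and the initialization is learned with a first-order MAML-style inner/outer loop rather than the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dynamic target: a stream of 1-D regression tasks whose slope drifts
# smoothly for five steps, then changes abruptly (the case the paper targets).
slopes = [1.0 + 0.1 * t for t in range(5)] + [4.0, 4.1, 4.2]

def make_task(slope, n=32):
    x = rng.uniform(-1, 1, size=(n, 1))
    return x, slope * x

tasks = [make_task(s) for s in slopes]

def loss_grad(w, x, y):
    """Squared-error loss and gradient for a linear model y_hat = x @ w."""
    err = x @ w - y
    return float(np.mean(err ** 2)), 2 * (x.T @ err) / len(x)

def schedule_meta_pairs(num_tasks, window=3):
    """Hypothetical stand-in for the progressive scheduler: pair each of the
    most recent tasks with its successor, so meta-training emphasizes the
    newest part of the task stream."""
    start = max(0, num_tasks - 1 - window)
    return [(i, i + 1) for i in range(start, num_tasks - 1)]

def meta_train(tasks, inner_lr=0.1, outer_lr=0.05, steps=200):
    """First-order MAML-style loop: adapt on the earlier task of each
    meta-pair, evaluate on the later one, and move the shared
    initialization toward fast adaptation."""
    w0 = np.zeros((1, 1))
    pairs = schedule_meta_pairs(len(tasks))
    for _ in range(steps):
        for src, tgt in pairs:
            xs, ys = tasks[src]
            xt, yt = tasks[tgt]
            _, g = loss_grad(w0, xs, ys)
            w_adapted = w0 - inner_lr * g            # one inner-loop step
            _, g_outer = loss_grad(w_adapted, xt, yt)
            w0 = w0 - outer_lr * g_outer             # first-order outer step
    return w0

w0 = meta_train(tasks)

# Fast adaptation to the newest target task from the learned initialization.
x_new, y_new = tasks[-1]
loss_before, g = loss_grad(w0, x_new, y_new)
w_fast = w0 - 0.1 * g
loss_after, _ = loss_grad(w_fast, x_new, y_new)
```

Because the scheduler restricts meta-pairs to a recent window, the learned initialization already sits close to the post-shift tasks, so a single gradient step on the newest task suffices to adapt.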

Original language: English (US)
Article number: 1052972
Journal: Frontiers in Big Data
State: Published - Nov 3 2022


Keywords

  • distribution shift
  • dynamic environment
  • image classification
  • meta-learning
  • task scheduler
  • transfer learning

ASJC Scopus subject areas

  • Computer Science (miscellaneous)
  • Artificial Intelligence
  • Information Systems


