Efficient multitask feature and relationship learning

Han Zhao, Otilia Stretcu, Alexander J. Smola, Geoffrey J. Gordon

Research output: Contribution to conference › Paper › peer-review


We consider a multitask learning problem in which several predictors are learned jointly. Prior research has shown that learning the relations between tasks, and between the input features, together with the predictors can lead to better generalization and interpretability, which have proven useful in many application domains. In this paper, we consider a formulation of multitask learning that learns the relationships both between tasks and between features, represented by a task covariance matrix and a feature covariance matrix, respectively. First, we demonstrate that existing methods proposed for this problem suffer from an issue that may lead to ill-posed optimization. We then propose an alternative formulation, along with an efficient algorithm to optimize it. Using ideas from optimization and graph theory, we derive a coordinate-wise minimization algorithm with a closed-form solution for each block subproblem. Our experiments show that the proposed optimization method is orders of magnitude faster than its competitors. We also provide a nonlinear extension that achieves better generalization than existing methods.
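The structure described above can be illustrated with a small alternating-minimization sketch. Note this is an assumption-laden illustration, not the authors' exact algorithm or objective: the squared loss, the matrix-normal regularizer `tr(Sigma^-1 W Omega^-1 W^T)`, the log-det terms, their weights, and the ridge term `eps` are all choices made here so that each block subproblem has a closed form, mirroring the coordinate-wise scheme the abstract describes.

```python
import numpy as np
from scipy.linalg import solve_sylvester

def mtfrl_alternating(X, Y, lam=1.0, n_iter=20, eps=1e-3):
    """Block coordinate minimization for an illustrative objective
    (an assumption for this sketch, not the paper's exact formulation):

        ||Y - X W||_F^2 + lam * [ tr(Sigma^-1 W Omega^-1 W^T)
                                  + m*log det Sigma + d*log det Omega ]

    X: (n, d) inputs; Y: (n, m) targets, one column per task.
    Sigma is the d x d feature covariance, Omega the m x m task covariance.
    """
    n, d = X.shape
    m = Y.shape[1]
    Sigma, Omega = np.eye(d), np.eye(m)
    XtX, XtY = X.T @ X, X.T @ Y
    W = None
    for _ in range(n_iter):
        # W-step: stationarity gives XtX W + lam Sigma^-1 W Omega^-1 = XtY;
        # left-multiplying by Sigma turns this into a Sylvester equation
        # A W + W B = Q, solved in closed form.
        Omega_inv = np.linalg.inv(Omega)
        W = solve_sylvester(Sigma @ XtX, lam * Omega_inv, Sigma @ XtY)
        # Sigma-step: argmin tr(Sigma^-1 W Omega^-1 W^T) + m log det Sigma
        # has the closed form Sigma = W Omega^-1 W^T / m; the eps ridge is
        # added here only for numerical stability (an assumption).
        Sigma = (W @ Omega_inv @ W.T) / m + eps * np.eye(d)
        # Omega-step: the symmetric closed form in the task dimension.
        Omega = (W.T @ np.linalg.inv(Sigma) @ W) / d + eps * np.eye(m)
    return W, Sigma, Omega
```

Each block update is a closed-form solve (the W-step via a Sylvester equation, the two covariance steps via matrix products), which is the kind of per-block structure that makes coordinate-wise minimization fast; a nonlinear extension would replace `X` with a feature map.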

Original language: English (US)
State: Published - 2019
Externally published: Yes
Event: 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019 - Tel Aviv, Israel
Duration: Jul 22 2019 – Jul 25 2019


Conference: 35th Conference on Uncertainty in Artificial Intelligence, UAI 2019
City: Tel Aviv

ASJC Scopus subject areas

  • Artificial Intelligence

