Multi-agent dual learning

Yiren Wang, Yingce Xia, Tianyu He, Fei Tian, Tao Qin, Chengxiang Zhai, Tie Yan Liu

Research output: Contribution to conference › Paper

Abstract

Dual learning has attracted much attention in the machine learning, computer vision and natural language processing communities. The core idea of dual learning is to leverage the duality between the primal task (mapping from domain X to domain Y) and the dual task (mapping from domain Y to domain X) to boost the performance of both tasks. The existing dual learning framework forms a system with two agents (one primal model and one dual model) to utilize such duality. In this paper, we extend this framework by introducing multiple primal and dual models, and propose the multi-agent dual learning framework. Experiments on neural machine translation and image translation tasks demonstrate the effectiveness of the new framework. In particular, we set a new record on IWSLT 2014 German-to-English translation with a 35.44 BLEU score, achieve a 31.03 BLEU score on WMT 2014 English-to-German translation with over 2.6 BLEU improvement over the strong Transformer baseline, and set a new record of 49.61 BLEU on the recent WMT 2018 English-to-German translation.
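As a rough illustration of the duality signal described in the abstract (a toy numeric sketch, not the authors' implementation — the agent functions, noise levels, and loss name below are all hypothetical), the reconstruction feedback for a primal model can be averaged over several dual agents, which is the core intuition behind using multiple models:

```python
import random

# Toy "agents": a primal model maps X -> Y, a dual model maps Y -> X.
# Translation is stood in for by a noisy invertible function on numbers,
# purely to show the averaged duality signal.

def make_primal(noise):
    """Primal agent f: X -> Y (here, roughly y = 2x)."""
    return lambda x: 2 * x + random.gauss(0, noise)

def make_dual(noise):
    """Dual agent g: Y -> X (here, roughly x = y / 2)."""
    return lambda y: y / 2 + random.gauss(0, noise)

def duality_loss(primal, duals, x):
    """Reconstruction error of x after a round trip through the primal
    model and the *average* feedback of several dual agents."""
    y_hat = primal(x)
    recon = sum(g(y_hat) for g in duals) / len(duals)
    return (recon - x) ** 2

random.seed(0)
primal = make_primal(noise=0.1)
duals = [make_dual(noise=0.5) for _ in range(5)]   # multiple dual agents
single = [duals[0]]                                # two-agent baseline

x = 3.0
print("multi-agent loss :", duality_loss(primal, duals, x))
print("single-agent loss:", duality_loss(primal, single, x))
```

Averaging the feedback of several (imperfect) dual agents reduces the variance of the reconstruction signal the primal model is trained against, which mirrors the paper's motivation for moving from two agents to many.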

Original language: English (US)
State: Published - Jan 1 2019
Event: 7th International Conference on Learning Representations, ICLR 2019 - New Orleans, United States
Duration: May 6 2019 - May 9 2019

Conference

Conference: 7th International Conference on Learning Representations, ICLR 2019
Country: United States
City: New Orleans
Period: 5/6/19 - 5/9/19


ASJC Scopus subject areas

  • Education
  • Computer Science Applications
  • Linguistics and Language
  • Language and Linguistics

Cite this

Wang, Y., Xia, Y., He, T., Tian, F., Qin, T., Zhai, C., & Liu, T. Y. (2019). Multi-agent dual learning. Paper presented at 7th International Conference on Learning Representations, ICLR 2019, New Orleans, United States.

