Self-supervised attention flow for dialogue state tracking

Boyuan Pan, Yazheng Yang, Bo Li, Deng Cai

Research output: Contribution to journal › Article › peer-review


The performance of existing approaches for dialogue state tracking (DST) is often limited by the scarcity of labeled datasets, and inefficient utilization of data is also a practical yet challenging problem for the DST task. In this paper, we aim to tackle these challenges in a self-supervised manner by introducing an auxiliary pre-training task that learns to pick the correct dialogue response from a group of candidates. Moreover, we propose an attention flow mechanism augmented with a dynamically applied soft-threshold function to better capture the user intent and filter out redundant information. Extensive experiments on the multi-domain dialogue state tracking dataset MultiWOZ 2.1 demonstrate the effectiveness of our proposed method, and we also show that it is able to adapt to zero/few-shot cases under the proposed self-supervised framework.
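The abstract describes augmenting attention with a soft-threshold function to filter out low-relevance (redundant) information. A minimal sketch of that idea, with NumPy: attention weights below a threshold are shrunk to zero and the remainder renormalized. The function and parameter names (`soft_threshold_attention`, `tau`) are illustrative assumptions, not the paper's API, and `tau` is fixed here whereas the paper sets the threshold dynamically.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - np.max(x, axis=axis, keepdims=True))
    return e / np.sum(e, axis=axis, keepdims=True)

def soft_threshold_attention(query, keys, values, tau=0.1):
    """Scaled dot-product attention with a soft-threshold on the weights.

    Illustrative sketch only: weights below tau are shrunk to zero
    (w -> max(w - tau, 0)) and the survivors are renormalized, so
    low-relevance tokens contribute nothing to the context vector.
    The paper sets the threshold dynamically; tau is fixed here.
    """
    scores = query @ keys.T / np.sqrt(keys.shape[-1])   # (num_keys,)
    weights = softmax(scores)
    weights = np.maximum(weights - tau, 0.0)            # soft shrinkage
    weights = weights / (weights.sum(axis=-1, keepdims=True) + 1e-12)
    return weights @ values, weights

# Toy usage: one query against six key/value vectors of dimension 4.
rng = np.random.default_rng(0)
q = rng.normal(size=(4,))
K = rng.normal(size=(6, 4))
V = rng.normal(size=(6, 4))
ctx, w = soft_threshold_attention(q, K, V, tau=0.1)
print(w)  # sparse-ish weight vector; entries below tau are zeroed
```

Because the weights sum to one before shrinkage, at least one weight always exceeds `1/num_keys`, so with a modest `tau` the renormalization is well defined.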

Original language: English (US)
Pages (from-to): 279-286
Number of pages: 8
State: Published - Jun 14 2021


Keywords

  • Attention mechanism
  • Dialogue state tracking
  • Self-supervised learning

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence
