Describing a knowledge base

Qingyun Wang, Xiaoman Pan, Lifu Huang, Boliang Zhang, Zhiying Jiang, Heng Ji, Kevin Knight

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

We aim to automatically generate natural language descriptions of an input structured knowledge base (KB). We build our generation framework on a pointer network that can copy facts from the input KB, and add two attention mechanisms: (i) slot-aware attention to capture the association between a slot type and its corresponding slot value; and (ii) a new table position self-attention to capture the inter-dependencies among related slots. For evaluation, besides standard metrics including BLEU, METEOR, and ROUGE, we propose a KB reconstruction-based metric: we extract a KB from the generation output and compare it with the input KB. We also create a new data set which includes 106,216 pairs of structured KBs and their corresponding natural language descriptions for two distinct entity types. Experiments show that our approach significantly outperforms state-of-the-art methods. The reconstructed KB achieves 68.8% - 72.6% F-score.
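The KB reconstruction metric lends itself to a small illustration. The sketch below is a hypothetical rendering, not the authors' implementation: it assumes each fact is a (slot type, slot value) pair and that a separate extractor has already turned the generated description into such pairs; it then scores the extracted facts against the input KB with precision, recall, and F-score.

```python
# Hypothetical sketch of a KB reconstruction-based metric: compare facts
# extracted from the generated text against the facts in the input KB.
# The text-to-fact extraction step itself is assumed to be given.

def kb_f_score(input_kb, reconstructed_kb):
    """Precision, recall, and F-score over (slot_type, slot_value) facts."""
    gold = set(input_kb)
    pred = set(reconstructed_kb)
    if not gold or not pred:
        return 0.0, 0.0, 0.0
    overlap = gold & pred
    precision = len(overlap) / len(pred)
    recall = len(overlap) / len(gold)
    f_score = (2 * precision * recall / (precision + recall)
               if precision + recall else 0.0)
    return precision, recall, f_score

# Toy example (facts are illustrative, not from the data set):
gold_facts = [("birth_place", "Tilburg"), ("occupation", "researcher")]
pred_facts = [("birth_place", "Tilburg"), ("occupation", "writer")]
print(kb_f_score(gold_facts, pred_facts))  # (0.5, 0.5, 0.5)
```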

Original language: English (US)
Title of host publication: INLG 2018 - 11th International Natural Language Generation Conference, Proceedings of the Conference
Publisher: Association for Computational Linguistics (ACL)
Pages: 10-21
Number of pages: 12
ISBN (Electronic): 9781948087865
State: Published - 2018
Externally published: Yes
Event: 11th International Natural Language Generation Conference, INLG 2018 - Tilburg, Netherlands
Duration: Nov 5 2018 - Nov 8 2018

Publication series

Name: INLG 2018 - 11th International Natural Language Generation Conference, Proceedings of the Conference

Conference

Conference: 11th International Natural Language Generation Conference, INLG 2018
Country: Netherlands
City: Tilburg
Period: 11/5/18 - 11/8/18

ASJC Scopus subject areas

  • Software
