TY - CONF
T1 - TILM: Neural Language Models with Evolving Topical Influence
T2 - 23rd Conference on Computational Natural Language Learning, CoNLL 2019
AU - Santu, Shubhra Kanti Karmaker
AU - Veeramachaneni, Kalyan
AU - Zhai, ChengXiang
N1 - Publisher Copyright:
© 2019 Association for Computational Linguistics.
PY - 2019
Y1 - 2019
AB - The content of text data is often influenced by contextual factors that evolve over time (e.g., the content of social media is often influenced by topics covered in major news streams). Existing language models do not consider the influence of such related evolving topics and are thus suboptimal. In this paper, we propose to incorporate such topical influence into a language model, both to improve its accuracy and to enable cross-stream analysis of topical influences. Specifically, we propose the Topical Influence Language Model (TILM), a novel extension of a neural language model that captures how the content of one text stream is influenced by the evolving topics in another related (or possibly the same) text stream. Experimental results on six text stream datasets composed of conference paper titles show that incorporating evolving topical influence into a language model is beneficial, and that TILM outperforms multiple baselines on the challenging task of text forecasting. Beyond serving as a language model, TILM also enables interesting analysis of topical influence among multiple text streams.
UR - http://www.scopus.com/inward/record.url?scp=85084341890&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85084341890&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85084341890
T3 - CoNLL 2019 - 23rd Conference on Computational Natural Language Learning, Proceedings of the Conference
SP - 778
EP - 788
BT - CoNLL 2019 - 23rd Conference on Computational Natural Language Learning, Proceedings of the Conference
PB - Association for Computational Linguistics
Y2 - 3 November 2019 through 4 November 2019
ER -