TY - GEN
T1 - EventKE
T2 - 2021 Findings of the Association for Computational Linguistics, Findings of ACL: EMNLP 2021
AU - Zhang, Zixuan
AU - Wang, Hongwei
AU - Zhao, Han
AU - Tong, Hanghang
AU - Ji, Heng
N1 - Funding Information:
This research is based upon work supported by U.S. DARPA KAIROS Program No. FA8750-19-2-1004, U.S. DARPA AIDA Program No. FA8750-18-2-0014, Air Force No. FA8650-17-C-7715, LORELEI Program No. HR0011-15-C-0115, the Office of the Director of National Intelligence (ODNI), Intelligence Advanced Research Projects Activity (IARPA), via contract No. FA8650-17-C-9116. The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official policies, either expressed or implied, of DARPA, ODNI, IARPA, or the U.S. Government. The U.S. Government is authorized to reproduce and distribute reprints for governmental purposes notwithstanding any copyright annotation therein.
Publisher Copyright:
© 2021 Association for Computational Linguistics.
PY - 2021
Y1 - 2021
N2 - Relations in most traditional knowledge graphs (KGs) reflect only static, factual connections and fail to represent the dynamic activities and state changes of entities. In this paper, we emphasize the importance of incorporating events in KG representation learning and propose an event-enhanced KG embedding model, EventKE. Specifically, given the original KG, we first incorporate event nodes by building a heterogeneous network, where entity nodes and event nodes are distributed on the two sides of the network, interconnected by event argument links. We then use entity-entity relations from the original KG and event-event temporal links to connect entity nodes and event nodes among themselves, respectively. We design a novel and effective attention-based message passing method, conducted over entity-entity, event-entity, and event-event relations, to fuse event information into the KG embeddings. Experimental results on real-world datasets demonstrate that events can greatly improve the quality of KG embeddings on multiple downstream tasks.
AB - Relations in most traditional knowledge graphs (KGs) reflect only static, factual connections and fail to represent the dynamic activities and state changes of entities. In this paper, we emphasize the importance of incorporating events in KG representation learning and propose an event-enhanced KG embedding model, EventKE. Specifically, given the original KG, we first incorporate event nodes by building a heterogeneous network, where entity nodes and event nodes are distributed on the two sides of the network, interconnected by event argument links. We then use entity-entity relations from the original KG and event-event temporal links to connect entity nodes and event nodes among themselves, respectively. We design a novel and effective attention-based message passing method, conducted over entity-entity, event-entity, and event-event relations, to fuse event information into the KG embeddings. Experimental results on real-world datasets demonstrate that events can greatly improve the quality of KG embeddings on multiple downstream tasks.
UR - http://www.scopus.com/inward/record.url?scp=85129144330&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85129144330&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85129144330
T3 - Findings of the Association for Computational Linguistics, Findings of ACL: EMNLP 2021
SP - 1389
EP - 1400
BT - Findings of the Association for Computational Linguistics, Findings of ACL
A2 - Moens, Marie-Francine
A2 - Huang, Xuanjing
A2 - Specia, Lucia
A2 - Yih, Scott Wen-Tau
PB - Association for Computational Linguistics (ACL)
Y2 - 7 November 2021 through 11 November 2021
ER -