TY - GEN
T1 - Towards Efficient Temporal Graph Learning
T2 - 33rd ACM International Conference on Information and Knowledge Management, CIKM 2024
AU - Wang, Ruijie
AU - Zhao, Wanyu
AU - Sun, Dachun
AU - Mendis, Charith
AU - Abdelzaher, Tarek
N1 - This work was sponsored by DARPA award HR001121C0165, DARPA award HR00112290105, DoD Basic Research Office award HQ00342110002, the Army Research Laboratory under Cooperative Agreement W911NF-17-20196, and NSF award CCF-2316233. It was also supported in part by ACE, one of the seven centers in JUMP 2.0, a Semiconductor Research Corporation (SRC) program sponsored by DARPA.
PY - 2024/10/21
Y1 - 2024/10/21
N2 - Temporal graphs capture dynamic node relations via temporal edges, finding extensive utility in a wide range of domains where time-varying patterns are crucial. Temporal Graph Neural Networks (TGNNs) have gained significant attention for their effectiveness in representing temporal graphs. However, TGNNs still face significant efficiency challenges in real-world low-resource settings. First, from a data-efficiency standpoint, training TGNNs requires sufficient temporal edges and data labels, which is problematic in practical scenarios with limited data collection and annotation. Second, from a resource-efficiency perspective, TGNN training and inference are computationally demanding due to complex encoding operations, especially on large-scale temporal graphs. Minimizing resource consumption while preserving effectiveness is essential. Motivated by these efficiency challenges, this tutorial systematically introduces state-of-the-art data-efficient and resource-efficient TGNNs, focusing on algorithms, frameworks, and tools, and discusses promising yet under-explored research directions in efficient temporal graph learning. This tutorial aims to benefit researchers and practitioners in data mining, machine learning, and artificial intelligence.
AB - Temporal graphs capture dynamic node relations via temporal edges, finding extensive utility in a wide range of domains where time-varying patterns are crucial. Temporal Graph Neural Networks (TGNNs) have gained significant attention for their effectiveness in representing temporal graphs. However, TGNNs still face significant efficiency challenges in real-world low-resource settings. First, from a data-efficiency standpoint, training TGNNs requires sufficient temporal edges and data labels, which is problematic in practical scenarios with limited data collection and annotation. Second, from a resource-efficiency perspective, TGNN training and inference are computationally demanding due to complex encoding operations, especially on large-scale temporal graphs. Minimizing resource consumption while preserving effectiveness is essential. Motivated by these efficiency challenges, this tutorial systematically introduces state-of-the-art data-efficient and resource-efficient TGNNs, focusing on algorithms, frameworks, and tools, and discusses promising yet under-explored research directions in efficient temporal graph learning. This tutorial aims to benefit researchers and practitioners in data mining, machine learning, and artificial intelligence.
KW - data-efficient learning
KW - graph neural networks
KW - resource-efficient learning
KW - temporal graphs
UR - http://www.scopus.com/inward/record.url?scp=85209994213&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85209994213&partnerID=8YFLogxK
U2 - 10.1145/3627673.3679104
DO - 10.1145/3627673.3679104
M3 - Conference contribution
AN - SCOPUS:85209994213
T3 - International Conference on Information and Knowledge Management, Proceedings
SP - 5530
EP - 5533
BT - CIKM 2024 - Proceedings of the 33rd ACM International Conference on Information and Knowledge Management
PB - Association for Computing Machinery
Y2 - 21 October 2024 through 25 October 2024
ER -