TY - GEN
T1 - Masked Graph Transformer for Large-Scale Recommendation
AU - Chen, Huiyuan
AU - Xu, Zhe
AU - Yeh, Chin-Chia Michael
AU - Lai, Vivian
AU - Zheng, Yan
AU - Xu, Minghua
AU - Tong, Hanghang
N1 - Publisher Copyright:
© 2024 ACM.
PY - 2024/7/11
Y1 - 2024/7/11
N2 - Graph Transformers have garnered significant attention for learning graph-structured data, thanks to their superb ability to capture long-range dependencies among nodes. However, the quadratic space and time complexity hinders the scalability of Graph Transformers, particularly for large-scale recommendation. Here we propose an efficient Masked Graph Transformer, named MGFormer, capable of capturing all-pair interactions among nodes with linear complexity. To achieve this, we treat all user/item nodes as independent tokens, enhance them with positional embeddings, and feed them into a kernelized attention module. Additionally, we incorporate learnable relative degree information to appropriately reweight the attention scores. Experimental results show the superior performance of our MGFormer, even with a single attention layer.
AB - Graph Transformers have garnered significant attention for learning graph-structured data, thanks to their superb ability to capture long-range dependencies among nodes. However, the quadratic space and time complexity hinders the scalability of Graph Transformers, particularly for large-scale recommendation. Here we propose an efficient Masked Graph Transformer, named MGFormer, capable of capturing all-pair interactions among nodes with linear complexity. To achieve this, we treat all user/item nodes as independent tokens, enhance them with positional embeddings, and feed them into a kernelized attention module. Additionally, we incorporate learnable relative degree information to appropriately reweight the attention scores. Experimental results show the superior performance of our MGFormer, even with a single attention layer.
KW - graph transformer
KW - linear attention
KW - masked mechanism
UR - http://www.scopus.com/inward/record.url?scp=85200599272&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85200599272&partnerID=8YFLogxK
U2 - 10.1145/3626772.3657971
DO - 10.1145/3626772.3657971
M3 - Conference contribution
AN - SCOPUS:85200599272
T3 - SIGIR 2024 - Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval
SP - 2502
EP - 2506
BT - SIGIR 2024 - Proceedings of the 47th International ACM SIGIR Conference on Research and Development in Information Retrieval
PB - Association for Computing Machinery
T2 - 47th International ACM SIGIR Conference on Research and Development in Information Retrieval, SIGIR 2024
Y2 - 14 July 2024 through 18 July 2024
ER -
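
Note: the abstract above describes a kernelized attention module with linear complexity over user/item node tokens, with a learnable relative-degree reweighting. The following is a minimal illustrative sketch of generic kernelized (linear) attention of that kind, not the authors' MGFormer implementation; the feature map `phi`, the function `kernelized_attention`, the multiplicative degree reweighting, and all shapes are assumptions introduced only for illustration.

```python
# Sketch of kernelized (linear) attention over N node tokens, assuming an
# elu(x)+1 style positive feature map; the paper may use a different kernel.
import numpy as np

def phi(x):
    # Positive feature map commonly used for linear attention (assumption).
    return np.where(x > 0, x + 1.0, np.exp(np.minimum(x, 0)))

def kernelized_attention(Q, K, V, degrees=None):
    """All-pair attention in O(N * d^2) time instead of O(N^2 * d).

    Q, K, V: (N, d) embeddings for the N user/item node tokens.
    degrees: optional (N,) node degrees used here as a simple multiplicative
             reweighting of the keys -- a hypothetical stand-in for the
             learnable relative-degree term mentioned in the abstract.
    """
    Qp, Kp = phi(Q), phi(K)                     # (N, d) feature-mapped queries/keys
    if degrees is not None:
        Kp = Kp * degrees[:, None]              # hypothetical degree reweighting
    KV = Kp.T @ V                               # (d, d) summary of keys and values
    Z = Qp @ Kp.sum(axis=0)                     # (N,) normalization terms
    return (Qp @ KV) / (Z[:, None] + 1e-8)      # (N, d) attended outputs

# Tiny usage example with random node embeddings.
rng = np.random.default_rng(0)
N, d = 1000, 64
Q = rng.normal(size=(N, d))
K = rng.normal(size=(N, d))
V = rng.normal(size=(N, d))
deg = rng.integers(1, 50, size=N).astype(float)
out = kernelized_attention(Q, K, V, degrees=deg)
print(out.shape)  # (1000, 64)
```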