TY - GEN
T1 - Geometric Matrix Completion via Sylvester Multi-Graph Neural Network
AU - Du, Boxin
AU - Wang, Fei
AU - Yuan, Changhe
AU - Tong, Hanghang
N1 - In this paper, we propose SyMGNN, a flexible neural framework that generalizes the traditional Sylvester equation into an end-to-end neural model for multi-network mining. We further propose two specific instantiations of the SyMGNN framework for the geometric matrix completion task. The experimental results show that the proposed models overall outperform the baselines on all existing benchmark datasets. Furthermore, the proposed low-rank instantiation reduces memory consumption by 16.98% on average. Acknowledgement: This work is partially supported by DARPA (HR001121C0165), DHS (17STQAC00001-07-00), NIFA (2020-67021-32799), NSF (1947135, 2134079, 1939725, 2316233, and 2324770), and ARO (W911NF2110088).
PY - 2023/10/21
Y1 - 2023/10/21
N2 - Despite the success of Sylvester equation empowered methods on various graph mining applications, such as semi-supervised label learning and network alignment, several limitations remain. The Sylvester equation's inability to model non-linear relations and its inflexibility in tuning toward different tasks restrict its performance. In this paper, we propose an end-to-end neural framework, SyMGNN, which consists of a multi-network neural aggregation module and a prior multi-network association incorporation learning module. The proposed framework inherits the key ideas of the Sylvester equation while generalizing it to overcome the aforementioned limitations. Empirical evaluations on real-world datasets show that the instantiations of SyMGNN overall outperform the baselines on the geometric matrix completion task, and its low-rank instantiation further reduces memory consumption by 16.98% on average.
AB - Despite the success of Sylvester equation empowered methods on various graph mining applications, such as semi-supervised label learning and network alignment, several limitations remain. The Sylvester equation's inability to model non-linear relations and its inflexibility in tuning toward different tasks restrict its performance. In this paper, we propose an end-to-end neural framework, SyMGNN, which consists of a multi-network neural aggregation module and a prior multi-network association incorporation learning module. The proposed framework inherits the key ideas of the Sylvester equation while generalizing it to overcome the aforementioned limitations. Empirical evaluations on real-world datasets show that the instantiations of SyMGNN overall outperform the baselines on the geometric matrix completion task, and its low-rank instantiation further reduces memory consumption by 16.98% on average.
KW - Graph Neural Networks
KW - Sylvester equation
KW - matrix completion
UR - http://www.scopus.com/inward/record.url?scp=85178101828&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85178101828&partnerID=8YFLogxK
U2 - 10.1145/3583780.3615170
DO - 10.1145/3583780.3615170
M3 - Conference contribution
AN - SCOPUS:85178101828
T3 - International Conference on Information and Knowledge Management, Proceedings
SP - 3860
EP - 3864
BT - CIKM 2023 - Proceedings of the 32nd ACM International Conference on Information and Knowledge Management
PB - Association for Computing Machinery
T2 - 32nd ACM International Conference on Information and Knowledge Management, CIKM 2023
Y2 - 21 October 2023 through 25 October 2023
ER -