TY - JOUR
T1 - Generative Graph Dictionary Learning
AU - Zeng, Zhichen
AU - Zhu, Ruike
AU - Xia, Yinglong
AU - Zeng, Hanqing
AU - Tong, Hanghang
N1 - ZZ, RZ and HT are partially supported by NSF (1947135, 1939725, and 2134079), DARPA (HR001121C0165), NIFA (2020-67021-32799), DHS (17STQAC00001-06-00), and ARO (W911NF2110088). The content of the information in this document does not necessarily reflect the position or the policy of the Government or Amazon, and no official endorsement should be inferred. The U.S. Government is authorized to reproduce and distribute reprints for Government purposes notwithstanding any copyright notation here on.
PY - 2023
Y1 - 2023
N2 - Dictionary learning, which approximates data samples by a set of shared atoms, is a fundamental task in representation learning. However, dictionary learning over graphs, namely graph dictionary learning (GDL), is much more challenging than for vectorial data, as graphs lie in disparate metric spaces. The sparse literature on GDL formulates the problem from the reconstructive view and often learns linear graph embeddings at a high computational cost. In this paper, we propose a Fused Gromov-Wasserstein (FGW) Mixture Model named FRAME to address the GDL problem from the generative view. Equipped with a graph generation function based on the radial basis function kernel and the FGW distance, FRAME generates nonlinear embedding spaces, which, as we theoretically prove, provide a good approximation of the original graph spaces. A fast solution is further proposed on top of the expectation-maximization algorithm with guaranteed convergence. Extensive experiments demonstrate the effectiveness of the obtained node and graph embeddings, and our algorithm achieves significant improvements over the state-of-the-art methods.
UR - http://www.scopus.com/inward/record.url?scp=85174401088&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85174401088&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85174401088
SN - 2640-3498
VL - 202
SP - 40749
EP - 40769
JO - Proceedings of Machine Learning Research
JF - Proceedings of Machine Learning Research
T2 - 40th International Conference on Machine Learning, ICML 2023
Y2 - 23 July 2023 through 29 July 2023
ER -