TY - JOUR
T1 - Multi-relational graph convolutional networks
T2 - Generalization guarantees and experiments
AU - Li, Xutao
AU - Ng, Michael K.
AU - Xu, Guangning
AU - Yip, Andy
N1 - This work is supported by Hong Kong Research Grant Council under grants GRF 12300218, GRF 12300519, GRF 17201020, GRF 17300021, CRF C1013-21GF, CRF C7004-21GF, Joint NSFC-RGC N-HKU76921 and by NSFC, China under grant 61972111.
Publisher Copyright:
© 2023 Elsevier Ltd
PY - 2023/4
Y1 - 2023/4
N2 - The class of multi-relational graph convolutional networks (MRGCNs) is a recent extension of standard graph convolutional networks (GCNs) to handle heterogeneous graphs with multiple types of relationships. MRGCNs have been shown to yield results superior to those of traditional GCNs in various machine learning tasks. The key idea is to introduce a new kind of convolution operating on tensors that can effectively exploit correlations exhibited across multiple relationships. The main objective of this paper is to analyze the algorithmic stability and generalization guarantees of MRGCNs to confirm their usefulness. Our contributions are threefold. First, we develop a matrix representation of the various tensor operations underlying MRGCNs to simplify the analysis significantly. Next, we prove the uniform stability of MRGCNs and deduce the convergence of the generalization gap to support their usefulness. The analysis sheds light on the design of MRGCNs, for instance, how the data should be scaled to achieve uniform stability of the learning process. Finally, we provide experimental results to demonstrate the stability results.
AB - The class of multi-relational graph convolutional networks (MRGCNs) is a recent extension of standard graph convolutional networks (GCNs) to handle heterogeneous graphs with multiple types of relationships. MRGCNs have been shown to yield results superior to those of traditional GCNs in various machine learning tasks. The key idea is to introduce a new kind of convolution operating on tensors that can effectively exploit correlations exhibited across multiple relationships. The main objective of this paper is to analyze the algorithmic stability and generalization guarantees of MRGCNs to confirm their usefulness. Our contributions are threefold. First, we develop a matrix representation of the various tensor operations underlying MRGCNs to simplify the analysis significantly. Next, we prove the uniform stability of MRGCNs and deduce the convergence of the generalization gap to support their usefulness. The analysis sheds light on the design of MRGCNs, for instance, how the data should be scaled to achieve uniform stability of the learning process. Finally, we provide experimental results to demonstrate the stability results.
KW - Algorithmic stability
KW - Generalization guarantees
KW - Graph convolutional networks
KW - Multi-relational data
UR - http://www.scopus.com/inward/record.url?scp=85150751374&partnerID=8YFLogxK
U2 - 10.1016/j.neunet.2023.01.044
DO - 10.1016/j.neunet.2023.01.044
M3 - Journal article
C2 - 36774871
AN - SCOPUS:85150751374
SN - 0893-6080
VL - 161
SP - 343
EP - 358
JO - Neural Networks
JF - Neural Networks
ER -