TY - JOUR
T1 - Multi-view knowledge graph fusion via knowledge-aware attentional graph neural network
AU - Huang, Zhichao
AU - Li, Xutao
AU - Ye, Yunming
AU - Zhang, Baoquan
AU - Xu, Guangning
AU - Gan, Wensheng
N1 - This research was supported in part by the National Key R&D Program of China, 2018YFB2101100, 2018YFB2101101 and NSFC under Grant No. 61972111 and U1836107.
Publisher Copyright:
© 2022, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2023/2
Y1 - 2023/2
N2 - Knowledge graphs (KGs) play a vital role in natural language processing (NLP) and serve many downstream tasks. Because different views of KGs are usually constructed independently, multi-view knowledge graph fusion (MVKGF) has become a research hotspot. Although multi-view learning has been studied extensively over the past decades, MVKGF remains poorly tackled because of the heterogeneous relations across multi-view KGs. To address MVKGF, entity alignment is the most studied approach. Existing entity alignment methods are dominated by embedding-based methods, such as TransE and Graph Neural Networks (GNNs), where alignment is achieved by measuring the similarities between entity embeddings. However, most previous approaches suffer from the issues of diverse knowledge facts and complex neighboring structures. In this paper, we propose a novel Knowledge-aware Attentional Graph Neural Network (KAGNN) model to carefully incorporate both knowledge facts and neighboring structures. In particular, a knowledge-aware attention mechanism is designed to preserve the original semantics and determine the importance of each knowledge fact. Furthermore, a three-layered GCN with highway gates is adopted to learn better entity representations from the neighboring structure information. Thus, our model can be regarded as a multi-view extension of GNN. We validate our model on three cross-lingual datasets, and the results show that our model beats the state-of-the-art baselines by a large margin.
AB - Knowledge graphs (KGs) play a vital role in natural language processing (NLP) and serve many downstream tasks. Because different views of KGs are usually constructed independently, multi-view knowledge graph fusion (MVKGF) has become a research hotspot. Although multi-view learning has been studied extensively over the past decades, MVKGF remains poorly tackled because of the heterogeneous relations across multi-view KGs. To address MVKGF, entity alignment is the most studied approach. Existing entity alignment methods are dominated by embedding-based methods, such as TransE and Graph Neural Networks (GNNs), where alignment is achieved by measuring the similarities between entity embeddings. However, most previous approaches suffer from the issues of diverse knowledge facts and complex neighboring structures. In this paper, we propose a novel Knowledge-aware Attentional Graph Neural Network (KAGNN) model to carefully incorporate both knowledge facts and neighboring structures. In particular, a knowledge-aware attention mechanism is designed to preserve the original semantics and determine the importance of each knowledge fact. Furthermore, a three-layered GCN with highway gates is adopted to learn better entity representations from the neighboring structure information. Thus, our model can be regarded as a multi-view extension of GNN. We validate our model on three cross-lingual datasets, and the results show that our model beats the state-of-the-art baselines by a large margin.
KW - Entity alignment
KW - Knowledge-aware attention
KW - Multi-view GNN
KW - Multi-view knowledge graph fusion
UR - https://www.scopus.com/pages/publications/85131313587
UR - https://link.springer.com/article/10.1007/s10489-022-03667-1
U2 - 10.1007/s10489-022-03667-1
DO - 10.1007/s10489-022-03667-1
M3 - Journal article
AN - SCOPUS:85131313587
SN - 0924-669X
VL - 53
SP - 3652
EP - 3671
JO - Applied Intelligence
JF - Applied Intelligence
IS - 4
ER -