KERMIT: Knowledge graph completion of enhanced relation modeling with inverse transformation

Haotian Li, Bin Yu, Yuliang Wei, Kai Wang, Richard Yi Da Xu, Bailing Wang*

*Corresponding author for this work

Research output: Contribution to journal › Journal article › peer-review

Abstract

Knowledge graph completion (KGC) aims to populate missing triples in a knowledge graph using available information. Text-based methods, which depend on textual descriptions of triples, often struggle when these descriptions lack sufficient information for accurate prediction; this issue is inherent to the datasets and is not easily resolved through modeling alone. To address it and ensure data consistency, we first use large language models (LLMs) to generate coherent descriptions that bridge the semantic gap between queries and answers. Second, we utilize inverse relations to create a symmetric graph, thereby providing augmented training samples for KGC. Additionally, we employ the label information inherent in knowledge graphs (KGs) to enhance the existing contrastive framework, making it fully supervised. These efforts yield significant performance improvements on the WN18RR, FB15k-237 and UMLS datasets. Under standard evaluation metrics, our approach achieves a 3.0% improvement in Hit@1 on WN18RR and a 12.1% improvement in Hit@3 on UMLS, demonstrating superior performance.
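The inverse-relation augmentation described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: for each (head, relation, tail) triple, an inverse triple (tail, relation⁻¹, head) is added so that the training graph becomes symmetric and each triple yields an extra training sample. The function name `augment_with_inverses` and the `_inv` relation suffix are hypothetical choices for this sketch.

```python
# Hypothetical sketch of inverse-relation augmentation for KGC training data.
# For every (head, relation, tail) triple, append the inverse triple
# (tail, relation + "_inv", head), producing a symmetric graph.

INV_SUFFIX = "_inv"  # illustrative marker for an inverse relation

def augment_with_inverses(triples):
    """Return the original triples plus one inverse triple per original."""
    augmented = list(triples)
    for head, rel, tail in triples:
        augmented.append((tail, rel + INV_SUFFIX, head))
    return augmented

triples = [("Paris", "capital_of", "France")]
print(augment_with_inverses(triples))
# → [('Paris', 'capital_of', 'France'), ('France', 'capital_of_inv', 'Paris')]
```

Doubling the triple set this way lets a text-based scorer see each link from both directions without changing the underlying graph semantics.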

Original language: English
Article number: 113500
Number of pages: 14
Journal: Knowledge-Based Systems
Volume: 324
Early online date: 22 May 2025
DOIs
Publication status: E-pub ahead of print - 22 May 2025

User-Defined Keywords

  • Knowledge graph completion (KGC)
  • Large language models (LLMs)
  • Supervised contrastive learning
