TY - JOUR
T1 - Improving taxonomic relation learning via incorporating relation descriptions into word embeddings
AU - Huang, Subin
AU - Luo, Xiangfeng
AU - Huang, Jing
AU - Wang, Hao
AU - Gu, Shengwei
AU - Guo, Yi-Ke
N1 - Funding Information:
Ministry of Education of Humanities and Social Sciences Planning Fund of China, 18YJA630114; National Natural Science Foundation of China, 91746203; Natural Science Foundation of the Anhui Higher Education Institutions, KJ2017B18
Funding Information:
The research reported in this paper was supported in part by the National Natural Science Foundation of China under the grant No. 91746203. This work was jointly supported by a grant from Ant Financial Services Group, the Ministry of Education of Humanities and Social Sciences Planning Fund of China under the grant No. 18YJA630114, and the Natural Science Foundation of the Anhui Higher Education Institutions under the grant No. KJ2017B18.
PY - 2020/7/25
Y1 - 2020/7/25
N2 - Taxonomic relations play an important role in various Natural Language Processing (NLP) tasks (e.g., information extraction, question answering, and knowledge inference). Existing embedding-based approaches to taxonomic relation learning mainly rely on word embeddings trained with co-occurrence-based similarity learning. However, the performance of these approaches is not satisfactory because the word embeddings lack sufficient taxonomic semantic knowledge. To solve this problem, we propose an improved embedding-based approach that learns taxonomic relations by incorporating relation descriptions into word embeddings. First, to capture additional taxonomic semantic knowledge, we train special word embeddings using not only the co-occurrence information of words but also relation descriptions (e.g., taxonomic seed relations and their contextual triples). Then, using the trained word embeddings as features, we employ two learning models to identify and predict taxonomic relations: an offset-based classification model and an offset-based similarity model. Experimental results on four real-world domain datasets demonstrate that our approach captures additional taxonomic semantic knowledge, reduces dependence on the training dataset, and outperforms the compared state-of-the-art approaches on the taxonomic relation learning task.
AB - Taxonomic relations play an important role in various Natural Language Processing (NLP) tasks (e.g., information extraction, question answering, and knowledge inference). Existing embedding-based approaches to taxonomic relation learning mainly rely on word embeddings trained with co-occurrence-based similarity learning. However, the performance of these approaches is not satisfactory because the word embeddings lack sufficient taxonomic semantic knowledge. To solve this problem, we propose an improved embedding-based approach that learns taxonomic relations by incorporating relation descriptions into word embeddings. First, to capture additional taxonomic semantic knowledge, we train special word embeddings using not only the co-occurrence information of words but also relation descriptions (e.g., taxonomic seed relations and their contextual triples). Then, using the trained word embeddings as features, we employ two learning models to identify and predict taxonomic relations: an offset-based classification model and an offset-based similarity model. Experimental results on four real-world domain datasets demonstrate that our approach captures additional taxonomic semantic knowledge, reduces dependence on the training dataset, and outperforms the compared state-of-the-art approaches on the taxonomic relation learning task.
KW - relation description
KW - taxonomic relation learning
KW - taxonomic semantic knowledge
KW - word embedding
UR - http://www.scopus.com/inward/record.url?scp=85082327829&partnerID=8YFLogxK
U2 - 10.1002/cpe.5696
DO - 10.1002/cpe.5696
M3 - Journal article
AN - SCOPUS:85082327829
SN - 1532-0626
VL - 32
JO - Concurrency and Computation: Practice and Experience
JF - Concurrency and Computation: Practice and Experience
IS - 14
M1 - e5696
ER -