Multiple premises entailment recognition based on attention and gate mechanism

Pin Wu*, Zhidan Lei, Quan Zhou, Rukang Zhu, Xuting Chang, Junwu Sun, Wenjie Zhang, Yi-Ke Guo

*Corresponding author for this work

    Research output: Contribution to journal › Journal article › peer-review

    4 Citations (Scopus)


    Multi-premise natural language inference provides important technical support for automatic question answering, machine reading comprehension, and other application fields. Existing approaches to the Multiple Premises Entailment (MPE) task convert MPE data into the Single Premise Entailment (SPE) format and then handle MPE in the same way as SPE. This process ignores the unique characteristics of multiple premises, resulting in a loss of semantics. This paper proposes a model based on an Attention and Gate Fusion Network (AGNet). AGNet adopts a "Local Matching-Integration" strategy to account for the characteristics of multiple premises. In this process, an attention mechanism combined with a matching gate mechanism fully describes the relationship between each premise and the hypothesis, while a self-attention mechanism and a fusion gate mechanism deeply exploit the relationships among the premises. To avoid over-fitting, we propose a pre-training method for our model. In terms of computational complexity, AGNet is highly parallel and reduces the time complexity of the matching step to O(1). Experiments show that our model achieves new state-of-the-art results on the MPE test set.
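
    The abstract describes AGNet only at a high level, so the following is a minimal sketch, in PyTorch, of the two mechanisms it names: cross-attention matching of each premise against the hypothesis with a matching gate, and self-attention over the per-premise match vectors with a fusion gate. All class names, tensor shapes, and gating formulas here are assumptions made for illustration, not the paper's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentionMatchGate(nn.Module):
    """Align one premise with the hypothesis via cross-attention, then
    blend the aligned and original representations with a matching gate."""
    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)  # assumed form of the matching gate

    def forward(self, premise, hypothesis):
        # premise: (batch, len_p, dim); hypothesis: (batch, len_h, dim)
        scores = torch.bmm(premise, hypothesis.transpose(1, 2))     # (batch, len_p, len_h)
        aligned = torch.bmm(F.softmax(scores, dim=-1), hypothesis)  # hypothesis content aligned to each premise token
        g = torch.sigmoid(self.gate(torch.cat([premise, aligned], dim=-1)))
        return g * premise + (1 - g) * aligned                      # gated local match, (batch, len_p, dim)

class SelfAttentionFusionGate(nn.Module):
    """Weight the per-premise match vectors by self-attention, then fuse
    the attended summary with a mean-pooled baseline through a gate."""
    def __init__(self, dim):
        super().__init__()
        self.attn = nn.Linear(dim, 1)
        self.gate = nn.Linear(2 * dim, dim)

    def forward(self, matches):
        # matches: (batch, n_premises, dim), one pooled vector per premise
        w = F.softmax(self.attn(matches), dim=1)    # attention weights over premises
        attended = (w * matches).sum(dim=1)         # (batch, dim)
        pooled = matches.mean(dim=1)                # parameter-free baseline summary
        g = torch.sigmoid(self.gate(torch.cat([attended, pooled], dim=-1)))
        return g * attended + (1 - g) * pooled      # fused multi-premise representation
```

    Because each premise is matched against the hypothesis independently, the matching step parallelizes across premises, which is consistent with the abstract's claim that matching requires O(1) sequential steps rather than a loop over premises.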

    Original language: English
    Article number: 113214
    Journal: Expert Systems with Applications
    Publication status: Published - 1 Jun 2020

    Scopus Subject Areas

    • Engineering(all)
    • Computer Science Applications
    • Artificial Intelligence

    User-Defined Keywords

    • Attention mechanism
    • Fine-tune
    • Gate mechanism
    • Multiple premise entailment
    • Natural language inference

