AnswerFact: Fact Checking in Product Question Answering

Wenxuan Zhang, Yang Deng, Jing Ma, Wai Lam

Research output: Chapter in book/report/conference proceeding › Conference proceeding


Product-related question answering platforms are nowadays widely employed on many E-commerce sites, providing a convenient way for potential customers to address their concerns during online shopping. However, misinformation in the answers on those platforms poses unprecedented challenges for users seeking reliable and truthful product information, and may even cause commercial losses for E-commerce businesses. To tackle this issue, we investigate predicting the veracity of answers and introduce AnswerFact, a large-scale fact checking dataset built from product question answering forums. Each answer is accompanied by its veracity label and associated evidence sentences, providing a valuable testbed for evidence-based fact checking in QA settings. We further propose a novel neural model with tailored evidence ranking components to handle the answer veracity prediction problem. Extensive experiments are conducted with our proposed model and various existing fact checking methods, showing that our method outperforms all baselines on this task.
Original language: English
Title of host publication: Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing, EMNLP 2020
Editors: Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 11
ISBN (Print): 9781952148606
Publication status: Published - Nov 2020
Event: The 2020 Conference on Empirical Methods in Natural Language Processing - Virtual
Duration: 16 Nov 2020 - 20 Nov 2020


Conference: The 2020 Conference on Empirical Methods in Natural Language Processing
Abbreviated title: EMNLP 2020


