Improving Question Generation with Sentence-level Semantic Matching and Answer Position Inferring

by Xiyao Ma, Qile Zhu, Yanlin Zhou, Xiaolin Li, Dapeng Wu

Released as an article.

2019  

Abstract

Taking an answer and its context as input, sequence-to-sequence models have made considerable progress on question generation. However, we observe that these approaches often generate the wrong question words or keywords and copy answer-irrelevant words from the input. We believe that the lack of global question semantics and the insufficient exploitation of answer position-awareness are the root causes. In this paper, we propose a neural question generation model with two concrete modules: sentence-level semantic matching and answer position inferring. Further, we enhance the initial state of the decoder with an answer-aware gated fusion mechanism. Experimental results demonstrate that our model outperforms the state-of-the-art (SOTA) models on the SQuAD and MARCO datasets. Owing to its generality, our work also significantly improves existing models.
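The abstract mentions enhancing the decoder's initial state with an answer-aware gated fusion mechanism. A minimal sketch of one common form of gated fusion is shown below — a learned sigmoid gate interpolating elementwise between a passage encoding `h` and an answer-aware vector `a`. The function name, weight shapes, and the exact gating formula here are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(h, a, W, b):
    """Fuse passage state h and answer-aware state a via a learned gate.

    g  = sigmoid(W [h; a] + b)      # elementwise gate in (0, 1)
    s0 = g * h + (1 - g) * a        # convex combination -> decoder init state

    Shapes (illustrative): h, a, b are (d,); W is (d, 2d).
    """
    g = sigmoid(W @ np.concatenate([h, a]) + b)
    return g * h + (1.0 - g) * a

# Toy usage with random vectors standing in for encoder outputs.
rng = np.random.default_rng(0)
d = 8
h = rng.standard_normal(d)            # final encoder state of the passage
a = rng.standard_normal(d)            # answer-aware representation
W = rng.standard_normal((d, 2 * d)) * 0.1
b = np.zeros(d)
s0 = gated_fusion(h, a, W, b)         # candidate decoder initial state
```

Because the gate is a sigmoid, each component of `s0` lies between the corresponding components of `h` and `a`, so the decoder starts from a blend of passage and answer information rather than from either alone.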

Archived Files and Locations

application/pdf  354.6 kB
file_tpmkxibkvfgezfuzgluuluy6gm
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2019-12-02
Version   v1
Language   en
arXiv  1912.00879v1
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: c8a48c58-5016-4989-ac5d-acc16ab66f36
API URL: JSON