Improving Question Generation with Sentence-level Semantic Matching and
Answer Position Inferring
by
Xiyao Ma, Qile Zhu, Yanlin Zhou, Xiaolin Li, Dapeng Wu
2019
Abstract
Taking an answer and its context as input, sequence-to-sequence models have
made considerable progress on question generation. However, we observe that
these approaches often generate wrong question words or keywords and copy
answer-irrelevant words from the input. We believe the key root causes are the
lack of global question semantics and insufficient exploitation of answer
position awareness. In this paper, we propose a neural question generation model with two
causes. In this paper, we propose a neural question generation model with two
concrete modules: sentence-level semantic matching and answer position
inferring. Further, we enhance the initial state of the decoder by leveraging
the answer-aware gated fusion mechanism. Experimental results demonstrate that
our model outperforms the state-of-the-art (SOTA) models on SQuAD and MARCO
datasets. Owing to their generality, our modules also significantly improve
existing models when applied to them.
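The abstract does not spell out the gated fusion formula, but a standard answer-aware gated fusion, as commonly used to initialize a decoder, can be sketched as below. The gate computation `g = sigmoid(W [h_passage; h_answer] + b)` and the convex combination are assumptions for illustration, not the paper's exact equations; all parameter names are hypothetical.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(h_passage, h_answer, W, b):
    """Fuse passage and answer representations with a learned sigmoid gate.

    g  = sigmoid(W @ [h_passage; h_answer] + b)   # element-wise gate in (0, 1)
    h0 = g * h_passage + (1 - g) * h_answer       # fused decoder initial state
    """
    concat = np.concatenate([h_passage, h_answer])
    g = sigmoid(W @ concat + b)
    return g * h_passage + (1.0 - g) * h_answer

# Toy example with random parameters (hypothetical dimensions).
rng = np.random.default_rng(0)
d = 4
h_p = rng.standard_normal(d)          # passage encoding
h_a = rng.standard_normal(d)          # answer encoding
W = rng.standard_normal((d, 2 * d))   # gate weights
b = np.zeros(d)                       # gate bias
h0 = gated_fusion(h_p, h_a, W, b)
print(h0.shape)  # (4,)
```

Because the gate is element-wise in (0, 1), each component of `h0` lies between the corresponding components of the passage and answer vectors, letting the model interpolate per dimension rather than choosing one source wholesale.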
Archived Files and Locations
application/pdf, 354.6 kB
arxiv.org (repository); web.archive.org (webarchive)
arXiv:1912.00879v1