Toward Better Storylines with Sentence-Level Language Models
by
Daphne Ippolito, David Grangier, Douglas Eck, Chris Callison-Burch
2020
Abstract
We propose a sentence-level language model which selects the next sentence in
a story from a finite set of fluent alternatives. Since it does not need to
model fluency, the sentence-level language model can focus on longer range
dependencies, which are crucial for multi-sentence coherence. Rather than
dealing with individual words, our method treats the story so far as a list of
pre-trained sentence embeddings and predicts an embedding for the next
sentence, which is more efficient than predicting word embeddings. Notably, this
allows us to consider a large number of candidates for the next sentence during
training. We demonstrate the effectiveness of our approach with
state-of-the-art accuracy on the unsupervised Story Cloze task and with
promising results on larger-scale next sentence prediction tasks.
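The abstract's core idea (embed the story so far, then score a large pool of candidate next-sentence embeddings) can be sketched minimally as below. This is an illustration under assumptions, not the paper's architecture: the real model learns a context encoder over the sentence-embedding sequence, whereas here mean-pooling and random vectors stand in for learned components.

```python
import numpy as np

def score_candidates(context_embeddings, candidate_embeddings):
    """Return a probability over candidate next-sentence embeddings.

    context_embeddings:   (num_sentences, d) pre-trained sentence embeddings
    candidate_embeddings: (num_candidates, d) embeddings of fluent alternatives

    Mean-pooling is a placeholder for the paper's learned context model.
    """
    context = context_embeddings.mean(axis=0)   # (d,) summary of the story so far
    scores = candidate_embeddings @ context     # (num_candidates,) dot-product scores
    # Softmax over the candidate pool, mirroring training against many negatives.
    exp = np.exp(scores - scores.max())
    return exp / exp.sum()

rng = np.random.default_rng(0)
story = rng.normal(size=(4, 16))         # 4 story sentences, embedding dim 16
candidates = rng.normal(size=(100, 16))  # large candidate set, cheap to score
probs = score_candidates(story, candidates)
best = int(np.argmax(probs))             # index of the predicted next sentence
```

Because each candidate costs only one dot product against the context vector, scoring hundreds of alternatives per training step is cheap, which is what makes the large candidate pools mentioned in the abstract practical.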
Archived Files and Locations
application/pdf, 239.7 kB
arxiv.org (repository) | web.archive.org (webarchive)
arXiv: 2005.05255v1