Toward Better Storylines with Sentence-Level Language Models

by Daphne Ippolito, David Grangier, Douglas Eck, Chris Callison-Burch

Released as an article.

2020  

Abstract

We propose a sentence-level language model which selects the next sentence in a story from a finite set of fluent alternatives. Since it does not need to model fluency, the sentence-level language model can focus on longer range dependencies, which are crucial for multi-sentence coherence. Rather than dealing with individual words, our method treats the story so far as a list of pre-trained sentence embeddings and predicts an embedding for the next sentence, which is more efficient than predicting word embeddings. Notably this allows us to consider a large number of candidates for the next sentence during training. We demonstrate the effectiveness of our approach with state-of-the-art accuracy on the unsupervised Story Cloze task and with promising results on larger-scale next sentence prediction tasks.
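The selection mechanism described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's trained architecture: the context predictor (here a simple average of context embeddings) and the cosine-similarity scoring are hypothetical placeholders standing in for the learned model, and the random vectors stand in for pre-trained sentence embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins for pre-trained sentence embeddings; the method operates on
# such embeddings rather than on individual words.
dim = 16
context = rng.normal(size=(3, dim))      # embeddings of the story so far
candidates = rng.normal(size=(5, dim))   # embeddings of fluent candidate next sentences

def predict_next_embedding(context):
    """Hypothetical predictor: stands in for the learned model that maps the
    context embeddings to a predicted next-sentence embedding. Here we simply
    average the context as a placeholder."""
    return context.mean(axis=0)

def score_candidates(pred, candidates):
    """Score each candidate by cosine similarity to the predicted embedding,
    then normalize with a softmax over the finite candidate set."""
    pred_n = pred / np.linalg.norm(pred)
    cand_n = candidates / np.linalg.norm(candidates, axis=1, keepdims=True)
    sims = cand_n @ pred_n
    exp = np.exp(sims - sims.max())
    return exp / exp.sum()

probs = score_candidates(predict_next_embedding(context), candidates)
best = int(np.argmax(probs))
```

Because scoring is a similarity against fixed candidate embeddings rather than a full vocabulary softmax, a large pool of candidate next sentences can be considered cheaply at training time.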

Archived Files and Locations

application/pdf  239.7 kB
file_cttf5j2rd5af3emvo65wro4mby
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2020-05-11
Version   v1
Language   en
arXiv  2005.05255v1
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints).
Catalog Record
Revision: 58a2b8bb-293a-4801-83f8-d69fd783fa93