Context-aware Decoder for Neural Machine Translation using a Target-side Document-Level Language Model

by Amane Sugiyama, Naoki Yoshinaga

Released as an article.

2020  

Abstract

Although many context-aware neural machine translation models have been proposed to incorporate contexts in translation, most of these models are trained end-to-end on parallel documents aligned at the sentence level. Because only a few domains (and language pairs) have such document-level parallel data, we cannot perform accurate context-aware translation in most domains. We therefore present a simple method to turn a sentence-level translation model into a context-aware model by incorporating a document-level language model into the decoder. Our context-aware decoder is built upon only sentence-level parallel corpora and monolingual corpora; it thus requires no document-level parallel data. From a theoretical viewpoint, the core of this work is a novel representation of contextual information using pointwise mutual information between the context and the current sentence. We show the effectiveness of our approach on three language pairs, English to French, English to Russian, and Japanese to English, by evaluation with BLEU and contrastive tests for context-aware translation.
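As a reading aid (not text from the paper itself), the pointwise-mutual-information idea in the abstract can be sketched as follows. The symbols y (target sentence), x (source sentence), and c (target-side context) are illustrative assumptions, not notation confirmed by this record:

\log p(y \mid x, c) \;\approx\; \underbrace{\log p(y \mid x)}_{\text{sentence-level NMT}} \;+\; \underbrace{\log \frac{p_{\mathrm{doc}}(y \mid c)}{p_{\mathrm{sent}}(y)}}_{\mathrm{PMI}(y;\, c)}

Here p_{\mathrm{doc}} would be a document-level language model and p_{\mathrm{sent}} a sentence-level one, both trainable on monolingual data alone; the second term is the pointwise mutual information between the context and the current sentence, added to the sentence-level translation score at decoding time. This is consistent with the abstract's claim that parallel data is needed only at the sentence level, while context enters through monolingual models.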

Archived Files and Locations

application/pdf  610.5 kB
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2020-10-24
Version   v1
Language   en
arXiv  2010.12827v1
Catalog Record
Revision: 401348a3-e2c8-4634-8677-79506932d0f2