Strong Baselines for Neural Semi-supervised Learning under Domain Shift

by Sebastian Ruder, Barbara Plank

Released as an article.

2018  

Abstract

Novel neural models have been proposed in recent years for learning under domain shift. Most models, however, are evaluated on only a single task or on proprietary datasets, or are compared against weak baselines, which makes comparison between models difficult. In this paper, we re-evaluate classic general-purpose bootstrapping approaches in the context of neural networks under domain shift against recent neural approaches, and propose a novel multi-task tri-training method that reduces the time and space complexity of classic tri-training. Extensive experiments on two benchmarks are negative: while our novel method establishes a new state of the art for sentiment analysis, it does not consistently perform best. More importantly, we arrive at the somewhat surprising conclusion that classic tri-training, with some additions, outperforms the state of the art. We conclude that classic approaches constitute an important and strong baseline.
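For illustration, here is a minimal sketch of the classic tri-training loop the abstract refers to (Zhou & Li, 2005): three classifiers are initialised on bootstrap samples of the labelled data, and in each round an unlabelled example is pseudo-labelled for one model whenever the other two agree on its label. The toy 1-nearest-neighbour classifier, the stratified bootstrap, and all names below are assumptions for this sketch; the error-rate filtering of the full algorithm is omitted.

```python
import random


class OneNN:
    """Toy 1-nearest-neighbour classifier on 1-D points (stand-in for a real model)."""

    def fit(self, X, y):
        self.X, self.y = list(X), list(y)
        return self

    def predict(self, x):
        # Return the label of the closest training point.
        i = min(range(len(self.X)), key=lambda j: abs(self.X[j] - x))
        return self.y[i]


def tri_train(X_lab, y_lab, X_unlab, rounds=3, seed=0):
    """Classic tri-training sketch: peers' agreement supplies pseudo-labels."""
    rng = random.Random(seed)
    models, data = [], []
    # Initialise each model on a stratified bootstrap of the labelled data
    # (stratified here only to keep this toy example stable).
    for _ in range(3):
        Xb, yb = [], []
        for cls in sorted(set(y_lab)):
            pts = [x for x, y in zip(X_lab, y_lab) if y == cls]
            Xb += [pts[rng.randrange(len(pts))] for _ in range(len(pts))]
            yb += [cls] * len(pts)
        data.append((Xb, yb))
        models.append(OneNN().fit(Xb, yb))
    for _ in range(rounds):
        for i in range(3):
            j, k = [m for m in range(3) if m != i]
            new_X, new_y = [], []
            for x in X_unlab:
                # If the two peer models agree, use their label as
                # pseudo-gold training data for model i.
                pj, pk = models[j].predict(x), models[k].predict(x)
                if pj == pk:
                    new_X.append(x)
                    new_y.append(pj)
            Xb, yb = data[i]
            models[i] = OneNN().fit(Xb + new_X, yb + new_y)
    return models


def majority(models, x):
    """Final prediction: majority vote over the three trained models."""
    votes = [m.predict(x) for m in models]
    return max(set(votes), key=votes.count)
```

At test time the three models vote, so the ensemble can be more robust than any single bootstrap-trained model; the paper's multi-task variant replaces the three independent models with shared parameters to cut this time and space cost.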

Archived Files and Locations

application/pdf  503.0 kB
arxiv.org (repository)
web.archive.org (webarchive)
Type: article
Stage: submitted
Date: 2018-04-25
Version: v1
Language: en
arXiv: 1804.09530v1