Cross-lingual Adaption Model-Agnostic Meta-Learning for Natural Language Understanding
by
Qianying Liu, Fei Cheng, Sadao Kurohashi
2021
Abstract
Meta-learning with auxiliary languages has demonstrated promising
improvements for cross-lingual natural language processing. However, previous
studies sample the meta-training and meta-testing data from the same language,
which limits the model's cross-lingual transfer ability. In this
paper, we propose XLA-MAML, which performs direct cross-lingual adaption in the
meta-learning stage. We conduct zero-shot and few-shot experiments on Natural
Language Inference and Question Answering. The experimental results demonstrate
the effectiveness of our method across different languages, tasks, and
pretrained models. We also analyze various cross-lingual-specific
settings for meta-learning, including sampling strategy and parallelism.
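The cross-lingual adaptation idea from the abstract (inner-loop adaptation on one language, outer-loop evaluation on a different language) can be sketched on a toy problem. Everything below is a hypothetical illustration, not the authors' implementation: each "language" is a synthetic linear task standing in for language-specific NLU data, and the update uses a first-order MAML approximation.

```python
import numpy as np

# Hypothetical stand-in: each "language" is a linear task y = w_l * x,
# mimicking language-specific data distributions (assumption for illustration).
rng = np.random.default_rng(0)
LANG_W = {"en": 2.0, "de": 2.2, "es": 1.8}

def sample_batch(lang, n=16):
    """Sample a small batch from one language's task."""
    x = rng.uniform(-1.0, 1.0, size=n)
    return x, LANG_W[lang] * x

def loss_and_grad(theta, x, y):
    """Mean squared error and its gradient w.r.t. the scalar parameter."""
    err = theta * x - y
    return np.mean(err ** 2), np.mean(2.0 * err * x)

def xla_maml_step(theta, src_lang, tgt_lang, inner_lr=0.1, meta_lr=0.05):
    # Inner loop: adapt on the auxiliary (source) language support set.
    xs, ys = sample_batch(src_lang)
    _, g = loss_and_grad(theta, xs, ys)
    theta_adapted = theta - inner_lr * g
    # Outer loop: evaluate the adapted parameters on a *different*
    # language's query set -- the direct cross-lingual adaptation idea.
    xq, yq = sample_batch(tgt_lang)
    _, gq = loss_and_grad(theta_adapted, xq, yq)
    # First-order MAML approximation: second-order terms are ignored.
    return theta - meta_lr * gq

theta = 0.0
for _ in range(200):
    theta = xla_maml_step(theta, "en", "de")
print(round(theta, 2))
```

After training, `theta` settles between the two language-specific optima, reflecting a meta-initialization that adapts quickly to either language; in-language MAML would instead sample support and query sets from the same language.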
Archived Files and Locations
application/pdf 705.9 kB
arxiv.org (repository) · web.archive.org (webarchive)
2111.05805v1