Cross-lingual Adaption Model-Agnostic Meta-Learning for Natural Language Understanding

by Qianying Liu, Fei Cheng, Sadao Kurohashi

Released as an article.

2021  

Abstract

Meta-learning with auxiliary languages has demonstrated promising improvements for cross-lingual natural language processing. However, previous studies sample the meta-training and meta-testing data from the same language, which limits the model's ability to transfer across languages. In this paper, we propose XLA-MAML, which performs direct cross-lingual adaptation in the meta-learning stage. We conduct zero-shot and few-shot experiments on Natural Language Inference and Question Answering. The experimental results demonstrate the effectiveness of our method across different languages, tasks, and pretrained models. We also analyze various cross-lingual-specific settings for meta-learning, including sampling strategy and parallelism.
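The abstract's core idea, drawing the meta-training (support) and meta-testing (query) data from different languages rather than the same one, can be illustrated with a short first-order MAML sketch. This is a minimal illustration under stated assumptions, not the authors' implementation: the toy linear classifier, the per-language data pools, the sampling helper, and all hyperparameters are hypothetical, and the snippet assumes PyTorch 2.x for torch.func.functional_call.

    # Minimal first-order MAML step with cross-lingual support/query sampling.
    # The cross-lingual twist described in the abstract is the single line that
    # draws two *distinct* languages; a same-language MAML step would reuse
    # support_lang for the query batch. Illustrative sketch only.

    import random
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def sample_batch(data_by_lang, lang, batch_size=8):
        """Draw a (features, labels) batch from one language's data pool."""
        xs, ys = zip(*random.sample(data_by_lang[lang], batch_size))
        return torch.stack(xs), torch.stack(ys)

    def xla_maml_step(model, data_by_lang, meta_optimizer, inner_lr=1e-2):
        """One meta-update: adapt on one language, evaluate on another."""
        support_lang, query_lang = random.sample(list(data_by_lang), 2)

        # Inner loop: adapt a copy of the parameters on the support language.
        x_s, y_s = sample_batch(data_by_lang, support_lang)
        fast = {n: p.clone() for n, p in model.named_parameters()}
        loss_s = F.cross_entropy(
            torch.func.functional_call(model, fast, (x_s,)), y_s)
        # create_graph=False (default) -> first-order approximation.
        grads = torch.autograd.grad(loss_s, list(fast.values()))
        fast = {n: p - inner_lr * g
                for (n, p), g in zip(fast.items(), grads)}

        # Outer loop: query loss on a *different* language; gradients flow
        # back to the original parameters through the cloned fast weights.
        x_q, y_q = sample_batch(data_by_lang, query_lang)
        loss_q = F.cross_entropy(
            torch.func.functional_call(model, fast, (x_q,)), y_q)

        meta_optimizer.zero_grad()
        loss_q.backward()
        meta_optimizer.step()
        return loss_q.item()

    # Toy usage with random data for three hypothetical auxiliary languages.
    model = nn.Linear(16, 3)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    data_by_lang = {
        lang: [(torch.randn(16), torch.tensor(random.randrange(3)))
               for _ in range(100)]
        for lang in ("en", "de", "fr")
    }
    for step in range(10):
        print(xla_maml_step(model, data_by_lang, opt))

The paper additionally studies sampling strategy and parallelism for this cross-lingual setting; none of those variants are reflected in this sketch.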

Archived Files and Locations

application/pdf  705.9 kB
file_7staqfrr7zendnm6yscurfpfzq
arxiv.org (repository)
web.archive.org (webarchive)
Type      article
Stage     submitted
Date      2021-11-10
Version   v1
Language  en
arXiv     2111.05805v1
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints).
Catalog Record
Revision: 3a1bb43b-5638-4daa-b0ac-42cc3a082cbc