Mixture-of-Partitions: Infusing Large Biomedical Knowledge Graphs into BERT

by Zaiqiao Meng, Fangyu Liu, Thomas Hikaru Clark, Ehsan Shareghi, Nigel Collier

Released as an article.

2021  

Abstract

Infusing factual knowledge into pre-trained models is fundamental for many knowledge-intensive tasks. In this paper, we propose Mixture-of-Partitions (MoP), an infusion approach that can handle a very large knowledge graph (KG) by partitioning it into smaller sub-graphs and infusing their specific knowledge into various BERT models using lightweight adapters. To leverage the overall factual knowledge for a target task, these sub-graph adapters are further fine-tuned along with the underlying BERT through a mixture layer. We evaluate our MoP with three biomedical BERTs (SciBERT, BioBERT, PubMedBERT) on six downstream tasks (including NLI, QA, and classification), and the results show that MoP consistently improves the task performance of the underlying BERTs and achieves new state-of-the-art (SOTA) results on five of the evaluated datasets.
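To make the abstract's description concrete, the following is a minimal PyTorch sketch of the MoP idea: one lightweight bottleneck adapter per KG partition, combined through a learned mixture (gating) layer over BERT hidden states. The module names, adapter bottleneck size, number of partitions, and per-token softmax gating are illustrative assumptions, not the authors' released implementation.

# Minimal sketch of Mixture-of-Partitions, assuming a frozen BERT
# encoder whose hidden states are passed through K partition adapters
# that are mixed by a learned softmax gate. Sizes and gating details
# are assumptions for illustration.
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Bottleneck adapter trained on one KG partition's facts."""

    def __init__(self, hidden_size: int = 768, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden_size, bottleneck)
        self.up = nn.Linear(bottleneck, hidden_size)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # Residual bottleneck transform, as in standard adapter tuning.
        return h + self.up(torch.relu(self.down(h)))


class MixtureOfPartitions(nn.Module):
    """Mixes K sub-graph adapters with a learned softmax gate."""

    def __init__(self, num_partitions: int = 4, hidden_size: int = 768):
        super().__init__()
        self.adapters = nn.ModuleList(
            Adapter(hidden_size) for _ in range(num_partitions)
        )
        self.gate = nn.Linear(hidden_size, num_partitions)

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        # h: (batch, seq_len, hidden) hidden states from the BERT encoder.
        weights = torch.softmax(self.gate(h), dim=-1)       # (b, s, K)
        outputs = torch.stack(
            [adapter(h) for adapter in self.adapters], dim=-1
        )                                                   # (b, s, d, K)
        # Weighted sum of the K adapter outputs per token.
        return torch.einsum("bsdk,bsk->bsd", outputs, weights)


# Example: mix 4 partition adapters over dummy BERT hidden states.
mop = MixtureOfPartitions(num_partitions=4)
hidden_states = torch.randn(2, 16, 768)
mixed = mop(hidden_states)
print(mixed.shape)  # torch.Size([2, 16, 768])

In this sketch each adapter would first be trained on its own sub-graph's facts, and only the gate plus adapters (with the underlying BERT) would then be fine-tuned on the target task, mirroring the two-stage procedure the abstract describes.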

Archived Files and Locations

application/pdf  1.1 MB  (file_sjlwjlqyaza2nmdzghkqdukokm)
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2021-09-10
Version   v1
Language   en
arXiv  2109.04810v1
Catalog Record
Revision: 4e3f6da6-1ac7-4a1c-89f9-ba1121e91857