Graph Adaptive Semantic Transfer for Cross-domain Sentiment Classification
by
Kai Zhang, Qi Liu, Zhenya Huang, Mingyue Cheng, Kun Zhang, Mengdi Zhang, Wei Wu, Enhong Chen
2022
Abstract
Cross-domain sentiment classification (CDSC) aims to use the transferable
semantics learned from the source domain to predict the sentiment of reviews in
the unlabeled target domain. Existing studies of this task pay more
attention to the sequence modeling of sentences while largely ignoring the rich
domain-invariant semantics embedded in graph structures (i.e., the
part-of-speech tags and dependency relations). Adaptive graph
representations have become an essential tool for modeling language
comprehension in recent years. To this end, in
this paper, we explore the possibility of learning invariant semantic
features from graph-like structures in CDSC. Specifically, we present the Graph
Adaptive Semantic Transfer (GAST) model, an adaptive syntactic graph embedding
method that is able to learn domain-invariant semantics from both word
sequences and syntactic graphs. More specifically, we first propose a
POS-Transformer module to extract sequential semantic features from the word
sequences as well as the part-of-speech tags. Then, we design a Hybrid Graph
Attention (HGAT) module to generate syntax-based semantic features by
considering the transferable dependency relations. Finally, we devise an
Integrated aDaptive Strategy (IDS) to guide the joint learning process of both
modules. Extensive experiments on four public datasets indicate that GAST
achieves comparable effectiveness to a range of state-of-the-art models.
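The core idea behind the Hybrid Graph Attention module, attending over syntactic neighbours rather than the full sentence, can be illustrated with a minimal GAT-style sketch. This is a hedged illustration only, not the paper's actual implementation: the function names, scoring nonlinearity (tanh instead of a learned LeakyReLU), and toy dimensions are all assumptions.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention(h, adj, W, a):
    """Single-head attention restricted to dependency edges.
    h:   (n, d)  token features
    adj: (n, n)  0/1 dependency adjacency (with self-loops)
    W:   (d, dp) projection matrix
    a:   (2*dp,) scoring vector for [source || target] pairs"""
    z = h @ W                                   # (n, dp) projected features
    dp = z.shape[1]
    src = z @ a[:dp]                            # score share of source token
    dst = z @ a[dp:]                            # score share of target token
    e = np.tanh(src[:, None] + dst[None, :])    # raw pairwise scores
    e = np.where(adj > 0, e, -1e9)              # mask non-edges out
    alpha = softmax(e, axis=1)                  # attend only over neighbours
    return alpha @ z                            # (n, dp) syntax-aware features

# toy example: 4 tokens, dependency edges 0-1, 1-2, 1-3, plus self-loops
rng = np.random.default_rng(0)
h = rng.normal(size=(4, 8))
adj = np.eye(4)
for i, j in [(0, 1), (1, 2), (1, 3)]:
    adj[i, j] = adj[j, i] = 1
W = rng.normal(size=(8, 3))
a = rng.normal(size=(6,))
out = graph_attention(h, adj, W, a)
```

Masking with a large negative score before the softmax drives the attention weight of non-adjacent token pairs to effectively zero, so each token aggregates features only from its dependency neighbours.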
Archived Files and Locations
application/pdf, 2.0 MB — arxiv.org (repository), web.archive.org (webarchive) — arXiv:2205.08772v1