Studying Attention Models in Sentiment Attitude Extraction Task
by
Nicolay Rusnachenko, Natalia Loukachevitch
2020
Abstract
In the sentiment attitude extraction task, the aim is to identify
«attitudes», i.e. sentiment relations between entities mentioned in text. In
this paper, we provide a study of attention-based context encoders for the
sentiment attitude extraction task. We adapt attentive context
encoders of two types: (i) feature-based and (ii) self-based. Our experiments with
RuSentRel, a corpus of Russian analytical texts, show that models
trained with attentive encoders outperform those trained without them,
achieving a 1.5-5.9% increase in F1. We also analyze attention
weight distributions depending on the term type.
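The attentive context encoding described above can be illustrated with a minimal sketch (not the authors' implementation; the query vector, dimensions, and function names here are assumptions): each term embedding in a context is scored against a trainable query vector, the scores are normalized with softmax into attention weights, and the weighted sum yields a single context vector.

```python
# Minimal sketch of attention-based context weighting.
# Not the paper's actual model; names and dimensions are illustrative.
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - np.max(x))
    return e / e.sum()

def attentive_context(term_embeddings, query):
    """Weight term embeddings by attention against a query vector.

    term_embeddings: (n_terms, dim) matrix, one row per context term.
    query: (dim,) trainable query vector (hypothetical here).
    Returns the (dim,) context vector and the (n_terms,) weights.
    """
    scores = term_embeddings @ query      # one scalar score per term
    weights = softmax(scores)             # attention distribution, sums to 1
    context = weights @ term_embeddings   # weighted average of terms
    return context, weights

# Toy usage: 5 terms, 8-dimensional embeddings.
rng = np.random.default_rng(0)
emb = rng.normal(size=(5, 8))
q = rng.normal(size=8)
ctx, w = attentive_context(emb, q)
```

The resulting weights form a distribution over the context terms, which is what makes the per-term-type analysis mentioned in the abstract possible: weights can be grouped and averaged by the type of the term they attach to.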
Archived Files and Locations
application/pdf, 1.3 MB
arxiv.org (repository); web.archive.org (webarchive)
arXiv: 2006.11605v1