Studying Attention Models in Sentiment Attitude Extraction Task

by Nicolay Rusnachenko, Natalia Loukachevitch

Released as an article.

2020  

Abstract

In the sentiment attitude extraction task, the aim is to identify <<attitudes>> -- sentiment relations between entities mentioned in text. In this paper, we provide a study of attention-based context encoders for the sentiment attitude extraction task. For this task, we adapt attentive context encoders of two types: (i) feature-based; (ii) self-based. Our experiments with RuSentRel, a corpus of Russian analytical texts, illustrate that models trained with attentive encoders outperform those trained without them, achieving a 1.5-5.9% increase in F1. We also provide an analysis of attention weight distributions depending on the term type.
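The abstract refers to feature-based attentive context encoders, which weight context terms with respect to the attitude participants (the entity pair). Below is a minimal sketch of such a weighting scheme; the function names, dimensions, and the concatenation-plus-MLP scoring are illustrative assumptions, not the authors' exact architecture.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def feature_based_attention(term_embs, feature_emb, W, w):
    """Score each context term against an attitude-participant feature
    vector and return the attention-weighted context representation.

    term_embs:   (n_terms, d) embeddings of the context terms
    feature_emb: (d,) embedding of an attitude participant (entity)
    W: (d_hidden, 2*d) projection matrix; w: (d_hidden,) scoring vector
    (all shapes are illustrative assumptions)
    """
    scores = []
    for t in term_embs:
        h = np.tanh(W @ np.concatenate([t, feature_emb]))  # term/feature interaction
        scores.append(w @ h)                               # scalar relevance score
    alpha = softmax(np.array(scores))                      # attention weights per term
    context = alpha @ term_embs                            # weighted sum of term embeddings
    return context, alpha

# toy usage with random embeddings
rng = np.random.default_rng(0)
d, d_hidden, n = 8, 16, 5
ctx, alpha = feature_based_attention(
    rng.normal(size=(n, d)), rng.normal(size=d),
    rng.normal(size=(d_hidden, 2 * d)), rng.normal(size=d_hidden))
print(alpha.round(3), ctx.shape)
```

A self-based encoder would follow the same pattern but score terms against a learned query (or against the other terms) instead of an external participant feature.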

Archived Files and Locations

application/pdf  1.3 MB
file_vz2b2hmydrfmbnnxcsix6xliei
arxiv.org (repository)
web.archive.org (webarchive)
Type  article
Stage   submitted
Date   2020-06-20
Version   v1
Language   en
arXiv  2006.11605v1
Work Entity
Access all versions, variants, and formats of this work (e.g., pre-prints)
Catalog Record
Revision: cc8d8135-f317-4f8f-864d-f58f5b09092c