Recursive Neural Language Architecture for Tag Prediction
by Saurabh Kataria (2016)
Abstract
We consider the problem of learning distributed representations for tags from
their associated content for the task of tag recommendation. Since tagging
information is usually very sparse, effective learning from content-tag
associations is a crucial and challenging task. Recently, various neural
representation learning models, such as WSABIE and its variants, have shown
promising performance, mainly due to compact feature representations learned
in a semantic space. However, their capacity is limited by a linear
compositional approach that represents a tag as a sum of equally weighted
parts, which hurts performance. In this work, we propose a neural feedback
relevance model for learning tag representations with weighted feature
representations. Our experiments on two widely used datasets show significant
improvements in recommendation quality over various baselines.
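To make the contrast in the abstract concrete, the sketch below compares a linear composition (a tag embedding as the unweighted mean of its content words' vectors, as in WSABIE-style models) with a relevance-weighted composition of the kind the paper advocates. All function names, embeddings, and weights here are illustrative assumptions, not the authors' implementation.

```python
# Sketch (assumed, not the paper's code): uniform vs. weighted
# composition of word embeddings into a single tag embedding.
import math

def compose_uniform(word_vecs):
    """Tag embedding as the unweighted mean of its words' vectors."""
    dim = len(word_vecs[0])
    return [sum(v[i] for v in word_vecs) / len(word_vecs) for i in range(dim)]

def compose_weighted(word_vecs, weights):
    """Tag embedding as a relevance-weighted average of word vectors."""
    total = sum(weights)
    dim = len(word_vecs[0])
    return [sum(w * v[i] for w, v in zip(weights, word_vecs)) / total
            for i in range(dim)]

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Toy 2-d embeddings for three content words of a hypothetical tag.
words = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
query = [1.0, 0.2]  # embedding of a document to score against the tag

uniform_tag = compose_uniform(words)
# Hypothetical feedback-derived weights emphasizing the first word.
weighted_tag = compose_weighted(words, [0.7, 0.1, 0.2])

print(cosine(query, uniform_tag), cosine(query, weighted_tag))
```

With weights that emphasize the words most relevant to the query, the weighted tag embedding scores higher than the uniform one, illustrating why learning per-feature weights can improve ranking quality.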
Archived Files and Locations
application/pdf, 570.1 kB
arxiv.org (repository); web.archive.org (webarchive)
arXiv: 1603.07646v1