Collaborative Memory Network for Recommendation Systems
by Travis Ebesu, Bin Shen, Yi Fang (2018)
Abstract
Recommendation systems play a vital role in keeping users engaged with
personalized content on modern online platforms. Deep learning has
revolutionized many research fields, and there is a recent surge of interest in
applying it to collaborative filtering (CF). However, existing methods compose
deep learning architectures with the latent factor model, ignoring a major
class of CF models: neighborhood- or memory-based approaches. We propose
Collaborative Memory Networks (CMN), a deep architecture that unifies the two
classes of CF models by capitalizing on the strengths of the global structure
of the latent factor model and the local structure of the neighborhood-based
model in a nonlinear fashion. Motivated by the success of Memory Networks, we
fuse a memory component and a neural attention mechanism to form the
neighborhood component. The associative addressing scheme over the user and
item memories in the memory module encodes complex user-item relations, and
the coupled neural attention mechanism learns a user-item specific
neighborhood. Finally, the output module jointly exploits the neighborhood and
the user and item memories to produce the ranking score. Stacking multiple
memory modules yields deeper architectures that capture increasingly complex
user-item relations. Furthermore, we show strong connections between CMN
components, memory networks, and the three classes of CF models. Comprehensive
experimental results on three public datasets demonstrate the effectiveness of
CMN, which outperforms competitive baselines. Qualitative visualization of the
attention weights provides insight into the model's recommendation process and
suggests the presence of higher-order interactions.
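
The abstract's description of the memory module is concrete enough to sketch.
Below is a minimal, hypothetical PyTorch rendering of a single hop: the user
and item memories form a query, associative addressing over the memories of
users who consumed the item yields attention weights (the learned user-item
specific neighborhood), and an output module mixes the attended read with the
latent factors to produce a ranking score. The module names, the additive
query, and the scoring head are illustrative assumptions, not the paper's
exact equations.

    # Hypothetical single-hop sketch of the CMN idea described in the
    # abstract. All names, dimensions, and the exact scoring form are
    # assumptions, not the authors' published equations.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SingleHopCMN(nn.Module):
        def __init__(self, num_users, num_items, dim):
            super().__init__()
            self.user_mem = nn.Embedding(num_users, dim)  # user memory
            self.item_mem = nn.Embedding(num_items, dim)  # item memory
            self.user_out = nn.Embedding(num_users, dim)  # output memory
            self.proj = nn.Linear(2 * dim, dim)  # mixes global + local signals
            self.score = nn.Linear(dim, 1)

        def forward(self, user, item, neighbors, mask):
            # user: (B,), item: (B,), neighbors: (B, N) ids of users who
            # consumed the item, mask: (B, N) bool, False on padding slots.
            # Assumes each row has at least one real neighbor.
            m_u = self.user_mem(user)        # (B, d)
            e_i = self.item_mem(item)        # (B, d)
            m_v = self.user_mem(neighbors)   # (B, N, d)
            c_v = self.user_out(neighbors)   # (B, N, d)

            # Associative addressing: match the user-item query against each
            # neighbor's memory slot, then normalize with attention.
            query = m_u + e_i
            logits = torch.einsum('bd,bnd->bn', query, m_v)
            logits = logits.masked_fill(~mask, float('-inf'))
            attn = F.softmax(logits, dim=1)  # learned neighborhood weights

            # Neighborhood summary read from the output memory.
            o = torch.einsum('bn,bnd->bd', attn, c_v)

            # Output module: combine the global (latent factor) and local
            # (neighborhood) signals nonlinearly into a ranking score.
            z = torch.relu(self.proj(torch.cat([m_u * e_i, o], dim=1)))
            return self.score(z).squeeze(-1)

A call such as model(user_ids, item_ids, neighbor_ids, neighbor_mask) returns
one score per user-item pair. Stacking hops, as the abstract suggests, would
feed the attended read back in as the next hop's query; the ranking loss and
negative sampling used for training are omitted from this sketch.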
Archived Files and Locations
application/pdf, 1.0 MB: arxiv.org (repository); web.archive.org (webarchive)
arXiv: 1804.10862v2