
Neural Attentive Multiview Machines

2020-02-18

Oren Barkan, Ori Katz, Noam Koenigstein


Abstract

An important problem in multiview representation learning is finding the optimal combination of views with respect to the specific task at hand. To this end, we introduce NAM: a Neural Attentive Multiview machine that learns multiview item representations and similarity by employing a novel attention mechanism. NAM harnesses multiple information sources and automatically quantifies their relevancy with respect to a supervised task. Finally, a very practical advantage of NAM is its robustness to datasets with missing views. We demonstrate the effectiveness of NAM for the task of movie and app recommendations. Our evaluations indicate that NAM outperforms single-view models as well as alternative multiview methods on item recommendation tasks, including cold-start scenarios.
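The abstract does not spell out the attention mechanism, but the general idea it describes (automatically weighting each view's contribution to the item representation) can be sketched as softmax attention over per-view embeddings. All names, shapes, and parameters below are illustrative assumptions, not the authors' exact model:

```python
import numpy as np

def attentive_multiview(views, W_att, v_att):
    """Fuse per-view item embeddings into one vector via softmax attention.

    views: (num_views, d) matrix, one embedding per information source
    W_att, v_att: hypothetical attention parameters (learned in practice)
    """
    scores = np.tanh(views @ W_att) @ v_att   # one relevance score per view
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                  # softmax attention weights
    return weights @ views                    # attention-weighted combination, shape (d,)

rng = np.random.default_rng(0)
views = rng.normal(size=(3, 8))   # e.g. text, tag, and interaction views
W_att = rng.normal(size=(8, 8))
v_att = rng.normal(size=8)
item_vec = attentive_multiview(views, W_att, v_att)
print(item_vec.shape)  # (8,)
```

A missing view could be handled in this sketch by dropping its row (and renormalizing the attention weights), which is one plausible reading of the robustness claim above.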
