SOTAVerified

Disentangled Self-Attentive Neural Networks for Click-Through Rate Prediction

2021-01-11

Yichen Xu, Yanqiao Zhu, Feng Yu, Qiang Liu, Shu Wu

Abstract

Click-Through Rate (CTR) prediction, which aims to predict the probability that a user will click on an item, is an essential task for many online applications. Because CTR data are sparse and high-dimensional, a key to effective prediction is modeling high-order feature interactions. An efficient way to do this is to compute inner products of feature embeddings with self-attentive neural networks. To better model complex feature interactions, in this paper we propose a novel DisentanglEd Self-atTentIve NEtwork (DESTINE) framework for CTR prediction that explicitly decouples the computation of unary feature importance from pairwise interaction. Specifically, the unary term models the general importance of one feature on all other features, whereas the pairwise interaction term learns the pure impact of each feature pair. We conduct extensive experiments on two real-world benchmark datasets. The results show that DESTINE not only maintains computational efficiency but also achieves consistent improvements over state-of-the-art baselines.
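The decoupling described in the abstract can be sketched roughly as follows. This is a minimal NumPy illustration of the general idea, not the paper's exact formulation: the attention logit for a feature pair is split into a whitened (mean-centered) pairwise interaction term and a query-independent unary importance term. The function names and the particular choice of unary term here are assumptions for illustration only.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def disentangled_attention(Q, K, V):
    """Toy disentangled self-attention over feature embeddings.

    Q, K, V: arrays of shape (num_features, dim).
    Returns (output, attention_weights).
    """
    scale = np.sqrt(Q.shape[1])
    # Pairwise term: mean-centered ("whitened") dot product, intended
    # to capture the pure interaction between each feature pair.
    Qc = Q - Q.mean(axis=0, keepdims=True)
    Kc = K - K.mean(axis=0, keepdims=True)
    pairwise = (Qc @ Kc.T) / scale
    # Unary term: a query-independent importance score per key feature
    # (here, hypothetically, the mean query against each key).
    unary = (Q.mean(axis=0, keepdims=True) @ K.T) / scale
    # Logits are the sum of the two decoupled terms.
    attn = softmax(pairwise + unary, axis=-1)
    return attn @ V, attn
```

Because the unary term depends only on the key index, it shifts whole columns of the logit matrix, modeling a feature's general importance; the centered pairwise term then only has to account for pair-specific effects.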
