
Knowledge Distillation based Contextual Relevance Matching for E-commerce Product Search

2022-10-04

Ziyang Liu, Chaokun Wang, Hao Feng, Lingfei Wu, Liqun Yang


Abstract

Online relevance matching is an essential task in e-commerce product search: it boosts the utility of search engines and ensures a smooth user experience. Previous work addresses it with either classical relevance matching models or Transformer-style models. However, these approaches ignore the inherent bipartite graph structures that are ubiquitous in e-commerce product search logs, and Transformer-style models are too inefficient to deploy online. In this paper, we design an efficient knowledge distillation framework for e-commerce relevance matching that integrates the respective advantages of Transformer-style models and classical relevance matching models. For the core student model of the framework in particular, we propose a novel method based on k-order relevance modeling. Experimental results on large-scale real-world data (6174 million records) show that the proposed method significantly improves prediction accuracy in terms of human relevance judgment. We deploy our method on an anonymous online search platform. A/B testing results show that our method significantly improves UV-value by 5.7% under price sort mode.
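The paper does not include implementation details in the abstract, but the teacher-student setup it describes typically optimizes a soft-label distillation objective: the lightweight student is trained to match both the ground-truth labels and the Transformer teacher's softened output distribution. Below is a minimal NumPy sketch of that standard objective; the function names and the `temperature`/`alpha` parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def softmax(logits, temperature=1.0):
    """Temperature-scaled softmax; higher temperature softens the distribution."""
    z = logits / temperature
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend hard-label cross-entropy with KL divergence to the teacher.

    alpha weights the soft (teacher) term; the T^2 factor keeps the
    soft-target gradients on the same scale as the hard-label term.
    (Hyperparameter values here are placeholders, not from the paper.)
    """
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    kl = np.sum(p_teacher * (np.log(p_teacher + 1e-12)
                             - np.log(p_student + 1e-12)), axis=-1)
    hard = -np.log(softmax(student_logits)[np.arange(len(labels)), labels]
                   + 1e-12)
    return np.mean(alpha * (temperature ** 2) * kl + (1 - alpha) * hard)
```

In this setup the expensive Transformer runs only offline to produce `teacher_logits`, while the classical student model trained with this loss is what serves online traffic.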
