
Masked Label Prediction: Unified Message Passing Model for Semi-Supervised Classification

2020-09-08

Yunsheng Shi, Zhengjie Huang, Shikun Feng, Hui Zhong, Wenjin Wang, Yu Sun


Abstract

Graph neural networks (GNNs) and label propagation algorithms (LPAs) are both message passing algorithms that have achieved superior performance in semi-supervised classification. A GNN propagates node features through a neural network to make predictions, while an LPA propagates labels across the graph adjacency matrix to obtain results. However, there is still no effective way to combine these two kinds of algorithms directly. To address this issue, we propose a novel Unified Message Passing Model (UniMP) that incorporates feature propagation and label propagation at both training and inference time. First, UniMP adopts a Graph Transformer network, taking both feature embeddings and label embeddings as input information for propagation. Second, to train the network without overfitting to self-looped input label information, UniMP introduces a masked label prediction strategy, in which some percentage of the input label information is masked at random and then predicted. UniMP conceptually unifies feature propagation and label propagation and is empirically powerful. It obtains new state-of-the-art semi-supervised classification results on the Open Graph Benchmark (OGB).
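The masked label prediction strategy described above can be sketched as follows. This is a minimal, hypothetical illustration (not the authors' implementation): at each training step a random fraction of the training labels is hidden, the remaining labels are encoded as one-hot inputs for the model to propagate alongside node features, and the hidden nodes become the supervision targets. The function name `mask_labels` and the `mask_rate` parameter are assumptions for the sketch.

```python
import numpy as np

def mask_labels(train_idx, labels, num_classes, mask_rate=0.3, rng=None):
    """Masked label prediction (sketch).

    Hide a random fraction of the training labels; the visible labels are
    fed to the model as one-hot label inputs, and the masked nodes become
    the prediction targets, so the model never sees its own target label
    through a self-loop.
    """
    rng = rng or np.random.default_rng(0)
    train_idx = np.asarray(train_idx)

    # Randomly select which training nodes have their labels hidden.
    n_mask = int(len(train_idx) * mask_rate)
    masked = rng.choice(train_idx, size=n_mask, replace=False)
    visible = np.setdiff1d(train_idx, masked)

    # One-hot label input: all-zero rows for masked and unlabeled nodes.
    label_input = np.zeros((len(labels), num_classes))
    label_input[visible] = np.eye(num_classes)[labels[visible]]
    return label_input, visible, masked
```

In a training loop, `label_input` would be embedded and combined with the node feature embeddings before message passing, and the loss would be computed only on the `masked` nodes; at inference time, all known training labels are made visible.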

Benchmark Results

Dataset          | Model               | Metric           | Claimed   | Verified | Status
ogbn-arxiv       | UniMP_large         | Number of params | 1,162,515 |          | Unverified
ogbn-arxiv       | UniMP_v2            | Number of params | 687,377   |          | Unverified
ogbn-arxiv       | UniMP               | Number of params | 473,489   |          | Unverified
ogbn-papers100M  | TransformerConv     | Number of params | 883,378   |          | Unverified
ogbn-products    | UniMP               | Number of params | 1,475,605 |          | Unverified
ogbn-proteins    | UniMP+CrossEdgeFeat | Number of params | 1,959,984 |          | Unverified
ogbn-proteins    | UniMP               | Number of params | 1,909,104 |          | Unverified
