SOTAVerified

Graph Entropy Minimization for Semi-supervised Node Classification

2023-05-31 · Code Available

Yi Luo, Guangchun Luo, Ke Qin, Aiguo Chen

Abstract

In industry, node classifiers must reduce prediction error, training cost, and inference latency all at once. However, most graph neural networks (GNNs) concentrate on only one or two of these goals, and the compromised aspects become the bottleneck that hinders practical deployment in industrial-scale tasks. This work proposes a novel semi-supervised learning method termed Graph Entropy Minimization (GEM) to resolve the three issues simultaneously. GEM's one-hop aggregation benefits from massive uncategorized nodes, making its prediction accuracy comparable to that of GNNs with two or more hops of message passing. It can be decomposed to support stochastic training with mini-batches of independent edge samples, achieving extremely fast sampling and space-saving training. While its one-hop aggregation already runs faster at inference than deep GNNs, GEM can be accelerated further by deriving a non-hop classifier via online knowledge distillation. GEM is thus a handy choice for latency-restricted and error-sensitive services running on resource-constrained hardware. Code is available at https://github.com/cf020031308/GEM.
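Below is a minimal sketch of the training scheme the abstract describes: a one-hop teacher trained on mini-batches of independent edge samples, with online knowledge distillation into a non-hop student. It assumes PyTorch; the entropy term, loss weights, and toy data are illustrative stand-ins, not the paper's exact GEM objective (see the linked repository for the authors' code).

```python
# Illustrative sketch only: the toy graph, the entropy term, and the loss
# weights are placeholders, not the paper's exact GEM formulation.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)

# Toy semi-supervised graph: 200 nodes, 16-dim features, 3 classes, 800 edges.
N, D, C, E = 200, 16, 3, 800
x = torch.randn(N, D)
edges = torch.randint(0, N, (2, E))      # mini-batches are drawn from this list
labels = torch.randint(0, C, (N,))
labelled = torch.zeros(N, dtype=torch.bool)
labelled[:20] = True                     # only a few nodes carry labels

class OneHopClassifier(nn.Module):
    """Teacher: classifies a node from its own features plus a one-hop
    neighbour aggregation; no deeper message passing is used."""
    def __init__(self, in_dim, n_classes, hidden=64):
        super().__init__()
        self.self_lin = nn.Linear(in_dim, hidden)
        self.nbr_lin = nn.Linear(in_dim, hidden)
        self.out = nn.Linear(hidden, n_classes)

    def forward(self, x_self, x_nbr):
        return self.out(F.relu(self.self_lin(x_self) + self.nbr_lin(x_nbr)))

teacher = OneHopClassifier(D, C)
student = nn.Linear(D, C)                # non-hop: needs no graph at inference
opt = torch.optim.Adam(
    list(teacher.parameters()) + list(student.parameters()), lr=1e-2)

for step in range(200):
    # Stochastic training on independent edge samples: sampling is O(batch),
    # with no neighbourhood expansion.
    idx = torch.randint(E, (64,))
    u, v = edges[0, idx], edges[1, idx]
    logits = teacher(x[u], x[v])
    p = F.softmax(logits, dim=1)

    # Supervised cross-entropy on whichever sampled endpoints are labelled.
    mask = labelled[u]
    sup = (F.cross_entropy(logits[mask], labels[u][mask])
           if mask.any() else logits.sum() * 0.0)

    # Prediction-entropy penalty on (mostly unlabelled) nodes; a stand-in
    # for the paper's graph-entropy objective.
    ent = -(p * (p + 1e-9).log()).sum(dim=1).mean()

    # Online knowledge distillation: the student tracks the teacher's
    # soft predictions while the teacher itself is still training.
    kd = F.kl_div(F.log_softmax(student(x[u]), dim=1), p.detach(),
                  reduction="batchmean")

    loss = sup + 0.1 * ent + kd
    opt.zero_grad()
    loss.backward()
    opt.step()

# At inference the distilled student classifies from node features alone,
# so latency does not depend on the graph at all.
pred = student(x).argmax(dim=1)
```

The edge-sampling design is what makes the mini-batches independent: no neighbourhood expansion is required, so sampling stays fast and memory use does not grow with graph size, matching the abstract's claim of space-saving stochastic training.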

Benchmark Results

Dataset                                           | Model     | Metric   | Claimed | Verified | Status
CiteSeer, Public Split (fixed 20 nodes per class) | GEM       | Accuracy | 74.2    | —        | Unverified
CiteSeer, Public Split (fixed 20 nodes per class) | OKDEEM    | Accuracy | 73.53   | —        | Unverified
CiteSeer, Public Split (fixed 20 nodes per class) | EEM       | Accuracy | 72.63   | —        | Unverified
Cora, Public Split (fixed 20 nodes per class)     | GEM       | Accuracy | 83.05   | —        | Unverified
PubMed, Public Split (fixed 20 nodes per class)   | Graph-MLP | Accuracy | 79.91   | —        | Unverified
PubMed, Public Split (fixed 20 nodes per class)   | GEM       | Accuracy | 78.48   | —        | Unverified
