SOTAVerified

Walking Out of the Weisfeiler Leman Hierarchy: Graph Learning Beyond Message Passing

2021-02-17 · Code Available

Jan Tönshoff, Martin Ritzert, Hinrikus Wolf, Martin Grohe


Abstract

We propose CRaWl, a novel neural network architecture for graph learning. Like graph neural networks, CRaWl layers update node features on a graph and can therefore be freely combined or interleaved with GNN layers. Yet CRaWl operates fundamentally differently from message-passing graph neural networks: its layers extract and aggregate information on subgraphs appearing along random walks through a graph using 1D convolutions. This enables it to detect long-range interactions and compute non-local features. As the theoretical basis for our approach, we prove a theorem stating that the expressiveness of CRaWl is incomparable with that of the Weisfeiler Leman algorithm, and hence with graph neural networks: there are functions expressible by CRaWl but not by GNNs, and vice versa. This result extends to higher levels of the Weisfeiler Leman hierarchy and thus to higher-order GNNs. Empirically, we show that CRaWl matches state-of-the-art GNN architectures across a multitude of benchmark datasets for classification and regression on graphs.
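The core idea in the abstract (sample random walks, run a 1D convolution over the feature sequence along each walk, then aggregate window features back onto nodes) can be illustrated with a toy numpy sketch. This is not the authors' implementation: the mean filter standing in for learned convolution weights, the aggregation scheme, and all parameter names (`walk_len`, `window`, `n_walks`) are illustrative assumptions.

```python
import numpy as np

def random_walk(adj, start, length, rng):
    """Sample a simple random walk of up to `length` nodes from `start`."""
    walk = [start]
    for _ in range(length - 1):
        neighbors = np.flatnonzero(adj[walk[-1]])
        if len(neighbors) == 0:  # dead end: stop early
            break
        walk.append(int(rng.choice(neighbors)))
    return walk

def crawl_layer(adj, node_feats, walk_len=8, window=3, n_walks=4, seed=0):
    """Toy CRaWl-style update (illustrative, not the paper's architecture):
    for each node, sample several random walks, slide a window over the
    walk's feature sequence (a stand-in for a learned 1D convolution),
    and average the window features back onto the nodes they cover."""
    rng = np.random.default_rng(seed)
    n, d = node_feats.shape
    out = np.zeros((n, d))
    counts = np.zeros(n)
    for start in range(n):
        for _ in range(n_walks):
            walk = random_walk(adj, start, walk_len, rng)
            seq = node_feats[walk]                         # (len(walk), d)
            for i in range(len(walk) - window + 1):
                win_feat = seq[i:i + window].mean(axis=0)  # "conv" window
                for v in walk[i:i + window]:
                    out[v] += win_feat
                    counts[v] += 1
    counts[counts == 0] = 1  # nodes never covered by a full window
    return out / counts[:, None]
```

Because a window mixes features of all nodes on a walk segment, a node's update can depend on nodes several hops away in a single layer, which is the intuition behind the non-local features claimed above.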

Tasks

Benchmark Results

Dataset     Model   Metric    Claimed   Verified   Status
REDDIT-B    CRaWl   Accuracy  93.15     —          Unverified

Reproductions