SOTAVerified

FDGATII : Fast Dynamic Graph Attention with Initial Residual and Identity Mapping

2021-10-21 · Code Available

Gayan K. Kulatilleke, Marius Portmann, Ryan Ko, Shekhar S. Chandra


Abstract

While Graph Neural Networks have gained popularity in multiple domains, graph-structured input remains a major challenge due to (a) over-smoothing, (b) noisy neighbours (heterophily), and (c) the suspended animation problem. To address all these problems simultaneously, we propose FDGATII, a novel graph neural network inspired by the attention mechanism's ability to focus on selective information, supplemented with two feature-preserving mechanisms. FDGATII combines initial residuals and identity mapping with the more expressive dynamic self-attention to handle the noise arising from neighbourhoods in heterophilic datasets. By using sparse dynamic attention, FDGATII is inherently parallelizable in design and efficient in operation, and is thus theoretically able to scale to arbitrary graphs with ease. Our approach has been extensively evaluated on 7 datasets. We show that FDGATII outperforms GAT- and GCN-based benchmarks in accuracy and performance on fully supervised tasks, obtaining state-of-the-art results on the Chameleon and Cornell datasets with zero domain-specific graph pre-processing, and demonstrate its versatility and fairness.
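The abstract's core recipe is a layer that aggregates neighbours with dynamic (GATv2-style) attention, then mixes in the initial features (initial residual) and shrinks the layer weight toward the identity (identity mapping, as in GCNII). A minimal NumPy sketch of one such layer is shown below; all function names and the hyperparameters `alpha`/`beta` are illustrative assumptions, not the authors' released implementation.

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    return np.where(x > 0, x, slope * x)

def dynamic_attention(H, adj, W_att, a):
    # Dynamic (GATv2-style) attention: the scoring vector `a` is applied
    # AFTER the nonlinearity, so neighbour rankings can vary per query node,
    # unlike the static attention of the original GAT.
    Z = H @ W_att                                            # (n, d)
    scores = leaky_relu(Z[:, None, :] + Z[None, :, :]) @ a   # (n, n)
    scores = np.where(adj > 0, scores, -np.inf)              # restrict to edges
    scores -= scores.max(axis=1, keepdims=True)              # numerical stability
    att = np.exp(scores)
    att /= att.sum(axis=1, keepdims=True)                    # row-normalize
    return att @ Z                                           # weighted aggregation

def fdgatii_layer(H, H0, adj, W_att, a, W, alpha=0.1, beta=0.5):
    # Initial residual: blend the propagated signal with the input features H0.
    # Identity mapping: interpolate the layer weight toward the identity matrix,
    # which counteracts over-smoothing in deep stacks (assumed GCNII-style form).
    P = dynamic_attention(H, adj, W_att, a)
    S = (1 - alpha) * P + alpha * H0
    out = S @ ((1 - beta) * np.eye(W.shape[0]) + beta * W)
    return np.maximum(out, 0.0)                              # ReLU

# Toy usage: a 6-node ring graph with self-loops (self-loops keep every
# attention row non-empty).
rng = np.random.default_rng(0)
n, d = 6, 4
adj = np.eye(n)
for i in range(n):
    adj[i, (i + 1) % n] = 1
H0 = rng.normal(size=(n, d))
out = fdgatii_layer(H0, H0, adj,
                    rng.normal(size=(d, d)) * 0.1, rng.normal(size=d),
                    rng.normal(size=(d, d)) * 0.1)
print(out.shape)  # (6, 4)
```

A dense adjacency matrix is used here only for brevity; the paper's claimed scalability rests on a sparse edge-list formulation of the same attention.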

Benchmark Results

| Dataset | Model | Metric | Claimed | Verified | Status |
| --- | --- | --- | --- | --- | --- |
| Chameleon | FDGATII | Accuracy | 65.18 | – | Unverified |
| Citeseer (full-supervised) | FDGATII | Accuracy | 75.64 | – | Unverified |
| Cora (full-supervised) | FDGATII | Accuracy | 87.79 | – | Unverified |
| Cornell | FDGATII | Accuracy | 82.43 | – | Unverified |
| Pubmed (full-supervised) | FDGATII | Accuracy | 90.35 | – | Unverified |
| Texas | FDGATII | Accuracy | 80.54 | – | Unverified |
| Wisconsin | FDGATII | Accuracy | 86.27 | – | Unverified |

Reproductions